
Binocular camera occlusion detection method and apparatus

A binocular camera occlusion detection technology, applied to image enhancement, image analysis, image communication and related fields, which solves the problems of missed detection, false detection, and low detection accuracy, and achieves the effects of real-time occlusion detection, effective interference resistance, and improved detection accuracy.

Active Publication Date: 2017-11-14
BLACKSHARK TECH NANCHANG CO LTD

AI Technical Summary

Problems solved by technology

However, mainstream depth cameras generally acquire depth information only at distances above 0.5 m; when the target object is closer than 0.5 m to the camera but does not touch it, false detections occur.
[0005] In summary, both background-modeling methods and feature-analysis methods currently suffer from low detection accuracy or missed detections. How to provide a camera occlusion detection method and device with high detection accuracy is therefore an outstanding problem.


Examples


Embodiment 1

[0042] In step S102, the dual-view images and the disparity map are simultaneously scaled to a target size, preferably 320*240 in this embodiment. More generally, for convenience of computation, any scaled image with fewer than 300,000 pixels may be used. Those skilled in the art will appreciate that other variations obtainable without inventive effort fall within the scope of this embodiment.
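As an illustration of this scaling step, the following sketch (Python with OpenCV, which the embodiment does not mandate) resizes both views and the disparity map to the 320*240 target and checks the 300,000-pixel constraint; the function and constant names are assumptions made for the example.

```python
# Minimal sketch of step S102 under assumed names: scale the left/right views
# and the disparity map to a common target size. 320x240 follows the
# embodiment; any size under 300,000 pixels would also satisfy the constraint.
import cv2

TARGET_SIZE = (320, 240)  # (width, height) -> 76,800 pixels < 300,000

def scale_to_target(left, right, disparity, target_size=TARGET_SIZE):
    """Resize both views and the disparity map to the same target size."""
    w, h = target_size
    assert w * h < 300_000, "embodiment requires fewer than 300,000 pixels"
    left_s = cv2.resize(left, target_size, interpolation=cv2.INTER_AREA)
    right_s = cv2.resize(right, target_size, interpolation=cv2.INTER_AREA)
    # Nearest-neighbour keeps disparity values from being blended across edges.
    disp_s = cv2.resize(disparity, target_size, interpolation=cv2.INTER_NEAREST)
    return left_s, right_s, disp_s
```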

[0043] Further, in this embodiment, the rectangular grid pixel block size is preferably defined as 40*40, a total of 24 rectangular grids are marked along the image edge, and the average depth of the edge pixel blocks of the two images is calculated. Specifically, the depth mean is calculated as follows: starting from a point pl in the left view, scan the right view to find the corresponding matching point pr; map pl to pl' using the H matrix; the disparity is then pr - pl'. A depth value can then be obtained from the disparity lookup table, and the depth value is filled i...
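The per-block depth mean can be sketched as follows, assuming a dense disparity map is already available from the matching step and using the simple relation depth = focal_length * baseline / disparity as a stand-in for the parallax lookup table; apart from the 40*40 block size, the focal length and baseline values are illustrative assumptions.

```python
# Hedged sketch of the edge-block depth mean. A dense disparity map is
# assumed to exist already; f*B/disparity stands in for the lookup table.
import numpy as np

BLOCK = 40            # rectangular grid pixel block size from the embodiment
FOCAL_PX = 500.0      # assumed focal length in pixels (illustrative only)
BASELINE_M = 0.06     # assumed camera baseline in metres (illustrative only)

def disparity_to_depth(disp):
    """Stand-in for the parallax lookup table: depth = f * B / disparity."""
    disp = np.where(disp > 0, disp, np.nan)   # ignore invalid disparities
    return FOCAL_PX * BASELINE_M / disp

def edge_block_depth_means(disparity):
    """Mean depth of each 40x40 block along the image border.

    For a 320x240 image this yields the 24 edge blocks named in the text
    (8 columns x 6 rows, border cells only: 2*8 + 2*6 - 4 = 24).
    """
    h, w = disparity.shape
    rows, cols = h // BLOCK, w // BLOCK
    depth = disparity_to_depth(disparity.astype(np.float64))
    means = []
    for r in range(rows):
        for c in range(cols):
            if r in (0, rows - 1) or c in (0, cols - 1):   # border cells only
                block = depth[r*BLOCK:(r+1)*BLOCK, c*BLOCK:(c+1)*BLOCK]
                means.append(float(np.nanmean(block)))
    return means
```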

Embodiment 2

[0060] In another preferred embodiment of the present invention, in steps S103 to S105, the same kind of feature information is calculated for every rectangular grid pixel block before moving on to the next kind; that is, one feature (grayscale width, sharpness, color change rate, or depth mean) is first evaluated uniformly across all rectangular grid pixel blocks. Any rectangular grid pixel block whose value does not fall within the specified threshold range is marked as an invalid rectangular grid pixel block, and no further feature information is calculated for it.

[0061] The calculation sequence in this embodiment is depth mean, grayscale width, sharpness, and color change rate. However, those skilled in the art will understand that this order may be changed and is not limited to the order claimed, as long as the same kind of feature information is calculated first and the rectangular grid pixel blocks that fall within the specified threshold range are then sort...
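A minimal sketch of this screening order, with assumed helper names, is:

```python
# Sketch of Embodiment 2's screening: one kind of feature is evaluated for
# every still-valid block before the next feature is computed; blocks outside
# the threshold range are marked invalid and skipped thereafter.
def screen_blocks(blocks, feature_checks):
    """blocks: list of pixel-block arrays.
    feature_checks: ordered (feature_fn, in_range_fn) pairs, e.g. depth mean,
    grayscale width, sharpness, color change rate (names are assumptions)."""
    valid = [True] * len(blocks)
    for feature_fn, in_range in feature_checks:
        for i, block in enumerate(blocks):
            if not valid[i]:
                continue                  # invalid blocks get no further features
            if not in_range(feature_fn(block)):
                valid[i] = False          # mark as invalid rectangular grid block
    return valid
```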

Embodiment 3

[0063] This embodiment differs from Embodiment 1 in that the object of the invention is achieved by calculating different feature information. Algorithms for feature information shared with Embodiment 1 are therefore not repeated here.

[0064] In step S103 of this embodiment, at least four kinds of feature information need to be calculated, namely grayscale width, sharpness, skin color, and edges. In a preferred embodiment of the present invention, an average brightness value may also be included.

[0065] In step S104, specifically, it is judged whether the grayscale width value corresponding to the gray histogram of each grid is smaller than the first threshold, and the result is recorded in state array A[N]; whether the sharpness coefficient of each grid is smaller than the second threshold, recorded in state array B[N]; and whether the skin color area of each rectangular grid is greater than the fifth thres...
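A hedged sketch of these per-grid checks, using assumed grayscale-width and sharpness measures and illustrative threshold values (the skin-color and edge checks follow the same pattern), might look like this:

```python
# Sketch of the step S104 threshold checks recorded in the state arrays
# A[N] and B[N] named in the text. The feature measures and threshold
# values below are assumptions chosen only to make the example runnable.
import numpy as np

def gray_width(block):
    """Assumed grayscale-width measure: spread of the gray histogram."""
    return int(block.max()) - int(block.min())

def sharpness(block):
    """Assumed sharpness measure: mean squared gradient magnitude."""
    gy, gx = np.gradient(block.astype(np.float64))
    return float((gx**2 + gy**2).mean())

def threshold_states(blocks, t1=30, t2=5.0):
    A = [gray_width(b) < t1 for b in blocks]   # A[N]: gray width below 1st threshold
    B = [sharpness(b) < t2 for b in blocks]    # B[N]: sharpness below 2nd threshold
    return A, B
```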



Abstract

The present invention provides a binocular camera occlusion detection method and apparatus. The method comprises the following steps: two frames of images are read simultaneously from a binocular camera, template matching is used to find corresponding points between the dual-view images, and a dense depth image between the two images is calculated according to a parallax-depth curve table; the two images and the depth image are scaled to a target size; the size of the rectangular grid pixel blocks is defined, their number is determined, and the rectangular grid pixel blocks are sequentially numbered along the edge of the image; the feature information of the rectangular grid pixel blocks is calculated in real time from the image; whether the feature information of each rectangular grid pixel block is within a specified threshold range is judged; the number of pixel blocks satisfying all specified threshold conditions is counted, and if this number is greater than a set occlusion pixel-block threshold, it is judged that occlusion occurs; if it is smaller than or equal to that threshold, it is judged that occlusion does not occur. Since the depth information effectively resists interference, detection accuracy is improved.
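As a minimal sketch of the final decision step described above (all names are assumptions for illustration), the block-counting logic can be expressed as:

```python
# Count the edge blocks that satisfy every threshold condition and compare
# the count with the occlusion block-count threshold.
def is_occluded(per_block_states, block_count_threshold):
    """per_block_states: list of boolean lists, one per feature
    (e.g. [A, B, C, D] from the embodiments), each of length N."""
    satisfy_all = [all(states) for states in zip(*per_block_states)]
    return sum(satisfy_all) > block_count_threshold
```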

Description

Technical Field

[0001] Embodiments of the present invention relate to a method and device for detecting occlusion during shooting, and in particular to an occlusion detection method and device for a binocular camera.

Background Art

[0002] Portable mobile devices have become an indispensable part of daily life, and the shooting functions of such devices continue to improve. When travelling, people now rarely carry a dedicated digital camera or simple camera; most choose a mobile device with a shooting function instead. However, when using such devices, a finger often blocks the lens, resulting in poor photos.

[0003] In recent years, many technical solutions in the industry have attempted to solve this problem. For example, some lens occlusion detection methods use the camera to obtain an RGB background model of the scene and use the difference between foreground and background to determi...

Claims


Application Information

IPC(8): H04N17/00; H04N17/02; G06K9/46
CPC: H04N17/002; H04N17/02; G06T2207/10141; G06V10/44
Inventor: 邹超洋, 贺永刚, 万美君, 朱豪
Owner: BLACKSHARK TECH NANCHANG CO LTD