
Sparse disparity acquisition method based on target edge features and gray similarity

An edge-feature-based sparse disparity acquisition technology, applied in the field of stereo vision, which addresses the problems of information loss during projection, ambiguity in recovering 3D scene information, and the low correct matching rate of existing sparse matching algorithms.

Active Publication Date: 2016-04-06
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

A three-dimensional scene is projected by the camera onto a two-dimensional image plane, and a large amount of information is lost in this projection from a high-dimensional space to a low-dimensional one, which makes the recovery of three-dimensional scene information from plane images ambiguous.

[0005] Binocular stereo matching methods aimed at recovering the attitude information of a target object in the scene all use sparse matching. The common problem of existing sparse matching algorithms is that the correct matching rate is not high.

Method used




Embodiment Construction

[0054] In order to better illustrate the purpose and advantages of the present invention, the method is described below in conjunction with specific embodiments.

[0055] The method of the present invention first takes the left image as the reference and the right image as the matching search image, and uses edge-feature similarity constraints to eliminate non-corresponding points as early as possible; it then applies a similarity detection method that fuses grayscale and grayscale Census transform information to find the best matching point. Finally, the roles are swapped: the right image is taken as the reference and the left image as the matching search image, and the matching results are cross-checked to eliminate false matching points and further improve matching accuracy. The specific process is shown in Figure 2.
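To make the flow in [0055] easier to follow, the sketch below illustrates a fused gray/Census similarity search with left-right cross-checking on a rectified image pair. It is a minimal illustration only: the window radius r, the fusion weight alpha, the SAD form of the gray term, the disparity range d_max and the tolerance tol are assumptions chosen for this example and are not values taken from the patent, and the edge-feature similarity constraint is only indicated in a comment.

```python
import numpy as np

def census_bits(img, y, x, r=3):
    """Census transform of the (2r+1)x(2r+1) window centred at (y, x):
    each neighbour is encoded as 1 if it is darker than the centre pixel."""
    patch = img[y - r:y + r + 1, x - r:x + r + 1]
    return (patch < img[y, x]).ravel()

def fused_cost(ref, other, y, x_ref, x_other, r=3, alpha=0.5):
    """Fused dissimilarity: alpha * normalised gray-level SAD plus
    (1 - alpha) * normalised Hamming distance between Census bit strings.
    (alpha and the window radius are illustrative assumptions.)"""
    p_ref = ref[y - r:y + r + 1, x_ref - r:x_ref + r + 1].astype(np.float32)
    p_oth = other[y - r:y + r + 1, x_other - r:x_other + r + 1].astype(np.float32)
    sad = np.abs(p_ref - p_oth).mean() / 255.0
    ham = np.mean(census_bits(ref, y, x_ref, r) != census_bits(other, y, x_other, r))
    return alpha * sad + (1.0 - alpha) * ham

def best_match(ref, other, y, x, offsets, r=3):
    """Scan candidate columns x + o along the same row of `other` (a rectified
    epipolar line) and return the column with the lowest fused cost. In the
    full method the candidates would additionally be restricted to edge points
    satisfying the edge-feature similarity constraint."""
    costs = [(fused_cost(ref, other, y, x, x + o, r), x + o)
             for o in offsets if r <= x + o < other.shape[1] - r]
    return min(costs)[1] if costs else None

def cross_checked_disparity(left, right, y, x, d_max=64, r=3, tol=1):
    """Left-to-right match followed by a right-to-left back-match; the
    disparity is kept only if the back-match lands within `tol` pixels of
    the original point, otherwise the candidate is rejected."""
    x_r = best_match(left, right, y, x, range(-d_max, 1), r)
    if x_r is None:
        return None
    x_l = best_match(right, left, y, x_r, range(0, d_max + 1), r)
    if x_l is None or abs(x_l - x) > tol:
        return None              # failed cross-check: discard false match
    return x - x_r               # sparse disparity at edge point (y, x)
```

In use, (y, x) would be a strong edge point selected from the left image (see the edge-extraction sketch after the Abstract) lying at least r pixels from the image border; bounds checking of the reference point is left to the caller in this sketch.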

[0056] The structure of the sparse stereo matching device designed according to the method of the present invention is shown in Figure 1 ...



Abstract

The invention relates to a sparse disparity acquisition method based on target edge features and gray-scale similarity, and belongs to the technical field of stereo vision. In the method, image edge features are extracted with the Canny edge detector and strong edge feature points of the target in the image are chosen as points to be matched; the feature similarity and the gray-scale similarity of the image are combined through a similarity detection method that fuses gray scale and the gray-scale Census transform; and mismatched points are eliminated by cross-checking, which further improves matching accuracy. The method provides the sparse disparity information needed to recover the three-dimensional position and attitude of an object, has the advantages of high accuracy, high matching speed and stable matching, and can effectively provide the necessary sparse disparity information for binocular vision measurement, especially active measurement of object attitude information.
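As a companion to the abstract, the following is a minimal sketch of the first stage it describes: extracting edges with the Canny detector and keeping the strongest edge pixels of the target as candidate points to be matched. The Canny thresholds, the use of Sobel gradient magnitude to rank edge strength, the keep_ratio parameter and the file name in the usage note are illustrative assumptions, not values disclosed by the patent.

```python
import cv2
import numpy as np

def strong_edge_points(gray, canny_lo=80, canny_hi=160, keep_ratio=0.3):
    """Return (y, x) coordinates of the strongest Canny edge pixels in a
    single-channel uint8 image (thresholds and keep_ratio are assumptions)."""
    edges = cv2.Canny(gray, canny_lo, canny_hi)          # binary edge map
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    magnitude = cv2.magnitude(gx, gy)                    # edge strength proxy
    ys, xs = np.nonzero(edges)
    strengths = magnitude[ys, xs]
    order = np.argsort(strengths)[::-1]                  # strongest first
    n_keep = max(1, int(keep_ratio * len(order)))
    return list(zip(ys[order[:n_keep]], xs[order[:n_keep]]))

# Usage (hypothetical file name):
# left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
# candidates = strong_edge_points(left)
```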

Description

Technical field

[0001] The invention relates to a sparse disparity acquisition method based on target edge features and gray-scale similarity, and belongs to the technical field of stereo vision.

Background technique

[0002] Stereo matching is a key step in recovering three-dimensional scene information in binocular stereo vision. A three-dimensional scene is projected by the camera onto a two-dimensional image plane, and a large amount of information is lost in this projection from a high-dimensional space to a low-dimensional one, which makes the recovery of three-dimensional scene information from plane images ambiguous. Affected by factors such as camera parameters, uneven scene illumination, and the differing reflectivity of object surfaces in different directions, corresponding points in the two images cannot have exactly the same gray-level and colour characteristics, which adds further difficulty to recovering 3D scene information ...
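For background only, the relation below is the standard rectified binocular geometry that makes sparse disparities useful for recovering 3D position; it is textbook material rather than something specific to this patent, and the symbol names are chosen for the example.

```python
def depth_from_disparity(d_pixels: float, f_pixels: float, baseline_m: float) -> float:
    """Depth Z (metres) of a matched point in a rectified stereo pair:
    Z = f * B / d, with focal length f in pixels, baseline B in metres
    and disparity d in pixels."""
    if d_pixels <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_pixels * baseline_m / d_pixels
```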

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/46
Inventor: 刘向东, 余银, 刘冰, 陈振
Owner: BEIJING INSTITUTE OF TECHNOLOGY