
Stereoscopic vision matching method based on weak edge and texture classification

A stereo vision matching and texture classification technology, applied in image analysis, image data processing, instruments, etc. It addresses the problems of existing methods — difficulty in meeting real-time requirements, a large amount of computation, and low matching accuracy — with the effect of reducing the algorithm's computational load and improving both real-time performance and matching accuracy.

Inactive Publication Date: 2016-05-04
深圳市华和瑞智科技有限公司

AI Technical Summary

Problems solved by technology

[0013] Global algorithms with high matching accuracy often involve a very large amount of computation, making it difficult to satisfy real-time requirements, while local algorithms with good real-time performance lack effective optimization for disparity continuity, so their matching accuracy is often low. Improving computational efficiency while maintaining accuracy is therefore an important issue faced by stereo matching algorithms.




Embodiment Construction

[0113] Referring to the accompanying drawings, the flow of the CETR method proposed by the present invention is shown in Figure 1.

[0114] The basic idea is to divide the image into an edge-expansion area and a weak-texture area according to the image's edge distribution, estimate disparity with different methods suited to the characteristics of each type of area, perform disparity correction or filling on each area based on the other's disparity estimates, and finally, after overall disparity refinement, obtain a complete dense disparity map.
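The region division described above can be sketched as follows. This is a minimal illustration only: a plain gradient threshold stands in for a full edge detector, and the `grad_thresh` and `band` parameters are illustrative choices, not values fixed by the invention.

```python
import numpy as np

def classify_regions(gray, grad_thresh=30, band=3):
    """Divide an image into an edge-expansion mask and a weak-texture mask.

    A simple gradient threshold stands in for a full edge detector;
    grad_thresh and band are illustrative, not parameters of the patent.
    """
    g = gray.astype(np.float32)
    gx = np.abs(np.diff(g, axis=1, prepend=g[:, :1]))   # horizontal gradient
    gy = np.abs(np.diff(g, axis=0, prepend=g[:1, :]))   # vertical gradient
    edges = (gx + gy) > grad_thresh                     # crude binary edge map
    ee = edges.copy()
    for _ in range(band):                               # grow edges into a band
        grown = ee.copy()
        grown[1:, :] |= ee[:-1, :]
        grown[:-1, :] |= ee[1:, :]
        grown[:, 1:] |= ee[:, :-1]
        grown[:, :-1] |= ee[:, 1:]
        ee = grown
    return ee, ~ee    # edge-expansion area, weak-texture area
```

The two masks partition the image, so every pixel is handled by exactly one of the two disparity-estimation branches.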

[0115] Algorithm flow: (1) perform edge detection on the input rectified color image, and partition the weak-texture area and the edge-expansion area according to the detected edges; (2) perform local-matching-based edge disparity estimation for the edges in the reference image; after matching, eliminate unstable edge disparity values by a repeatability test and a stability t...
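Step (2) can be sketched as winner-take-all local matching restricted to edge pixels, followed by a repeatability (left-right consistency) check. Note this is a hedged stand-in: SAD is used in place of the MADC matching cost named in the abstract, and the stability test truncated above is not reproduced.

```python
import numpy as np

def edge_disparity(left, right, edge_mask, max_disp=8, win=2):
    """Local matching for edge pixels with a repeatability check.

    SAD stands in for the patent's MADC cost; -1 marks pixels with
    no stable disparity.
    """
    h, w = left.shape
    L, R = left.astype(np.float32), right.astype(np.float32)
    disp = np.full((h, w), -1, np.int32)
    for y in range(win, h - win):
        for x in range(win + max_disp, w - win - max_disp):
            if not edge_mask[y, x]:
                continue
            lp = L[y - win:y + win + 1, x - win:x + win + 1]
            costs = [np.abs(lp - R[y - win:y + win + 1,
                                   x - d - win:x - d + win + 1]).sum()
                     for d in range(max_disp + 1)]
            d = int(np.argmin(costs))
            # repeatability test: match the chosen right patch back to the left
            rp = R[y - win:y + win + 1, x - d - win:x - d + win + 1]
            back = [np.abs(rp - L[y - win:y + win + 1,
                                  x - d + dd - win:x - d + dd + win + 1]).sum()
                    for dd in range(max_disp + 1)]
            if abs(int(np.argmin(back)) - d) <= 1:  # consistent both ways
                disp[y, x] = d
    return disp
```

Disparities that fail the back-match are left at -1 and would be filled later from the weak-texture estimates, mirroring the mutual correction/filling step of [0114].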



Abstract

The invention discloses a stereoscopic vision matching method based on weak edge and texture classification, solving the technical problem of achieving fast and precise dense matching. The method comprises the steps of dividing and matching edge and weak-texture regions, performing mutual disparity correction and filling, refining depth-edge disparities, and densifying the disparity map. Using the MADC matching cost, the method matches according to the different properties of the regions, performs mutual disparity correction or filling according to the matching results, refines the disparity in occluded regions and disparity-discontinuous regions, and obtains a complete dense disparity map. The beneficial effects of the invention are that the method requires no global optimization, which reduces the computational burden of the algorithm, and that it matches the disparity continuity constraint to the region type, avoiding the conflict between common local optimization and the continuity constraint in the prior art. The method has high universality and can be widely used in various binocular vision and multi-view 3D imaging algorithms.

Description

technical field [0001] The invention relates to the fields of image processing and machine vision, and in particular to a stereo vision matching method based on weak edge and texture classification. Background technique [0002] In binocular stereo vision measurement, after the correspondence between image points is obtained through stereo matching, the corresponding disparity map can be computed; combined with the projection imaging model determined during calibration, the depth of each corresponding point can then be calculated. Due to the sparsity and discontinuity of features themselves, feature-based stereo matching algorithms cannot directly obtain correspondences for non-feature pixels and can only produce a sparse disparity map, which requires a complex subsequent interpolation process to generate a dense disparity map. To directly obtain a dense disparity map, one needs to select a matching...
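The disparity-to-depth step described in [0002] follows the standard rectified-stereo triangulation relation Z = f·B / d. A minimal sketch, with illustrative parameter names not taken from the patent:

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth of a point from its disparity in a rectified stereo pair:
    Z = f * B / d, with f in pixels, baseline B in metres, disparity d
    in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# e.g. a 50 px disparity with f = 1000 px and B = 0.1 m gives Z = 2 m
```

Because depth is inversely proportional to disparity, small disparity errors on distant points translate into large depth errors, which is why the dense, accurate disparity maps targeted by the invention matter for measurement.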


Application Information

IPC(8): G06T7/00
Inventor 郑可尧
Owner 深圳市华和瑞智科技有限公司