Binocular stereo matching method

A binocular stereo matching and matching-cost technology, applied in the field of computer vision, which addresses the problems of an overly complex feature extraction network with too many parameters, and reduces running time.

Pending Publication Date: 2022-03-11
北京师范大学珠海校区 (Beijing Normal University, Zhuhai Campus)


Problems solved by technology

[0008] To overcome the shortcomings of existing feature extraction networks, which are overly complex and have too many parameters, the purpose of the present invention is to provide a binocular stereo matching method that reduces the computational load on the device and obtains a good-quality initial disparity map more quickly. The innovation of the method is an end-to-end deep network suited to stereo matching, whose structure is built with an end-to-end framework design. The left and right input images refer mainly to images captured by a parallel binocular vision system, such as a binocular camera or two parallel cameras. After the processed left and right images are input, the disparity map is obtained directly from the network, which reduces the time spent on manual preprocessing and subsequent processing and limits error accumulation in the stereo matching process. At the same time, on the basis of AANet, a simpler feature extraction network with fewer parameters is constructed to obtain better local features; an intra-scale aggregation module then applies deformable convolution, and an inter-scale aggregation module applies the traditional cross-scale aggregation method, enabling the network to obtain high-quality disparity maps efficiently.
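The correlation step that turns the extracted left and right feature maps into a matching cost volume can be sketched as follows. This is an illustrative NumPy sketch, not the patent's implementation; the function and parameter names (`build_correlation_cost_volume`, `max_disp`) are my own.

```python
import numpy as np

def build_correlation_cost_volume(feat_l, feat_r, max_disp):
    """Correlate left features with horizontally shifted right features.

    feat_l, feat_r: (C, H, W) feature maps at one resolution.
    Returns a cost volume of shape (max_disp, H, W), where entry
    [d, y, x] is the mean channel-wise product of the left pixel
    (y, x) and the right pixel (y, x - d).
    """
    C, H, W = feat_l.shape
    cost = np.zeros((max_disp, H, W), dtype=feat_l.dtype)
    for d in range(max_disp):
        # Only columns x >= d have a valid correspondence at disparity d;
        # the invalid left border stays zero.
        cost[d, :, d:] = np.mean(feat_l[:, :, d:] * feat_r[:, :, : W - d],
                                 axis=0)
    return cost
```

In the described method this correlation is computed once per resolution, giving N such volumes that are then aggregated within and across scales.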


Image

Smart Image Click on the blue labels to locate them in the text.
Viewing Examples
Smart Image
  • Binocular stereo matching method
  • Binocular stereo matching method
  • Binocular stereo matching method


Embodiment Construction

[0065] The present invention proposes an end-to-end network suitable for stereo matching. This section first introduces the dataset and training process used to train the network. Each model is then evaluated under different settings, and the advantages of the proposed network in running time and memory usage are confirmed on the Sceneflow test set. Finally, the network is fine-tuned on the KITTI2012 and KITTI2015 datasets, and its performance is verified there.

[0066] 1. Dataset and training process

[0067] The present invention first trains the network on the Sceneflow dataset. The datasets commonly used by previous stereo matching algorithms, such as KITTI and Middlebury, contain relatively few training images, while improving the performance of a stereo matching network, especially an end-to-end one, requires a large amount of training data. In 2016, CVPR (Computer Vision and Pattern Recognit...


Abstract

The invention discloses a binocular stereo matching method comprising the following steps: 1) a feature extraction network extracts features from the left and right images to be matched, yielding feature maps at N resolutions for the left image and corresponding feature maps at N resolutions for the right image; 2) a correlation operation on the left and right feature maps of the same resolution forms a 4D matching cost volume, and an intra-scale aggregation module performs local cost aggregation on each 4D matching cost volume to obtain a new matching cost volume at the same resolution as the original; 3) an inter-scale aggregation module fuses the N new 4D matching cost volumes obtained in step 2) to obtain a final matching cost volume; 4) disparity maps at the N resolutions are obtained from the final matching cost volume, then upsampled and input to StereoDRNet to obtain the final predicted disparity map.
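Step 4's conversion of a cost volume into a disparity map is commonly done with a differentiable soft argmin, so the whole pipeline stays trainable end to end. A minimal NumPy sketch, assuming lower cost means a better match; the abstract does not specify the exact regression, so this is an illustration of the standard technique, not the patent's code:

```python
import numpy as np

def soft_argmin_disparity(cost_volume):
    """Differentiable disparity regression (soft argmin).

    cost_volume: (D, H, W) matching costs; lower cost = better match.
    Returns an (H, W) disparity map: the expected disparity under a
    softmax over the negated costs.
    """
    D = cost_volume.shape[0]
    logits = -cost_volume
    logits = logits - logits.max(axis=0, keepdims=True)  # numerical stability
    prob = np.exp(logits)
    prob /= prob.sum(axis=0, keepdims=True)
    disparities = np.arange(D, dtype=float).reshape(D, 1, 1)
    return (prob * disparities).sum(axis=0)
```

Because the output is an expectation rather than a hard argmin, it yields sub-pixel disparities and has well-defined gradients for training.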

Description

technical field

[0001] The invention relates to the field of computer vision, and in particular to a binocular stereo matching method.

Background technique

[0002] Binocular stereo vision is inspired by human vision. Without touching the target, it imitates the human eyes by using two cameras to capture images from different angles, and obtains the three-dimensional shape and position information of the object from the principle of parallax in order to reconstruct its three-dimensional outline. It offers high efficiency, high precision, simple operation, low cost, and a high degree of automation, and is widely used in precision measurement, target positioning, robot motion and environmental survey, mechanical grasping, intelligent driving, 3D reconstruction, medical imaging, human-computer interaction, and other fields. [0003] The key technologies for realizing binocular vision include camera calibration, stereo rectification, stereo matchin...
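For a rectified parallel stereo rig, the parallax principle mentioned above reduces to the standard triangulation formula Z = f·B/d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity in pixels. A small illustrative sketch with hypothetical numbers:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulation for a rectified parallel stereo rig:
    Z = f * B / d, with f and d in pixels and B in metres.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# e.g. a 1024 px focal length, 0.5 m baseline, 32 px disparity:
# depth_from_disparity(32, 1024, 0.5) -> 16.0 metres
```

This inverse relationship is why disparity errors matter most for distant objects: at small d, a one-pixel error changes Z far more than at large d.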

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/73; G06T7/80; G06N3/04; G06N3/08
CPC: G06T7/73; G06T7/85; G06N3/08; G06T2207/10012; G06T2207/20081; G06T2207/20084; G06T2207/30244; G06N3/045
Inventor: 杨戈, 廖雨婷
Owner: 北京师范大学珠海校区 (Beijing Normal University, Zhuhai Campus)