
Binocular stereo matching method based on convolutional neural network

A binocular stereo matching technology based on a convolutional neural network, applied in the field of binocular stereo matching. It addresses the problems that existing methods consume large amounts of memory and computing power and cannot accurately find pixel matching points.

Active Publication Date: 2019-12-03
BEIJING UNIV OF TECH
AI Technical Summary

Problems solved by technology

However, these methods still have some limitations.
First, the network model often cannot accurately find the matching points corresponding to pixels in ill-posed regions such as occluded areas, repeated textures, and reflective surfaces.
Second, existing network operations consume a large amount of memory and require powerful computing capability.




Embodiment Construction

[0048] The purpose of the present invention is to provide a binocular stereo matching method based on a convolutional neural network that can be trained end to end without any post-processing, so as to solve the problem that existing convolutional-neural-network-based stereo matching methods cannot accurately find the matching points corresponding to pixels in ill-posed regions, while significantly reducing memory usage and running time during training and inference.

[0049] The present invention will be described in detail below in conjunction with the accompanying drawings. It should be noted that the described embodiments are only intended to facilitate the understanding of the present invention, rather than limiting it in any way.

[0050] Figure 1 is the network flowchart of the binocular stereo matching method based on the convolutional neural network provided by the present invention.

[0051] Figure 2 is the network structure diagram of ...


Abstract

The invention discloses a binocular stereo matching method based on a convolutional neural network. For matching cost computation, context information is integrated by applying dense blocks on top of the initial features. For matching cost aggregation, a small encoder-decoder structure is provided to regularize the cost volume. For disparity computation, a differentiable soft argmin operation is applied along the disparity dimension of the cost volume to obtain an initial disparity. For disparity refinement, the initial disparity is refined under the guidance of residual blocks as the main component and a similarity measure as an auxiliary component. The method strictly follows the four stages of a stereo matching algorithm and integrates the four steps into one network, so that the network can be trained end to end. By integrating context information during feature extraction, the method effectively alleviates the mismatching of pixels in ill-posed regions; the small encoder-decoder structure used in the regularization stage markedly reduces memory consumption and running time during network training and inference; and disparity prediction accuracy is improved.
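The differentiable soft argmin step described above can be illustrated with a minimal sketch. This is not the patent's implementation: the patent applies the operation along the disparity dimension of a 3-D cost volume inside the network, whereas the function below operates on a single cost vector for one pixel, with an assumed temperature parameter.

```python
import math

def soft_argmin(costs, temperature=1.0):
    """Differentiable disparity estimate from a per-pixel cost vector.

    costs[d] is the matching cost at candidate disparity d; a lower cost
    means a better match. We softmax the negated costs to get a
    probability over disparities, then return the expected disparity.
    Unlike a hard argmin, every cost contributes to the output, so the
    operation is differentiable and trainable end to end.
    """
    weights = [math.exp(-c / temperature) for c in costs]
    total = sum(weights)
    probs = [w / total for w in weights]
    return sum(d * p for d, p in enumerate(probs))

# A cost vector with a sharp minimum at disparity 2 yields an estimate
# close to 2; neighboring candidates pull it only slightly.
estimate = soft_argmin([10.0, 10.0, 0.0, 10.0, 10.0])
```

Because the result is a weighted average, it can take sub-pixel values, which is one reason soft argmin is preferred over a hard argmin for disparity regression.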

Description

technical field

[0001] The invention relates to the fields of robot navigation and three-dimensional reconstruction in computer vision, and in particular to a binocular stereo matching method based on a convolutional neural network.

Background technique

[0002] Depth estimation from stereo image pairs is the core problem of many stereo vision tasks and has applications in many fields, such as 3D reconstruction, autonomous driving, object detection, robot navigation, virtual reality, and augmented reality. The purpose of stereo matching is to estimate the correspondence of all pixels between two rectified images. Given a pair of rectified stereo images, the goal of disparity estimation is to compute the disparity d for each pixel in the reference image. Disparity refers to the horizontal displacement between a pair of corresponding points in the reference image and the target image. For a certain pixel (x, y) of the reference image, if the corresponding pixel is found at th...
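The disparity definition in the background section can be sketched as follows. The depth conversion uses the standard pinhole-stereo relation Z = f · B / d, which is textbook material rather than part of the patent text; the focal length and baseline values below are illustrative assumptions.

```python
def disparity(x_ref, x_tgt):
    """Disparity of a corresponding point pair on the same scanline:
    the horizontal displacement between the reference-image pixel and
    the target-image pixel (rectified images, so rows align)."""
    return x_ref - x_tgt

def depth_from_disparity(d, focal_px, baseline_m):
    """Standard pinhole-stereo relation Z = f * B / d, shown here only
    to illustrate why estimating disparity amounts to estimating depth.
    focal_px: focal length in pixels; baseline_m: camera baseline in
    meters; returns depth in meters."""
    if d <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / d

# A point at column 100 in the reference image matching column 90 in the
# target image has disparity 10; with an assumed 700 px focal length and
# 0.1 m baseline, its depth is 700 * 0.1 / 10 = 7 meters.
z = depth_from_disparity(disparity(100, 90), focal_px=700, baseline_m=0.1)
```

Note the inverse relation: large disparities correspond to near points and small disparities to far points, which is why errors in small disparities cause large depth errors.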

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/593
CPC: G06T7/593; G06T2207/10012; G06T2207/20081; G06T2207/20084; G06T2207/20132; G06T2207/20021; G06T2207/20228
Inventors: 王亮, 赵长双
Owner: BEIJING UNIV OF TECH