Binocular stereo matching method based on joint up-sampling convolutional neural network

A binocular stereo matching method based on convolutional neural network technology, applied in the field of computer vision. It addresses problems such as the loss of fine image structure information and inaccurate disparity prediction at object boundaries and for small objects, with the effect of improving computational efficiency.

Active Publication Date: 2020-07-10
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

Although the encoder can obtain rich semantic information by successively downsampling the spatial resolution, the deconvolution-based upsampling commonly used in the decoding stage tends to lose fine image structure, resulting in inaccurate disparity predictions at object boundaries and for small-sized objects.
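The joint up-sampling named in the title addresses this loss of detail by letting full-resolution guidance features steer the upsampling of coarse decoder features. The patent does not publish code; the NumPy sketch below only illustrates the general technique (interpolate the coarse map, then stack the high-resolution guidance channels so a following convolution can restore detail). All function names and shapes are illustrative assumptions.

```python
import numpy as np

def upsample2x_bilinear(x):
    """Naive 2x bilinear upsampling of a (H, W) map (align-corners style)."""
    H, W = x.shape
    rows = np.linspace(0, H - 1, 2 * H)
    cols = np.linspace(0, W - 1, 2 * W)
    r0 = np.floor(rows).astype(int); r1 = np.minimum(r0 + 1, H - 1)
    c0 = np.floor(cols).astype(int); c1 = np.minimum(c0 + 1, W - 1)
    wr = (rows - r0)[:, None]
    wc = (cols - c0)[None, :]
    top = x[np.ix_(r0, c0)] * (1 - wc) + x[np.ix_(r0, c1)] * wc
    bot = x[np.ix_(r1, c0)] * (1 - wc) + x[np.ix_(r1, c1)] * wc
    return top * (1 - wr) + bot * wr

def joint_upsample(low_res_feat, guidance_feat):
    """Joint upsampling sketch: interpolate each coarse channel, then
    concatenate the full-resolution guidance channels; a subsequent
    convolution (omitted here) would fuse them to recover fine structure.
    low_res_feat: (C1, H, W); guidance_feat: (C2, 2H, 2W)."""
    up = np.stack([upsample2x_bilinear(ch) for ch in low_res_feat])
    return np.concatenate([up, guidance_feat], axis=0)
```

In contrast to plain deconvolution, the guidance branch injects edges and textures from the encoder's high-resolution features, which is what preserves thin structures and small objects.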




Embodiment Construction

[0032] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0033] As shown in Figures 1-6, after conventional data preprocessing operations such as shuffling, cropping, and normalization are performed on the original input images, the present invention provides a binocular stereo matching method based on a joint up-sampling convolutional neural network, comprising three steps: feature extraction, matching cost aggregation, and disparity calculation.

[0034] 1) Figure 1 is a schematic diagram of the overall framework of the present invention. The input of the neural network model that performs the binocular stereo matching task is the image pair to be matched, I1 and I2; the output is the dense disparity information of the target image I1, namely the disparity map D. The network learns a function (model) f satisfying the following relation:

[0035] f(I1, I2) = D
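The steps behind this relation, as described in the Abstract (a concatenation-based matching cost volume followed by cost aggregation and regression to a sub-pixel disparity map), can be sketched in NumPy. The cost-volume layout and the soft-argmin-style regression below are standard techniques consistent with the Abstract's description, not the patent's exact implementation; names and shapes are illustrative assumptions.

```python
import numpy as np

def build_cost_volume(feat_left, feat_right, max_disp):
    """Concatenation-based cost volume: for each candidate disparity d,
    pair left features with right features shifted right by d.
    feat_*: (C, H, W). Returns (max_disp, 2C, H, W)."""
    C, H, W = feat_left.shape
    vol = np.zeros((max_disp, 2 * C, H, W), dtype=feat_left.dtype)
    for d in range(max_disp):
        vol[d, :C, :, d:] = feat_left[:, :, d:]
        vol[d, C:, :, d:] = feat_right[:, :, : W - d]
    return vol

def soft_argmin_disparity(cost):
    """Sub-pixel disparity regression: softmax over negated costs, then
    the expected disparity index. cost: (max_disp, H, W) -> (H, W)."""
    neg = -cost
    p = np.exp(neg - neg.max(axis=0, keepdims=True))
    p /= p.sum(axis=0, keepdims=True)
    disps = np.arange(cost.shape[0]).reshape(-1, 1, 1)
    return (p * disps).sum(axis=0)
```

Because the regression takes a probability-weighted average over disparity candidates, the output is continuous-valued, which is how sub-pixel precision arises from a discrete candidate set.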

[0036]...



Abstract

The invention discloses a binocular stereo matching method based on a joint up-sampling convolutional neural network. The method comprises the following steps: first, performing feature extraction on an input stereo image pair using a two-dimensional convolutional neural network based on joint up-sampling; constructing an initial three-dimensional matching cost volume by concatenating the features of the stereo image pair; further performing cost aggregation on the cost volume using three cascaded three-dimensional convolutional neural networks based on joint up-sampling; and finally obtaining a dense disparity map with sub-pixel precision by a regression method. Compared with existing deep neural networks for binocular stereo matching, the invention adopts a convolutional neural network based on pyramid joint up-sampling in the decoding stage of the feature extraction and cost aggregation steps. By fusing multi-level and multi-scale contextual feature information, more fine texture is effectively preserved during up-sampling; depthwise separable convolution with low computational complexity improves the computational efficiency of the method; and the quality of the disparity map produced by binocular stereo matching is improved.
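The Abstract credits depthwise separable convolution for the efficiency gain: per output pixel, a standard convolution costs k²·C_in·C_out multiplies, while the separable form costs only k²·C_in (depthwise) + C_in·C_out (pointwise). A minimal NumPy sketch of the technique (stride 1, no padding; an illustration of the general operation, not the patent's implementation):

```python
import numpy as np

def depthwise_separable_conv(x, dw_kernels, pw_weights):
    """Depthwise separable convolution, stride 1, no padding.
    x: (C_in, H, W); dw_kernels: (C_in, k, k), one spatial filter per
    input channel; pw_weights: (C_out, C_in), the 1x1 pointwise mix."""
    C_in, H, W = x.shape
    k = dw_kernels.shape[1]
    Ho, Wo = H - k + 1, W - k + 1
    # Depthwise stage: filter each channel independently (no channel mixing).
    dw = np.zeros((C_in, Ho, Wo), dtype=x.dtype)
    for c in range(C_in):
        for i in range(Ho):
            for j in range(Wo):
                dw[c, i, j] = np.sum(x[c, i:i + k, j:j + k] * dw_kernels[c])
    # Pointwise stage: 1x1 convolution mixes channels at each position.
    return np.einsum('oc,chw->ohw', pw_weights, dw)
```

For a 3x3 kernel with C_in = C_out = 64, the separable form needs 9·64 + 64·64 = 4672 multiplies per pixel versus 9·64·64 = 36864 for the standard convolution, roughly an 8x reduction.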

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a binocular stereo matching method based on a joint up-sampling convolutional neural network.

Background

[0002] Binocular stereo matching is a research problem that has attracted much attention in the field of computer vision. It has been widely used in various systems such as 3D reconstruction, automatic driving, autonomous robot navigation, and industrial inspection. In particular, current applications have an urgent need for real-time binocular stereo matching with high precision, high resolution, and large disparity range. This undoubtedly poses a higher challenge to the computational efficiency and accuracy of the technology. In recent years, artificial intelligence based on deep learning technology has developed rapidly, and breakthroughs have been made in the fields of object detection, image classification, and speech recognition. Binocular stere...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T3/40, G06N3/08, G06N3/04, G06K9/62
CPC: G06T3/4007, G06T3/4038, G06N3/08, G06T2200/32, G06N3/045, G06F18/22
Inventors: 张旭翀, 孙宏滨, 戴赫, 汪航, 赵永利, 郑南宁
Owner: XI AN JIAOTONG UNIV