
A Binocular Stereo Matching Method Based on Joint Upsampling Convolutional Neural Network

A binocular stereo matching method based on convolutional neural network technology, applied in the field of computer vision. It addresses problems such as the easy loss of fine image structure information and inaccurate disparity predictions at object boundaries or for small objects, and achieves the effect of reducing the number of parameters without reducing precision.

Active Publication Date: 2022-03-01
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

Although the encoder can obtain rich semantic information by successively downsampling the spatial resolution, the deconvolution upsampling commonly used in the decoding process tends to lose fine image structure information, resulting in inaccurate disparity predictions at object boundaries and for small objects.
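The joint-upsampling idea behind the proposed remedy can be illustrated in one dimension: instead of interpolating the low-resolution signal blindly, each upsampled value weights its low-resolution neighbours by how similar the high-resolution guidance signal is at the corresponding positions. The following NumPy sketch shows the general joint (guided) upsampling idea only, not the patent's pyramid module; the function name and parameters are hypothetical.

```python
import numpy as np

def joint_upsample_1d(low, guide, sigma_r=0.1):
    """Upsample `low` to len(guide), weighting each low-res neighbour by
    how similar the high-res `guide` signal is (joint/guided upsampling)."""
    scale = len(guide) // len(low)
    out = np.empty(len(guide), dtype=float)
    for i in range(len(guide)):
        j = i // scale                            # nearest low-res sample
        idx = np.clip([j - 1, j, j + 1], 0, len(low) - 1)
        # range weights come from the guidance signal, not from `low` itself
        w = np.exp(-(guide[idx * scale] - guide[i]) ** 2 / (2 * sigma_r ** 2))
        out[i] = np.sum(w * low[idx]) / np.sum(w)
    return out

# A sharp edge in the guidance keeps the upsampled disparity edge sharp,
# where plain linear interpolation would smear it across several pixels.
guide = np.array([0.0] * 8 + [1.0] * 8)   # high-res image row with an edge
low = np.array([0.0, 0.0, 1.0, 1.0])      # low-res disparity (factor 4)
up = joint_upsample_1d(low, guide)
```

Because the weights are driven by the guidance image, low-resolution values on the wrong side of an edge receive negligible weight, which is exactly the "preserve fine structure during upsampling" behaviour the paragraph above is after.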

Method used


Image

  • A Binocular Stereo Matching Method Based on Joint Upsampling Convolutional Neural Network

Examples


Embodiment Construction

[0032] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0033] As shown in Figures 1-6, after conventional data preprocessing operations such as shuffling, cropping, and normalization are performed on the original input images, the present invention provides a binocular stereo matching method based on a joint upsampling convolutional neural network, which comprises three steps: feature extraction, matching cost aggregation, and disparity calculation:

[0034] 1) Figure 1 is a schematic diagram of the overall framework of the present invention. The input to the neural network model that performs the binocular stereo matching task is the image pair to be matched, I1 and I2, and the output is the dense disparity information of the target image I1, that is, the disparity map D. The network learns a function (model) f satisfying the following relation:

f(I1, I2) = D
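The relation above only pins down the model's interface: two images in, one dense disparity map out. As a shape-level illustration, a classical winner-take-all block matcher already realizes the same signature; this NumPy sketch is a hypothetical toy stand-in for the learned network, not the patent's method.

```python
import numpy as np

def f(I1, I2, max_disp=8):
    """Toy stand-in for the learned model f(I1, I2) = D: per-pixel absolute
    difference cost over candidate disparities, winner-take-all argmin."""
    H, W = I1.shape
    cost = np.full((max_disp, H, W), np.inf)
    for d in range(max_disp):
        # left pixel x is compared against right pixel x - d
        cost[d, :, d:] = np.abs(I1[:, d:] - I2[:, :W - d])
    return cost.argmin(axis=0)          # D: dense disparity map, shape (H, W)
```

The learned model replaces the hand-crafted absolute-difference cost and argmin with learned features, 3-D cost aggregation, and soft regression, but the input/output contract stays exactly f(I1, I2) = D.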

[0036] Sp...



Abstract

A binocular stereo matching method based on a joint upsampling convolutional neural network. The method first uses a two-dimensional convolutional neural network based on joint upsampling to extract features from the input stereo image pair, then combines the features of the stereo image pair to construct an initial three-dimensional matching cost volume, then uses three cascaded three-dimensional convolutional neural networks based on joint upsampling to aggregate the matching cost volume, and finally obtains a dense disparity map with sub-pixel accuracy using a regression method. Compared with existing binocular stereo matching deep neural networks, the present invention adopts a convolutional neural network based on pyramid joint upsampling in the decoding stages of the feature extraction and cost aggregation steps. By fusing multi-level and multi-scale contextual feature information, it can effectively retain more detailed texture during upsampling, and by using depthwise separable convolutions with low computational complexity it improves the computational efficiency of the method and the quality of the disparity map produced by binocular stereo matching.
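The pipeline described in the abstract can be miniaturized to its two non-learned stages: build a 3-D cost volume by sliding the right feature map over candidate disparities, then regress sub-pixel disparity with a softmax-weighted expectation over the disparity axis (the soft-argmin commonly used in stereo networks). This NumPy sketch omits the 2-D/3-D CNNs and the joint upsampling modules; function names are hypothetical.

```python
import numpy as np

def cost_volume(f_left, f_right, max_disp):
    """Initial 3-D matching cost volume from (C, H, W) feature maps:
    cost[d] compares left features with right features shifted right by d."""
    C, H, W = f_left.shape
    vol = np.zeros((max_disp, H, W))
    for d in range(max_disp):
        shifted = np.zeros_like(f_right)
        shifted[:, :, d:] = f_right[:, :, :W - d]
        vol[d] = -(f_left * shifted).sum(axis=0)   # negative correlation
    return vol

def soft_argmin(vol):
    """Sub-pixel disparity regression: softmax over the negated costs,
    then an expectation over the candidate disparities."""
    s = -vol - (-vol).max(axis=0)                  # stabilized logits
    p = np.exp(s)
    p /= p.sum(axis=0)
    d = np.arange(vol.shape[0]).reshape(-1, 1, 1)
    return (p * d).sum(axis=0)
```

Because the expectation mixes neighbouring disparity hypotheses continuously, the regression can land between integer candidates, which is where the sub-pixel accuracy claimed in the abstract comes from.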

Description

Technical field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a binocular stereo matching method based on a joint upsampling convolutional neural network.

Background technique

[0002] Binocular stereo matching is a research problem that has attracted much attention in the field of computer vision. It has been widely applied in systems such as 3D reconstruction, automatic driving, autonomous robot navigation, and industrial inspection. In particular, current applications have an urgent need for real-time binocular stereo matching with high precision, high resolution, and large disparity range, which poses higher challenges to the computational efficiency and accuracy of the technique. In recent years, artificial intelligence based on deep learning has developed rapidly, and breakthroughs have been made in fields such as object detection, image classification, and speech recognition. Binocular stere...
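One of the efficiency levers named in the abstract is the depthwise separable convolution, which factorizes a standard convolution into a per-channel spatial convolution followed by a 1x1 pointwise mix. The parameter saving is easy to verify by counting weights; the sketch below is a back-of-the-envelope illustration (bias terms ignored), not the patent's layer configuration.

```python
def conv_params(c_in, c_out, k):
    """Weights in a standard k x k convolution (biases ignored)."""
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    """k x k depthwise conv (one filter per input channel)
    followed by a 1 x 1 pointwise convolution."""
    return c_in * k * k + c_in * c_out

# e.g. a 3 x 3 layer mapping 32 -> 64 channels
std = conv_params(32, 64, 3)                  # 32*64*9  = 18432 weights
sep = depthwise_separable_params(32, 64, 3)   # 288 + 2048 = 2336 weights
```

For 3x3 kernels the factorized form needs roughly an eighth of the weights of the dense convolution, with a matching reduction in multiply-accumulates, which is why it is a natural fit for the real-time requirement discussed above.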

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T3/40, G06N3/08, G06N3/04, G06V10/74
CPC: G06T3/4007, G06T3/4038, G06N3/08, G06T2200/32, G06N3/045, G06F18/22
Inventors: Zhang Xuchong (张旭翀), Sun Hongbin (孙宏滨), Dai He (戴赫), Wang Hang (汪航), Zhao Yongli (赵永利), Zheng Nanning (郑南宁)
Owner: XI AN JIAOTONG UNIV