
A Binocular Depth Estimation Method for Driving Scenes Overcoming Occlusion Effect

A depth estimation technology for driving scenes, applied in the fields of machine vision and autonomous driving. It addresses the problems that occlusion prevents a reconstructed view from being fully consistent with the original view and that this inconsistency degrades global accuracy; the method deals with occlusion effects, enhances robustness, and overcomes the occlusion effect.

Active Publication Date: 2022-08-05
WUHAN UNIV

AI Technical Summary

Problems solved by technology

Although this self-supervised training mechanism avoids the trouble of collecting real scene depth information, the occlusion effect makes it impossible for the reconstructed view to be completely consistent with the original view, which lowers global accuracy. As a result, a network model trained in a self-supervised manner is typically far less accurate than a network model trained in a supervised manner.
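The inconsistency described above can be made concrete with a small sketch of photometric view reconstruction, the core of such self-supervised training. This is an illustrative NumPy example, not the patent's actual network: the right view is warped into the left camera frame using a predicted disparity map, and a photometric loss compares the reconstruction with the real left view. Pixels that are occluded or shifted out of frame have no valid correspondence, which is exactly why the reconstructed view can never fully match the original.

```python
import numpy as np

def reconstruct_left_view(right_img, left_disparity):
    """Warp the right view into the left camera frame using the left
    disparity map (horizontal shift only, rectified pair assumed).
    Pixels whose source column falls outside the image cannot be
    reconstructed and are marked invalid."""
    h, w = left_disparity.shape
    recon = np.zeros_like(right_img, dtype=float)
    valid = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            # For a rectified pair, left pixel x matches right pixel x - d.
            src_x = int(round(x - left_disparity[y, x]))
            if 0 <= src_x < w:
                recon[y, x] = right_img[y, src_x]
                valid[y, x] = True
    return recon, valid

def photometric_loss(left_img, recon, valid):
    """Mean absolute photometric error over reconstructable pixels only."""
    diff = np.abs(np.asarray(left_img, dtype=float) - recon)
    return diff[valid].mean()
```

On a synthetic pair built with a known constant disparity, the loss is zero over the valid region, while the columns with no correspondence (analogous to occluded regions) remain unreconstructable.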

Method used



Embodiment Construction

[0053] In order to clarify the purpose, technical solutions and features of the embodiments of the present invention, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are some, but not all, of the ways of implementing the present invention; the descriptions are merely selected embodiments and are not intended to limit the scope of protection of the claimed invention. All other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.



Abstract

The invention discloses a binocular depth estimation method for driving scenes that overcomes the occlusion effect. The method constructs an end-to-end self-supervised deep neural network model that takes the left and right views of a driving scene as input and outputs the disparity maps corresponding to the two views. The model is trained using the geometric constraint relationship between its input and output, so it does not need data samples with labeled information: only image pairs of left and right views captured by a binocular camera system are required. This greatly simplifies the workflow, saves economic cost, and allows the model to be trained on images of a wider variety of scenes. The binocular estimation method designed by the present invention can effectively overcome the problems of repeated pattern textures and occlusion between foreground and background objects in the scene, and can obtain a higher-precision depth image.
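One standard way to exploit the geometric constraint between the two output disparity maps, and to detect the occluded pixels the abstract refers to, is a left-right consistency check. The following is a minimal illustrative sketch (not taken from the patent text): a left-view pixel is flagged as occluded when its disparity disagrees with the disparity at its matched location in the right view, or when it has no match at all. The tolerance `thresh` is an assumed parameter.

```python
import numpy as np

def occlusion_mask_from_lr_check(disp_left, disp_right, thresh=1.0):
    """Flag left-view pixels as occluded via left-right consistency:
    pixel (y, x) with disparity d matches right-view pixel (y, x - d);
    large disagreement between the two disparities, or a match outside
    the image, marks the pixel as occluded."""
    h, w = disp_left.shape
    occluded = np.ones((h, w), dtype=bool)  # no match found => occluded
    for y in range(h):
        for x in range(w):
            xr = int(round(x - disp_left[y, x]))
            if 0 <= xr < w:
                occluded[y, x] = abs(disp_left[y, x] - disp_right[y, xr]) > thresh
    return occluded
```

During self-supervised training, such a mask is typically used to exclude occluded pixels from the photometric reconstruction loss, so that pixels with no true correspondence do not drag down global accuracy.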

Description

technical field

[0001] The invention relates to the fields of machine vision and automatic driving, and in particular to a binocular depth estimation method for driving scenes that overcomes occlusion effects using self-supervised deep learning technology.

background technique

[0002] With the advancement of artificial intelligence technology, autonomous driving has been widely studied in academia and industry. As an important part of autonomous driving technology, binocular depth estimation has always been a research hotspot. Binocular depth estimation uses a binocular camera to shoot two views, left and right, obtains the corresponding disparity map from the two views, and then calculates the depth image according to the binocular camera parameters.

[0003] Traditional binocular depth estimation uses a stereo matching method to find matching corresponding points in the left and right views. However, due to the existence of the occlusion effect…
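The conversion from disparity to depth mentioned in paragraph [0002] follows the standard rectified-stereo relation Z = f·B / d, where f is the focal length in pixels, B the camera baseline, and d the disparity. A minimal sketch (parameter names are illustrative, not from the patent text):

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m, eps=1e-6):
    """Convert a disparity map (in pixels) to metric depth via Z = f*B/d.
    focal_px: focal length in pixels; baseline_m: baseline in metres.
    Near-zero disparities map to infinite depth (points at infinity)."""
    d = np.asarray(disparity, dtype=float)
    return np.where(d > eps, focal_px * baseline_m / np.maximum(d, eps), np.inf)
```

For example, with a 640 px focal length and a 0.5 m baseline, a disparity of 64 px corresponds to a depth of 5 m; this inverse relation is why disparity errors on distant (small-disparity) points translate into large depth errors.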

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/55; G06N3/04; G06N3/08
CPC: G06T7/55; G06N3/08; G06N3/045
Inventors: 邹勤, 黄立
Owner: WUHAN UNIV