
Driving scene binocular depth estimation method for overcoming shielding effect

A technology relating to driving scenes and occlusion effects, applied in the field of binocular depth estimation for driving scenes, which can solve problems such as poor precision, a drop in global accuracy, and reconstructed views that cannot be made fully consistent with the original views.

Active Publication Date: 2020-05-05
WUHAN UNIV

AI Technical Summary

Problems solved by technology

Although this self-supervised training mechanism avoids the trouble of collecting real scene depth information, occlusion effects make it impossible for the reconstructed view to be completely consistent with the original view, which leads to a decrease in global accuracy. As a result, a network model trained in a self-supervised manner is typically much less accurate than one trained in a supervised manner.
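
To make this failure mode concrete, the sketch below illustrates the general self-supervised mechanism described here, not the patent's exact formulation: one view is reconstructed from the other using a predicted disparity map and compared photometrically with the real view, optionally excluding pixels flagged as occluded. The use of PyTorch, the function names, and the masking scheme are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def warp_with_disparity(src, disp):
    """Reconstruct the left view by sampling the right view at x - d.

    src:  (B, 3, H, W) source image (e.g. the right view)
    disp: (B, 1, H, W) horizontal disparity of the target (left) view, in pixels
    """
    b, _, h, w = src.shape
    # Pixel-coordinate grid; shift x-coordinates by the predicted disparity.
    ys, xs = torch.meshgrid(torch.arange(h, device=src.device),
                            torch.arange(w, device=src.device), indexing="ij")
    xs = xs.unsqueeze(0).float() - disp.squeeze(1)        # sample right view at x - d
    ys = ys.unsqueeze(0).float().expand_as(xs)
    # Normalize coordinates to [-1, 1] as required by grid_sample.
    grid = torch.stack(((xs / (w - 1)) * 2 - 1,
                        (ys / (h - 1)) * 2 - 1), dim=-1)  # (B, H, W, 2)
    return F.grid_sample(src, grid, align_corners=True, padding_mode="border")

def photometric_loss(reconstructed, target, mask=None):
    """Mean L1 photometric error; an optional 0/1 mask excludes occluded pixels."""
    err = (reconstructed - target).abs().mean(dim=1, keepdim=True)
    if mask is not None:
        return (err * mask).sum() / mask.sum().clamp(min=1.0)
    return err.mean()
```

Without the mask, pixels that are visible in one view but occluded in the other contribute an irreducible photometric error, which is precisely the source of the global-accuracy drop described above.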




Embodiment Construction

[0053] In order to make the purpose, technical solutions and features of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Apparently, the described embodiments are only some, not all, of the ways of implementing the present invention; all other implementations obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention. The following description of the embodiments provided with reference to the drawings is therefore not intended to limit the protection scope of the claimed invention, but merely represents selected embodiments of the present invention.



Abstract

The invention discloses a driving scene binocular depth estimation method capable of overcoming the shielding (occlusion) effect. An end-to-end self-supervised deep neural network model is constructed: images of the left and right views of a driving scene are input, and the disparity maps corresponding to the left and right views are output. The model is trained using the geometric constraint relation between the input and the output of the deep neural network model, so no data samples with annotation information need to be collected; only a binocular camera system is required to obtain the left-view and right-view image pairs. This greatly simplifies the working process, saves economic cost, and allows the model to be trained on images of more types of scenes. The binocular estimation method designed by the invention can effectively overcome problems such as repeated pattern textures and front-and-back object occlusion in a scene, and can obtain a depth image with relatively high precision.
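
The abstract does not spell out how occluded pixels are identified. One common approach that is consistent with the stated "geometric constraint relation between the input and the output" is a left-right disparity consistency check; the sketch below is a hypothetical illustration of that idea (not a statement of the patent's method), with PyTorch and all names assumed for illustration.

```python
import torch

def occlusion_mask(disp_left, disp_right, threshold=1.0):
    """Flag pixels whose left and right disparities disagree as likely occluded.

    disp_left, disp_right: (B, 1, H, W) disparity maps in pixels.
    Returns a (B, 1, H, W) float mask: 1 = consistent (visible), 0 = likely occluded.
    """
    b, _, h, w = disp_left.shape
    xs = torch.arange(w, device=disp_left.device).view(1, 1, 1, w).float()
    # Position each left-view pixel maps to in the right view: x - d_left.
    xr = (xs - disp_left).clamp(0, w - 1)
    # Look up the right-view disparity at that position (nearest neighbour for brevity).
    idx = xr.round().long()
    disp_right_at_xr = torch.gather(disp_right, dim=3, index=idx)
    # Visible pixels should satisfy d_left(x) ≈ d_right(x - d_left(x)).
    return ((disp_left - disp_right_at_xr).abs() < threshold).float()
```

Such a mask can be combined with a photometric reconstruction loss so that pixels visible in only one view do not penalize the network during self-supervised training.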

Description

Technical field

[0001] The invention relates to the fields of machine vision and automatic driving, and in particular to a binocular depth estimation method for driving scenes that uses self-supervised deep learning technology to overcome occlusion effects.

Background technique

[0002] With the advancement of artificial intelligence technology, automatic driving has been extensively studied in academia and industry. As an important part of automatic driving technology, binocular depth estimation has always been a research hotspot. Binocular depth estimation is based on a binocular camera: the left and right views are captured, the corresponding disparity maps are computed from the left and right views, and the depth image is then calculated from the disparities according to the binocular camera parameters.

[0003] Traditional binocular depth estimation adopts stereo matching to find matching corresponding points in the left and right views. However, due to the occlusion between front and back objects in the scene, pixels in occluded regions have no corresponding points in the other view, ...
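
For concreteness, the disparity-to-depth step mentioned in [0002] follows the standard rectified-stereo relation Z = f · B / d, with focal length f in pixels and baseline B in metres. The snippet below is a small illustration with made-up, KITTI-like numbers; neither the function name nor the values are taken from the patent.

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity in pixels to metric depth for a rectified stereo rig."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: f = 720 px, B = 0.54 m (a KITTI-like setup); a disparity of 30 px
# gives a depth of 720 * 0.54 / 30 ≈ 12.96 m.
print(disparity_to_depth(30.0, 720.0, 0.54))
```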


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T 7/55; G06N 3/04; G06N 3/08
CPC: G06T 7/55; G06N 3/08; G06N 3/045
Inventor: 邹勤, 黄立
Owner: WUHAN UNIV