
Depth map confidence estimation method based on convolutional neural network

A convolutional neural network and confidence-estimation technology, applied in the field of depth map quality assessment, that solves problems such as low precision and the inability to make full use of multi-modal data, and achieves an effect that benefits post-processing.

Pending Publication Date: 2021-07-09
苏州中科广视文化科技有限公司

AI Technical Summary

Problems solved by technology

[0002] Depth map quality assessment is a fundamental topic in computer vision. Current deep-learning-based methods take the depth map output by monocular or binocular stereo matching together with the original color image and construct a convolutional neural network to predict a confidence map. These methods cannot make full use of the multi-modal data produced by multi-view stereo matching, and their simple network structures yield low accuracy.

Method used




Embodiment Construction

[0034] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided for a more thorough understanding of the present disclosure and to fully convey its scope to those skilled in the art.

[0035] The present invention proposes a depth map confidence estimation network based on multi-view stereo matching, which preliminarily solves this problem. The truncated signed distance function (TSDF) map and color map generated by the multi-view stereo matching algorithm, together with the normal map, are fed to a U-Net structure for feature extraction, and the confidence of the depth map is predicted from the ...
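The network above consumes a TSDF map, a color map, and a normal map derived from the depth map. As a hedged illustration of the normal-map input channel, the sketch below estimates per-pixel surface normals from image-space depth gradients; this is a common orthographic approximation, not the patent's exact formulation (which is not disclosed in this excerpt), and the function name is my own.

```python
import numpy as np

def normal_map_from_depth(depth):
    """Approximate per-pixel surface normals from a depth map.

    A minimal sketch: normals are estimated from image-space depth
    gradients under an orthographic simplification. The patent does
    not specify the camera model; this is an illustrative assumption.
    """
    depth = depth.astype(np.float64)
    # np.gradient returns derivatives along axis 0 (rows) then axis 1 (cols).
    dz_dy, dz_dx = np.gradient(depth)
    # The unnormalized normal direction is roughly (-dz/dx, -dz/dy, 1).
    n = np.dstack((-dz_dx, -dz_dy, np.ones_like(depth)))
    length = np.linalg.norm(n, axis=2, keepdims=True)
    return n / np.maximum(length, 1e-8)

# A planar (constant) depth map should yield normals along +z.
plane = np.full((4, 4), 2.0)
normals = normal_map_from_depth(plane)
```

In a full pipeline, such a normal map would be stacked with the TSDF and color channels before entering the U-Net encoder.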


PUM

No PUM

Abstract

The invention discloses a depth map confidence estimation method based on a convolutional neural network, used for quality evaluation and post-processing of depth maps generated by a multi-view stereo matching algorithm. The method comprises the following steps: computing a truncated signed distance function (TSDF) map and a normal map from the depth map generated by the multi-view stereo matching algorithm; performing feature extraction on the TSDF map, the normal map, and the color map with a U-Net structure to obtain a feature map; and predicting the confidence of the depth map from the feature map, and refining the estimate, using a convolutional long short-term memory (ConvLSTM) structure, a prediction module, a refinement module, and a multi-supervision method. The method can evaluate the quality of depth maps generated by various multi-view stereo matching algorithms and can robustly estimate depth map confidence in multi-view stereo matching, thereby facilitating both the evaluation of such algorithms and the post-processing of depth maps.
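The abstract names two operations that can be sketched concretely: the truncated signed distance used as an input channel, and the confidence-driven post-processing of the depth map. The snippet below is a hedged illustration only; the patent does not disclose its truncation threshold or filtering rule, so `trunc` and `threshold` are assumed values and both function names are my own.

```python
import numpy as np

def truncated_signed_distance(depth, surface_depth, trunc=0.05):
    """Signed distance between observed and candidate surface depth,
    clipped to [-trunc, trunc] and scaled to [-1, 1] -- the usual TSDF
    form. The patent's exact truncation threshold is not specified."""
    return np.clip((surface_depth - depth) / trunc, -1.0, 1.0)

def filter_depth_by_confidence(depth, confidence, threshold=0.5):
    """One simple post-processing use of the predicted confidence map:
    invalidate (zero out) depth pixels whose confidence is below the
    threshold, keeping only reliable measurements."""
    out = depth.copy()
    out[confidence < threshold] = 0.0
    return out

depth = np.array([[1.0, 2.0], [3.0, 4.0]])
conf = np.array([[0.9, 0.2], [0.8, 0.4]])
filtered = filter_depth_by_confidence(depth, conf)
```

Downstream consumers (e.g. depth fusion) would then ignore the zeroed pixels rather than fuse unreliable depth.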

Description

Technical Field
[0001] The present invention relates to the fields of computer vision and deep learning, and specifically to feature extraction and confidence prediction on intermediate results of multi-view stereo matching using a convolutional neural network, so as to complete the quality assessment of depth maps.
Background
[0002] Depth map quality assessment is a fundamental topic in computer vision. Current deep-learning-based methods take the depth map output by monocular or binocular stereo matching together with the original color image and construct a convolutional neural network to predict a confidence map. These methods cannot make full use of the multi-modal data produced by multi-view stereo matching, and their simple network structures yield low accuracy.
Contents of the Invention
[0003] The purpose of the present invention is to remedy the deficiencies of the prior art by providing a depth map confidence esti...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/55; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/55; G06N3/049; G06N3/08; G06T2207/10028; G06T2207/10024; G06V10/40; G06N3/048; G06N3/045; G06N3/044; G06F18/22
Inventors: 李兆歆, 王兆其, 张小格, 朱登明, 朱正刚
Owner: 苏州中科广视文化科技有限公司