
Depth completion method for sparse depth map, computer device and storage medium

A depth map completion technology, applied in the field of image processing, which can solve the problem that the sparse depth maps obtained by current depth perception technology cannot meet the requirement of practical applications for dense depth maps.

Pending Publication Date: 2022-05-06
中山大学深圳 +1

AI Technical Summary

Problems solved by technology

[0007] Aiming at at least one technical problem, such as that the sparse depth map obtained by current depth perception technology cannot meet the requirement of practical applications for dense depth maps, the purpose of the present invention is to provide a depth completion method for sparse depth maps, a computer device and a storage medium.



Examples


Detailed Description of Embodiments

[0051] In this embodiment, the depth completion method of the sparse depth map includes the following steps:

[0052] S1. Train the neural network;

[0053] S2. Obtain the image to be processed;

[0054] S3. Use the trained neural network to perform depth completion on the image to be processed.
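
The three steps above can be summarized, at a high level, as the sketch below. This is a minimal illustration assuming a PyTorch-style network whose forward pass takes the color image and its sparse depth map; all names used (complete_depth, trained_net) are hypothetical and are not taken from the patent.

```python
# Minimal sketch of steps S2-S3, assuming a PyTorch-style model
# (S1, training, is sketched further below). Names are illustrative,
# not from the patent.
import torch

def complete_depth(trained_net: torch.nn.Module,
                   rgb: torch.Tensor,
                   sparse_depth: torch.Tensor) -> torch.Tensor:
    """S2/S3: take the image to be processed and return a dense depth map."""
    trained_net.eval()                       # inference mode
    with torch.no_grad():                    # no gradients needed at test time
        return trained_net(rgb, sparse_depth)
```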

[0055] In step S1, the neural network to be trained includes a sampling module, a first encoding network, a first decoding network, a pixel correlation calculation module, a pixel correlation optimization module and an image consistency optimization module. Through training, the neural network acquires the capability of depth completion. Therefore, before the depth completion method itself is described, the neural network training method, that is, step S1, is described first.
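
A speculative skeleton of this module composition is sketched below. The internal layers (plain convolutions, a 3x3 correlation window, three filtering iterations) are placeholder assumptions for illustration only; the patent text shown here does not disclose the actual structure of these modules.

```python
# Speculative skeleton of the network described in paragraph [0055]. Layer
# choices and the 3x3 / 3-iteration settings are assumptions, not patent details.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepthCompletionNet(nn.Module):
    def __init__(self, feat: int = 32):
        super().__init__()
        # Encoder/decoder over the concatenated color image (3 ch) + sparse depth (1 ch).
        self.encoder = nn.Sequential(nn.Conv2d(4, feat, 3, padding=1), nn.ReLU())
        self.decoder = nn.Conv2d(feat, 1, 3, padding=1)         # regresses an initial depth map
        self.pixel_corr = nn.Conv2d(feat, 9, 3, padding=1)      # correlation with 3x3 neighbours
        self.consistency = nn.Conv2d(feat + 1, 1, 3, padding=1) # image-consistency refinement

    def forward(self, rgb: torch.Tensor, sparse_depth: torch.Tensor) -> torch.Tensor:
        feats = self.encoder(torch.cat([rgb, sparse_depth], dim=1))
        depth = self.decoder(feats)                             # initial depth map
        weights = torch.softmax(self.pixel_corr(feats), dim=1)  # normalized pixel correlations
        for _ in range(3):                                      # iterative filtering of the depth map
            neigh = F.unfold(depth, kernel_size=3, padding=1)   # B x 9 x (H*W)
            neigh = neigh.view(depth.size(0), 9, depth.size(2), depth.size(3))
            depth = (weights * neigh).sum(dim=1, keepdim=True)
        # Consistency-based refinement of the filtered (dense) depth map.
        return depth + self.consistency(torch.cat([feats, depth], dim=1))
```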

[0056] Referring to figure 1, the neural network training method, that is, the process of training the neural network in step S1, includes the following steps:

[0057] P1. Obtain a color image;

[00...



Abstract

The invention discloses a depth completion method for a sparse depth map, a computer device and a storage medium. The process of training a neural network in the depth completion method comprises the following steps: obtaining a color image and a ground-truth depth map, and performing equidistant sampling of the ground-truth depth map to obtain a sparse depth map; extracting features from the color image and the sparse depth map to obtain a multi-scale feature map; regressing the multi-scale feature map to obtain an initial depth map; calculating pixel correlations; performing multiple iterations of filtering on the initial depth map to obtain a dense depth map; and performing multiple rounds of iterative processing on the dense depth map to train an image consistency optimization module. By training the neural network in this way, the prediction of pixel depth at the boundaries of the dense depth map by the image consistency optimization module is improved, so that the module gains the capability of predicting pixel depth from the dense depth map and the neural network gains the capability of depth completion. The method is widely applicable in the technical field of image processing.
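
The most concrete step in this summary is the equidistant (grid) sampling of the ground-truth depth map used to synthesize the sparse training input. A small sketch of that step is given below; the 8-pixel stride is an assumed value, as the text shown here does not specify the sampling interval.

```python
# Sketch of the equidistant sampling step from the abstract: keep ground-truth
# depth only on a regular grid to synthesize a sparse depth map for training.
# The stride is an assumption; the patent excerpt does not state its value.
import numpy as np

def equidistant_sample(depth_gt: np.ndarray, stride: int = 8) -> np.ndarray:
    """Return a sparse depth map with valid values only every `stride` pixels."""
    sparse = np.zeros_like(depth_gt)
    sparse[::stride, ::stride] = depth_gt[::stride, ::stride]
    return sparse

# Example: a 480x640 ground-truth depth map keeps roughly 60x80 valid samples.
sparse = equidistant_sample(np.random.rand(480, 640).astype(np.float32))
```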

Description

Technical field

[0001] The present invention relates to the technical field of image processing, and in particular to a depth completion method for a sparse depth map, a computer device and a storage medium.

Background technique

[0002] Depth perception is a basic task of 3D vision. Through depth perception, the distance between the point on an object represented by each pixel in an image and a certain reference plane can be obtained, which provides important control parameters for applications such as autonomous driving, robotics and augmented reality. Depth perception technology has developed rapidly in recent years, but obtaining high-precision, high-resolution depth maps at low cost is still a challenging task. Low-cost, low-power depth sensors can usually only obtain low-resolution, very sparse depth maps that lack depth information for a large number of pixels, while practical applications require dense depth maps.

[0003] Terminology Explanation: [...

Claims


Application Information

IPC(8): G06T7/50; G06V10/82; G06N3/04; G06N3/08
CPC: G06T7/50; G06N3/08; G06T2207/20081; G06T2207/20084; G06N3/045
Inventor: 郭裕兰, 杜沛峰, 胡俊
Owner: 中山大学深圳