
Real-time dense depth estimation method based on sparse measurement and monocular RGB image

An RGB-image and depth-estimation technology, applied to image enhancement, image analysis, and image data processing. It addresses problems such as limited computing and storage resources, computationally intensive algorithms that cannot easily be adopted, and insufficient utilization of sparse information, achieving the effects of maximizing task performance, reducing differences, and improving convergence speed.

Pending Publication Date: 2020-12-25
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

[0003] Compared with depth estimation from only a single RGB or grayscale image, a major advantage of sparse-sample-based methods is that the sparse depth measurements can be regarded as part of the output ground truth. However, most current depth estimation methods based on sparse samples follow a network design similar to that of single-frame RGB methods, which leads to insufficient utilization of the sparse information.
To address this problem, the present invention uses a self-attention mechanism and long-short dense skip connections to further improve the accuracy of sparse-sample-based depth estimation. In addition, past research on monocular depth estimation has focused almost exclusively on improving accuracy, producing computationally intensive algorithms that cannot easily be adopted in robotic systems, since most such systems have limited computing and storage resources. Especially for tiny devices, a key challenge is to balance computation time against algorithm accuracy.
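The two mechanisms named above can be sketched as follows. This is a minimal illustrative implementation, not the patent's actual architecture: the layer sizes, the non-local attention formulation, and the channel-wise concatenation used for the dense skips are all assumptions.

```python
# Hypothetical sketch: a spatial self-attention block plus dense skip reuse.
import torch
import torch.nn as nn

class SelfAttention2d(nn.Module):
    """Non-local-style self-attention over spatial positions of a feature map."""
    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, hw, c/8)
        k = self.key(x).flatten(2)                     # (b, c/8, hw)
        attn = torch.softmax(q @ k, dim=-1)            # (b, hw, hw)
        v = self.value(x).flatten(2)                   # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                    # residual connection

# Dense skips: a decoder stage sees all earlier (long and short range)
# feature maps, concatenated channel-wise.
feats = [torch.randn(1, 16, 32, 32) for _ in range(3)]
fused = torch.cat(feats, dim=1)
print(fused.shape)  # torch.Size([1, 48, 32, 32])
```

Because `gamma` is initialized to zero, the attention block starts as an identity mapping and the network can learn how much global context to mix in, a common trick for stabilizing training.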

Method used




Detailed Description of Embodiments

[0029] The present invention will be further explained below in conjunction with the accompanying drawings and specific embodiments. It should be understood that the following specific embodiments are only used to illustrate the present invention and are not intended to limit the scope of the present invention.

[0030] The present invention uses the indoor dataset NYU-Depth-v2 and the outdoor dataset KITTI as experimental datasets to verify the proposed method of real-time dense depth estimation based on sparse measurements and monocular RGB images. The experimental platform comprises PyTorch 0.4.1, Python 3.6, Ubuntu 16.04, and an NVIDIA Titan V GPU. The NYU-Depth-v2 dataset consists of high-quality 480×640 RGB and depth data collected with a Kinect. According to the official split of the data, 249 scenes containing 26331 images are used for training, and 215 scenes containing 654 images are used for testing. The KITTI odometry dataset consists of 22 sequences, including camera and ...
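A common way to run such experiments is to simulate the sparse input by retaining only a tiny fraction of valid pixels from the dense ground-truth depth map. The sketch below uses the 1/10000 sampling rate from the abstract; the array shapes, depth range, and uniform random sampler are illustrative assumptions, not the patent's procedure.

```python
# Hypothetical sketch: simulate sparse depth measurements from dense ground truth.
import numpy as np

def sparse_sample(depth: np.ndarray, rate: float, seed=None) -> np.ndarray:
    """Return a copy of `depth` with roughly `rate` of pixels kept, rest zeroed."""
    rng = np.random.default_rng(seed)
    mask = rng.random(depth.shape) < rate     # Bernoulli mask at the target rate
    return np.where(mask, depth, 0.0)

dense = np.random.uniform(0.5, 10.0, size=(480, 640))  # fake dense depth map (meters)
sparse = sparse_sample(dense, rate=1 / 10000, seed=0)
print(int((sparse > 0).sum()))  # roughly 480*640/10000 ≈ 31 retained samples
```

At this rate a 480×640 frame keeps only about 30 depth points, which is why the network design has to extract as much information as possible from each sparse sample.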



Abstract

The invention discloses a real-time dense depth estimation method based on sparse measurement and a monocular RGB image. The method extracts more useful information from the sparse depth measurements by employing a self-attention mechanism and long- and short-range dense skip connections. A lightweight network design for real-time depth estimation is also provided, combined with a deep supervision technique. Experimental results verify the effectiveness of the self-attention mechanism, the dense skip connections, and the deep supervision technique, and show that the method balances network prediction accuracy against inference speed to the greatest extent, obtaining maximum efficiency. For depth estimated in real time by the method, at a sparse sampling rate below 1/10000, the error on the indoor dataset NYU-Depth-v2 is within 30 cm and the error on the outdoor dataset KITTI is within 4 m.
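The deep supervision technique mentioned above can be illustrated as follows: intermediate decoder outputs each receive their own loss against a resized copy of the ground truth, which tends to improve convergence speed. The three-scale setup and the L1 criterion are illustrative assumptions, not the patent's exact loss.

```python
# Hypothetical sketch of a deep-supervision loss over multi-scale side outputs.
import torch
import torch.nn.functional as F

def deep_supervision_loss(preds, target, weights=(0.25, 0.5, 1.0)):
    """Weighted sum of L1 losses, one per intermediate prediction scale."""
    total = target.new_zeros(())
    for pred, w in zip(preds, weights):
        # resize the ground truth to match each side output's resolution
        t = F.interpolate(target, size=pred.shape[-2:], mode="nearest")
        total = total + w * F.l1_loss(pred, t)
    return total

target = torch.rand(1, 1, 32, 32)                       # fake dense depth target
preds = [torch.rand(1, 1, s, s) for s in (8, 16, 32)]   # coarse-to-fine outputs
loss = deep_supervision_loss(preds, target)
```

Weighting the auxiliary losses less than the full-resolution loss is a typical design choice, so the final output still dominates training while the intermediate layers receive a direct gradient signal.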

Description

Technical Field

[0001] The invention belongs to the technical field of robot vision positioning and navigation, and in particular relates to a real-time dense depth estimation method based on sparse measurement and monocular RGB images.

Background

[0002] Dense depth estimation plays an important role in fields such as unmanned aerial vehicles, intelligent navigation, and augmented reality. The current mainstream depth acquisition scheme pairs a high-resolution camera with a low-resolution depth sensor. These sensors are usually expensive and cannot obtain dense depth, so they are impractical for most applications. Furthermore, the accuracy and reliability of RGB-based depth estimation is far from practical, despite more than a decade of research devoted to improving it through deep learning methods. Therefore, high-accuracy dense real-time depth estimation from single images and sparse depth measurements acquired by monocular cameras and l...

Claims


Application Information

IPC(8): G06T7/50
CPC: G06T7/50; G06T2207/10028; G06T2207/20081; G06T2207/20084
Inventors: 潘树国, 赵涛, 高旺, 魏建胜, 盛超
Owner SOUTHEAST UNIV