
Real-time dense monocular SLAM method and system based on online learning depth prediction network

A depth-prediction and dense-mapping technology, applied to biological neural network models, image analysis, instruments, etc. It addresses the lack of scale information and the inability to realize dense mapping in monocular SLAM, and achieves the effect of improving the accuracy and robustness of depth prediction.

Active Publication Date: 2018-04-20
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0005] In view of the above defects or improvement needs of the prior art, the present invention provides a method and system that combines an online learning depth prediction network with monocular SLAM. Its purpose is to make full use of the advantages of deep convolutional neural networks to achieve dense depth estimation of the key frames of the SLAM system, and to restore the real scale information of the scene based on those results, thereby solving the technical problems of missing scale information and the inability to achieve dense mapping in traditional monocular SLAM.

Method used




Detailed Description of the Embodiments

[0045] In order to make the object, technical solution and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments described below can be combined with each other as long as they do not conflict.

[0046] The problem to be solved by the present invention is to realize a real-time monocular dense-mapping SLAM system. The system combines an adaptive online CNN depth prediction network with a monocular SLAM system based on the direct method, which not only significantly improves the accuracy and robustness of depth prediction in unknown scenes, but can also solve the ...
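The online-adaptation idea described above can be illustrated with a minimal sketch. The linear "predictor", its mean-squared-error loss, and the learning rate below are hypothetical stand-ins for the patent's CNN and block-wise stochastic gradient descent, chosen only to show the shape of an online update against depths recovered by the SLAM front end:

```python
import numpy as np

def online_sgd_update(weights, features, target_depths, lr=0.05):
    """One gradient step adapting a toy linear depth predictor to
    sparse depths recovered by the SLAM front end. A stand-in for the
    patent's block-wise SGD update of a CNN (hypothetical model)."""
    pred = features @ weights                          # predicted depths
    grad = features.T @ (pred - target_depths) / len(target_depths)
    return weights - lr * grad                         # MSE gradient step

# Toy usage: recover a known "true" predictor from noiseless targets.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))          # per-pixel feature vectors
w_true = np.array([2.0, -1.0, 0.5])   # ground-truth parameters
y = X @ w_true                        # target depths
w = np.zeros(3)
for _ in range(2000):                 # repeated online updates
    w = online_sgd_update(w, X, y)
# w now closely approximates w_true
```

The point of the sketch is only that the predictor is refined continuously from the SLAM system's own geometric estimates, which is what lets the network adapt to scenes unseen during offline training.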


Abstract

The invention discloses a real-time dense monocular simultaneous localization and mapping (SLAM) method based on an online learning depth prediction network. The method comprises: minimizing the photometric error of high-gradient points to obtain the camera pose of a key frame, and predicting the depth of the high-gradient points by triangulation to obtain a semi-dense map of the current frame; selecting online training image pairs, training and updating a CNN model online with a block-wise stochastic gradient descent method, and performing depth prediction on the current frame with the trained CNN model to obtain a dense map; performing depth-scale regression between the semi-dense map of the current frame and the predicted dense map to obtain an absolute scale factor for the depth information of the current frame; and, with an NCC score voting method, selecting the depth prediction values of all pixels of the current frame from two kinds of projection results to obtain a predicted depth map, on which Gaussian fusion is performed to obtain the final depth map. In addition, the invention also provides a corresponding real-time dense monocular SLAM system based on an online learning depth prediction network.
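Two of the steps summarized in the abstract, namely the depth-scale regression between the semi-dense SLAM map and the CNN-predicted dense map, and the Gaussian fusion of depth estimates, can be sketched as follows. The robust median-ratio estimator and the precision-weighted fusion formula are common choices assumed here for illustration, not taken verbatim from the patent text:

```python
import numpy as np

def regress_scale(semi_dense, cnn_dense, mask):
    """Estimate an absolute scale factor aligning the up-to-scale
    semi-dense SLAM depths with the CNN-predicted dense depths.
    A robust median of per-pixel ratios stands in for the patent's
    depth-scale regression (assumed estimator)."""
    ratios = cnn_dense[mask] / semi_dense[mask]
    return np.median(ratios)

def gaussian_fuse(mu1, var1, mu2, var2):
    """Fuse two depth hypotheses modeled as independent Gaussians:
    precision-weighted mean and the combined (smaller) variance."""
    var = (var1 * var2) / (var1 + var2)
    mu = (mu1 * var2 + mu2 * var1) / (var1 + var2)
    return mu, var

# Toy usage on three high-gradient pixels whose true scale is about 2.
semi = np.array([1.0, 2.0, 4.0])          # up-to-scale SLAM depths
cnn = np.array([2.1, 3.9, 8.2])           # CNN-predicted depths
mask = np.ones(3, dtype=bool)             # pixels valid in both maps
s = regress_scale(semi, cnn, mask)        # median of ratios -> 2.05
mu, var = gaussian_fuse(2.0, 0.04, 2.2, 0.04)  # equal variances -> mean 2.1
```

With equal variances the fusion reduces to a simple average and the fused variance halves, reflecting the increased confidence after combining the triangulated and the network-predicted depth.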

Description

Technical Field

[0001] The invention belongs to the technical field of computer-vision three-dimensional reconstruction, and more specifically relates to a real-time dense monocular SLAM method and system based on an online learning depth prediction network.

Background Technique

[0002] Simultaneous Localization And Mapping (SLAM) technology can estimate the pose of a sensor in real time and reconstruct a 3D map of the surrounding environment, so it plays an important role in fields such as UAV obstacle avoidance and augmented reality. A SLAM system that relies only on a single camera as its input sensor is called a monocular SLAM system. Monocular SLAM features low power consumption, a low hardware threshold and simple operation, and is widely used by researchers. However, the existing popular monocular SLAM systems, whether the feature-based PTAM (Parallel Tracking And Mapping For Small AR Workspaces) and ORB-SLAM (ORB-SLAM: A Versatile And...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00; G06T7/70; G06N3/04
CPC: G06T7/70; G06T17/00; G06T2200/08; G06N3/045
Inventors: 杨欣 (Yang Xin), 罗鸿城 (Luo Hongcheng), 高杨 (Gao Yang), 吴宇豪 (Wu Yuhao)
Owner: HUAZHONG UNIV OF SCI & TECH