Unmanned aerial vehicle scene dense reconstruction method based on VI-SLAM and depth estimation network

A VI-SLAM and depth estimation technology, applied in the field of virtual reality, which addresses the prior-art problem that large-scale scenes cannot be reconstructed quickly and densely

Active Publication Date: 2021-03-02
BEIHANG UNIV


Problems solved by technology

[0006] The technology of the present invention solves the following problem: it overcomes the inability of the prior art to reconstruct large-scale scenes quickly and densely, and provides a dense reconstruction method for unmanned aerial vehicle scenes based on VI-SLAM and a depth estimation network. By tracking the camera pose in real time and estimating scene depth from a single image, the method achieves faster running efficiency during dense reconstruction.
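To make the combination concrete, the following is a minimal sketch, not the patented implementation: it back-projects a per-pixel depth map (such as a depth estimation network would produce) through a camera pose (such as VI-SLAM would provide) into a world-frame point cloud. The intrinsics, pose, and depth values are synthetic placeholders.

```python
# Minimal back-projection sketch: camera pose (from VI-SLAM) plus a
# dense depth map (from the depth network) -> world-frame point cloud.
# K, T_wc and the depth map are synthetic placeholders, not patent values.
import numpy as np

H, W = 480, 640
K = np.array([[525.0, 0.0, W / 2],           # pinhole intrinsics
              [0.0, 525.0, H / 2],
              [0.0, 0.0, 1.0]])
T_wc = np.eye(4)                              # world-from-camera pose
depth = np.full((H, W), 5.0)                  # per-pixel depth in metres

u, v = np.meshgrid(np.arange(W), np.arange(H))
pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3)
rays = (np.linalg.inv(K) @ pix.T).T           # normalized camera rays
pts_cam = rays * depth.reshape(-1, 1)         # 3-D points, camera frame
pts_h = np.hstack([pts_cam, np.ones((pts_cam.shape[0], 1))])
pts_world = (T_wc @ pts_h.T).T[:, :3]         # local point cloud, world frame
print(pts_world.shape)                        # (307200, 3)
```

Because depth comes from a single image per frame rather than from exhaustive multi-view matching, this per-frame back-projection is where the claimed speed advantage over traditional dense reconstruction arises.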




Embodiment Construction

[0060] The present invention is described in further detail below in conjunction with the accompanying drawings and an implementation example:

[0061] The basic operation of the UAV scene reconstruction method of the present invention is to use a UAV equipped with an IMU to photograph the three-dimensional environment, transmit the acquired information to the back end for processing, and output a densely reconstructed point-cloud rendering of the UAV scene.

[0062] As shown in Figure 1, the steps of the UAV three-dimensional reconstruction method based on VI-SLAM and a depth estimation network of the present invention are as follows:

[0063] (1) Fix the inertial measurement unit (IMU) to the drone, and calibrate the intrinsic and extrinsic parameters of the drone's own monocular camera and the extrinsic parameters of the IMU (a calibration sketch follows this list);

[0064] (2) Use the UAV's monocular camera and IMU to collect an image sequence and IMU measurements of the UAV scene;

[0065] (3) Use VI-SLAM to process the images and IMU information acquired in step (2), obtaining camera poses with scale information;
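For the camera calibration in step (1), one common route for the monocular intrinsics is chessboard calibration with OpenCV; this is a hedged sketch of that standard procedure, not necessarily the one used by the invention. The board dimensions, square size, and image paths are assumptions, and camera-IMU extrinsic calibration is a separate procedure not shown here.

```python
# Hedged sketch of monocular intrinsic calibration with a chessboard.
# Board size, square edge length and the image location are assumed for
# illustration; IMU-camera extrinsic calibration is not covered here.
import glob
import cv2
import numpy as np

BOARD = (9, 6)                    # inner corners per row/column (assumed)
SQUARE = 0.025                    # square edge length in metres (assumed)

objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib/*.png"):         # assumed image directory
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)
        size = gray.shape[::-1]               # (width, height)

# K: 3x3 intrinsic matrix; dist: lens distortion coefficients
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print("reprojection RMS:", rms)
```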



Abstract

The invention relates to an unmanned aerial vehicle scene dense reconstruction method based on VI-SLAM and depth estimation, comprising the steps: (1) fixing an inertial measurement unit (IMU) to an unmanned aerial vehicle, and calibrating the intrinsic and extrinsic parameters of the UAV's monocular camera and the extrinsic parameters of the IMU; (2) collecting an image sequence and IMU information of the UAV scene using the monocular camera and the IMU; (3) processing the images and IMU information acquired in step (2) with VI-SLAM to obtain camera poses with scale information; (4) inputting the monocular image, as an original view, into a viewpoint generation network to obtain a right view, and inputting the original view and the right view into a depth estimation network to obtain depth information for the image; (5) combining the camera pose obtained in step (3) with the depth map obtained in step (4) to obtain a local point cloud; and (6) fusing the SLAM tracking trajectory with the local point clouds, through point cloud optimization and registration, to obtain a dense point cloud model of the UAV scene.
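Step (4) turns the single monocular image into a virtual stereo pair before estimating depth. For a rectified stereo pair, per-pixel depth follows from the standard relation depth = f·B / disparity. The sketch below illustrates only that conversion; the focal length, baseline, and disparity map are assumed placeholders, not values from the patent.

```python
# Stereo disparity-to-depth conversion underlying step (4): once a right
# view has been generated and the pair matched, depth = f * B / disparity.
# f, B and the disparity map are synthetic assumptions for illustration.
import numpy as np

f = 525.0                                     # focal length in pixels (assumed)
B = 0.12                                      # virtual stereo baseline in metres (assumed)
disparity = np.random.uniform(1.0, 64.0, size=(480, 640))  # placeholder network output

depth = f * B / np.maximum(disparity, 1e-6)   # per-pixel depth in metres
print(depth.min(), depth.max())
```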

Description

technical field

[0001] The invention relates to a dense reconstruction method for unmanned aerial vehicle scenes based on VI-SLAM (visual-inertial simultaneous localization and mapping) and a depth estimation network, belonging to the field of virtual reality.

Background technique

[0002] Three-dimensional reconstruction refers to establishing a mathematical model of three-dimensional objects that is suitable for computer representation and processing. It is the basis for processing, operating on, and analyzing the properties of such objects in a computer environment, and a key technology for building virtual realities that express the objective world in a computer. With the growing demand for 3D reconstruction and the increasing popularity of UAV aerial photography, point cloud reconstruction from UAV aerial images has become a research hotspot.

[0003] Traditional dense reconstruction methods using depth cameras or SfM require relatively complex hardware...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T17/00; G06T7/33; G06T7/55; G06K9/46
CPC: G06T17/00; G06T7/33; G06T7/55; G06V10/464; Y02T10/40
Inventor: 周忠, 吕文官, 温佳伟, 闫飞虎, 柳晨辉
Owner: BEIHANG UNIV