Real-time three-dimensional scene reconstruction method for UAV based on EG-SLAM

A real-time 3D scene reconstruction technology, applied in the fields of drone autonomous flight, computer vision, and map construction, which can solve the problems that existing 3D point clouds contain only geometric structure and no texture information, and that transmission bandwidth and computing resources are wasted

Active Publication Date: 2018-10-12
NORTHWESTERN POLYTECHNICAL UNIV


Problems solved by technology

[0006] 2) The redundancy of the initial image information obtained by the camera is extremely high, resulting in a waste of transmission bandwidth and computing resources
[0007] 3) At present,




Embodiment Construction

[0086] Embodiments of the present invention are described in detail below, and the embodiments are exemplary and intended to explain the present invention, but should not be construed as limiting the present invention.

[0087] The main devices on the drone of the present invention are an embedded microcomputer and an onboard industrial camera. During the flight of the UAV, image information is collected by the camera and the data are transmitted to the embedded computer. The embedded computer processes the collected images, completes real-time positioning and point cloud map construction, and reconstructs the 3D map scene in combination with the GPS data provided by the flight controller.

[0088] The following are the specific implementation steps:

[0089] Step 1: Acquire and process images:

[0090] The drone's onboard camera collects a series of images and transmits them to the computing unit in real time, ensuring rapid image transmission. The camera calibration data are used to obtain...
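The calibration step above can be sketched as follows. This is a minimal illustration of applying camera calibration data, not the patent's implementation; the intrinsic matrix K and the radial distortion coefficients k1, k2 are hypothetical example values.

```python
import numpy as np

# Hypothetical intrinsics for the onboard camera (example values only).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
k1, k2 = -0.25, 0.07  # assumed radial distortion coefficients

def undistort_point(u, v, iters=5):
    """Map a distorted pixel to normalized, undistorted camera coordinates."""
    # Back-project the pixel onto the normalized image plane.
    x = (u - K[0, 2]) / K[0, 0]
    y = (v - K[1, 2]) / K[1, 1]
    x0, y0 = x, y
    # Fixed-point iteration to invert the radial distortion model
    # x_d = x * (1 + k1*r^2 + k2*r^4).
    for _ in range(iters):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = x0 / scale, y0 / scale
    return x, y

# A pixel at the principal point is unaffected by radial distortion (r = 0).
print(undistort_point(320.0, 240.0))  # → (0.0, 0.0)
```

In a real pipeline this correction would be applied to every detected feature point before pose estimation, so that the geometry below operates on ideal pinhole coordinates.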



Abstract

The present invention provides a real-time three-dimensional scene reconstruction method for a UAV (unmanned aerial vehicle) based on EG-SLAM. The method is characterized in that visual information is acquired by the UAV's onboard camera to reconstruct a large-scale three-dimensional scene with texture details. Compared with many existing methods, the method of the present invention processes the collected images directly on the CPU, so that positioning and three-dimensional map reconstruction can be carried out quickly in real time. Rather than the conventional PnP method, the EG-SLAM method of the present invention is used to solve the pose of the UAV; that is, the feature-point matching relationship between two frames is used to solve the pose directly, which reduces the requirement on the overlap rate of the collected images. In addition, the large amount of environmental information obtained enables the UAV to perceive the environment structure in a more sophisticated and detailed manner; texture rendering is performed on the large-scale three-dimensional point cloud map generated in real time, reconstruction of a large-scale three-dimensional map is realized, and a more intuitive and realistic three-dimensional scene is obtained.
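Solving the pose directly from the feature-point matching relationship between two frames, as the abstract describes, corresponds to classical two-view epipolar geometry (the "EG" in EG-SLAM). The following is a minimal numpy sketch of the epipolar constraint x2ᵀ E x1 = 0, with the essential matrix E = [t]ₓ R built from an assumed relative pose; it illustrates the underlying geometry only, not the patent's solver, and the pose and points are synthetic example values.

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[    0, -t[2],  t[1]],
                     [ t[2],     0, -t[0]],
                     [-t[1],  t[0],     0]])

# Assumed relative pose between two frames: small yaw, translation along x/z.
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])
t = np.array([1.0, 0.0, 0.2])

# Essential matrix relating normalized coordinates of the two views.
E = skew(t) @ R

# Synthetic 3D points in front of camera 1 (camera 1 at the origin).
rng = np.random.default_rng(0)
X1 = rng.uniform([-1, -1, 4], [1, 1, 8], size=(5, 3))
x1 = X1 / X1[:, 2:3]             # normalized image coordinates, frame 1
X2 = (R @ X1.T).T + t            # same points expressed in frame 2
x2 = X2 / X2[:, 2:3]             # normalized image coordinates, frame 2

# Epipolar constraint: x2^T E x1 = 0 for every correspondence.
residuals = np.einsum('ni,ij,nj->n', x2, E, x1)
print(np.max(np.abs(residuals)))  # ≈ 0 (up to floating-point error)
```

In practice E would be estimated from the matched feature points (e.g. with a five-point or eight-point algorithm inside RANSAC) and then decomposed back into R and t; the advantage over PnP is that no previously triangulated 3D points are needed, which is why the overlap requirement between frames is reduced.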

Description

technical field

[0001] The present invention relates to the fields of unmanned aerial vehicle autonomous flight, computer vision, and map construction. Specifically, real-time simultaneous localization and mapping (EG-SLAM) is used to perform dense diffusion and texture rendering on the obtained 3D point cloud, realizing 3D scene reconstruction of ground objects by unmanned aerial vehicles.

Background technique

[0002] Real-time simultaneous positioning and 3D scene reconstruction by UAVs has long been a hotspot and a difficulty in the fields of UAV autonomous flight and computer vision. UAVs have many advantages, such as small size, low cost, good stealth, flexibility and convenience, and strong combat capability, and in recent years they have been widely used in many military and civilian fields. In military terms, drones can be used for tasks such as reconnaissance, attack, and electronic countermeasures. They can also be ...
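The texture rendering mentioned in the technical field amounts to projecting each reconstructed 3D point back into a camera image and sampling a color there. A minimal sketch of that projection-and-sampling step follows; the intrinsic matrix and the tiny synthetic "image" are hypothetical stand-ins, not the patent's data.

```python
import numpy as np

# Hypothetical intrinsics and a tiny synthetic RGB image (example values only).
K = np.array([[100.0,   0.0, 2.0],
              [  0.0, 100.0, 2.0],
              [  0.0,   0.0, 1.0]])
image = np.arange(5 * 5 * 3).reshape(5, 5, 3)  # 5x5 RGB stand-in

def texture_color(point_cam, K, image):
    """Project a 3D point (camera frame) into the image and sample its color."""
    uvw = K @ point_cam
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]  # pinhole projection
    iu, iv = int(round(u)), int(round(v))    # nearest-neighbor sampling
    return image[iv, iu]

# A point on the optical axis projects to the principal point (u, v) = (2, 2).
print(texture_color(np.array([0.0, 0.0, 5.0]), K, image))  # → [36 37 38]
```

Repeating this lookup for every point in the cloud (choosing, per point, a keyframe in which it is visible) yields the textured large-scale map that the abstract describes.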

Claims


Application Information

IPC(8): G06T17/00; G06T7/33; G06T15/04
CPC: G06T7/33; G06T15/04; G06T17/00; G06T2207/10021
Inventors: 布树辉, 张咪, 赵勇
Owner NORTHWESTERN POLYTECHNICAL UNIV