Real-time 3D scene reconstruction method for UAV based on real-time simultaneous positioning and map construction

A technology for real-time simultaneous localization and map construction, applied in 3D image processing, image analysis, image enhancement, etc. It addresses the problems that existing real-time UAV maps are 3D point clouds containing only geometric structure and no texture information, and that raw image data wastes transmission bandwidth and computing resources; it lowers the requirement on the overlap (repetition) rate of captured images and improves the fidelity of the reconstructed scene.

Active Publication Date: 2022-04-19
NORTHWESTERN POLYTECHNICAL UNIV
Cites: 6 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0006] 2) The redundancy of the raw image information obtained by the camera is extremely high, resulting in a waste of transmission bandwidth and computing resources.
[0007] 3) At present, most of the large-scale maps generated by drones in real time using computer vision technology are 3D point clouds that only have geometric structures and do not contain texture information.



Examples

Embodiment Construction

[0086] Embodiments of the present invention are described in detail below, and the embodiments are exemplary and intended to explain the present invention, but should not be construed as limiting the present invention.

[0087] The main onboard devices of the UAV of the present invention are a microcomputer (embedded computer) and an airborne industrial camera. During flight, the camera collects image information and transmits the data to the embedded computer. The embedded computer processes the collected images, completes real-time positioning and point cloud map construction, and combines the GPS data provided by the flight controller to reconstruct the 3D map scene.
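The data flow described in paragraph [0087] (onboard camera → embedded computer → real-time positioning, point cloud mapping, and GPS fusion) can be outlined as a simple loop. The sketch below only illustrates that flow; the objects and methods (camera, slam, flight_controller, renderer and their calls) are hypothetical placeholders, not the patented implementation.

```python
# Hypothetical outline of the onboard processing loop from [0087].
# All object and method names are illustrative placeholders.

def reconstruction_loop(camera, slam, flight_controller, renderer):
    """One pass of the real-time reconstruction pipeline on the embedded computer."""
    while camera.is_streaming():
        frame = camera.grab()                      # image from the airborne industrial camera
        pose, new_points = slam.track(frame)       # real-time positioning + point cloud update
        gps_fix = flight_controller.latest_gps()   # GPS provided by the flight controller
        slam.georeference(pose, gps_fix)           # anchor the local map in world coordinates
        renderer.update(new_points, frame)         # texture the growing 3D point cloud map
```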

[0088] The following are the specific implementation steps:

[0089] Step 1: Acquire and process images:

[0090] The UAV's onboard camera collects a series of images and transmits them to the computing unit in real time, ensuring rapid image transmission. Using the camera calibration data, obtain...
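Step 1 relies on the camera calibration data, but the original text is truncated here. As one plausible reading, the snippet below shows a standard undistortion step with OpenCV, assuming the intrinsic matrix and distortion coefficients come from an offline calibration; it is not the patent's exact procedure, and the numeric values are placeholders.

```python
import cv2
import numpy as np

# Placeholder intrinsics and distortion coefficients from an assumed offline calibration;
# replace with the actual calibration data of the onboard industrial camera.
K = np.array([[1200.0,    0.0, 960.0],
              [   0.0, 1200.0, 540.0],
              [   0.0,    0.0,   1.0]])
dist = np.array([-0.12, 0.05, 0.0, 0.0, 0.0])   # k1, k2, p1, p2, k3

def preprocess(frame_bgr):
    """Undistort a raw frame and convert it to grayscale for feature extraction."""
    undistorted = cv2.undistort(frame_bgr, K, dist)
    return cv2.cvtColor(undistorted, cv2.COLOR_BGR2GRAY)
```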



Abstract

The present invention proposes a real-time 3D scene reconstruction method for UAVs based on EG-SLAM. The method uses the visual information obtained in real time by the UAV's onboard camera to reconstruct a large-scale 3D scene with texture details. Compared with many existing methods, the present invention runs directly on the CPU after the images are collected, achieving positioning and 3D map reconstruction quickly and in real time. Instead of the traditional PnP method, it uses the proposed EG-SLAM method to solve the pose of the UAV, that is, it solves the pose directly from the feature point matching relationship between two frames, reducing the requirement on the overlap (repetition) rate of the captured images. In addition, the large amount of environmental information obtained gives the UAV a more precise and detailed perception of the environmental structure, and texture rendering is performed on the large-scale 3D point cloud map generated in real time, realizing the reconstruction of a large-scale 3D map and yielding a more intuitive and realistic 3D scene.
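The abstract's central idea is that the UAV pose is recovered directly from feature matches between two frames rather than from PnP against an existing map. A minimal sketch of that general two-view approach (ORB matching followed by essential matrix estimation and decomposition in OpenCV) is given below; it illustrates the epipolar geometry principle only and is not the patented EG-SLAM algorithm itself.

```python
import cv2
import numpy as np

def relative_pose(gray1, gray2, K):
    """Estimate the relative rotation R and unit-scale translation t between two frames
    from ORB feature matches, via the essential matrix rather than PnP."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(gray1, None)
    kp2, des2 = orb.detectAndCompute(gray2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix from the two-frame correspondences (RANSAC rejects outliers),
    # then decompose it into the relative camera motion.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t
```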

Description

Technical field

[0001] The present invention relates to the fields of unmanned aerial vehicle (UAV) autonomous flight, computer vision, and map construction. Specifically, it uses real-time simultaneous localization and map construction (EG-SLAM) to perform dense diffusion and texture rendering on the obtained 3D point cloud, realizing 3D scene reconstruction of ground objects by UAVs.

Background technique

[0002] Real-time simultaneous positioning and 3D scene reconstruction by UAVs has long been a hotspot and a difficulty in the fields of UAV autonomous flight and computer vision. UAVs have many advantages, such as small size, low cost, good stealth, flexibility, and strong operational capability, and in recent years they have been widely used in many military and civilian fields. In military applications, UAVs can be used for tasks such as reconnaissance, attack, and electronic countermeasures. They can also be ...


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06T17/00G06T7/33G06T15/04
CPCG06T7/33G06T15/04G06T17/00G06T2207/10021
Inventor 布树辉张咪赵勇
Owner NORTHWESTERN POLYTECHNICAL UNIV