
UAV SLAM method based on hybrid visual odometry and multi-scale map

A technology relating to visual odometry and UAVs, applied in the field of navigation computing tools, navigation, instruments, etc. It addresses the problem that neither the feature point method nor the direct method alone is well suited to the positioning requirements of a UAV, and it relieves computing pressure, realizes real-time pose estimation and environment perception, and improves safety performance.

Active Publication Date: 2021-08-10
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS

AI Technical Summary

Problems solved by technology

[0004] Since UAVs have relatively high requirements for stability, real-time performance, and accuracy, neither the traditional feature point method nor the direct method alone can meet the positioning requirements of a UAV well, so a stable and accurate visual odometry method needs to be designed. At the same time, map construction demands relatively high computing power and is not suitable for running directly on the UAV; it needs to rely on a ground station with powerful computing capability.




Embodiment Construction

[0058] The present invention will be described in detail below with reference to the accompanying drawings.

[0059] Figure 1 is a system block diagram of the hybrid visual odometry of the present invention. The hybrid visual odometry contains two threads: a monocular visual odometry based on the direct method and a binocular visual odometry based on the feature point method. The UAV platform is equipped with a downward-looking monocular camera and a forward-looking binocular camera. The monocular camera acquires 30 frames per second and the binocular camera acquires 10 frames per second. The onboard computer runs both threads simultaneously: the main thread runs the direct-method visual odometry at 30 Hz, while the other thread runs the feature-point-based visual odometry at 10 Hz. The synchronized monocular camera image and the images corresponding to the left and right cameras of the binocular camera are taken as input ...
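As a rough illustration of the two-thread arrangement in paragraph [0059], the sketch below runs a fast monocular (direct-method) loop at 30 Hz and a slower binocular (feature-point) loop at 10 Hz against a shared pose. Only the frame rates and the two-thread split come from the text; the class name, the placeholder step functions, and the additive update rule are hypothetical, not the patent's actual algorithm.

```python
# Minimal sketch of a two-rate hybrid visual odometry (assumed structure).
import threading
import time

class HybridVO:
    def __init__(self):
        self._lock = threading.Lock()
        self.pose = [0.0, 0.0, 0.0]   # toy x, y, z state for illustration
        self.running = True

    def _direct_method_step(self):
        # Placeholder for direct-method tracking on a downward-looking
        # monocular frame (e.g. photometric alignment against a keyframe).
        return [0.001, 0.0, 0.0]       # hypothetical incremental motion

    def _feature_point_step(self):
        # Placeholder for feature-point matching on a forward-looking
        # stereo pair (e.g. descriptors + triangulated depth).
        return [0.0, 0.0, 0.0]         # hypothetical correction

    def monocular_thread(self, rate_hz=30):
        # Main thread: fast loop at the monocular frame rate (30 Hz).
        period = 1.0 / rate_hz
        while self.running:
            delta = self._direct_method_step()
            with self._lock:
                self.pose = [p + d for p, d in zip(self.pose, delta)]
            time.sleep(period)

    def binocular_thread(self, rate_hz=10):
        # Slower thread: refines the shared pose with stereo measurements (10 Hz).
        period = 1.0 / rate_hz
        while self.running:
            correction = self._feature_point_step()
            with self._lock:
                self.pose = [p + c for p, c in zip(self.pose, correction)]
            time.sleep(period)

vo = HybridVO()
threads = [threading.Thread(target=vo.monocular_thread, daemon=True),
           threading.Thread(target=vo.binocular_thread, daemon=True)]
for t in threads:
    t.start()
time.sleep(1.0)
vo.running = False
print("estimated pose after 1 s:", vo.pose)
```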



Abstract

The invention discloses a SLAM method for an unmanned aerial vehicle (UAV) based on a hybrid visual odometry and a multi-scale map, belonging to the technical field of autonomous UAV navigation. In this method, the UAV platform is equipped with a downward-looking monocular camera, a forward-looking binocular camera, and an onboard computer. The monocular camera is used for the visual odometry based on the direct method, and the binocular camera is used for the visual odometry based on the feature point method. The hybrid visual odometry combines the outputs of these two odometries to construct a local map for positioning and obtain the real-time pose of the UAV, which is then fed back to the flight control system to control the UAV's position. The onboard computer transmits the real-time pose and the collected images to the ground station; the ground station plans the flight path in real time based on the constructed global map and sends the waypoint information to the UAV, realizing autonomous flight. The invention achieves real-time pose estimation and environment perception of the UAV in a GPS-denied environment and greatly improves the intelligence level of the UAV.
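As a rough illustration of the onboard/ground-station loop described in the abstract, the sketch below sends a pose and an image up to the ground station and receives a waypoint back. The queue-based link, the message fields, and the toy one-step planner are assumptions made for illustration, not the patent's actual protocol or planner.

```python
# Minimal sketch of the onboard / ground-station data flow (assumed structure).
import queue

uplink = queue.Queue()     # UAV -> ground station (pose + image)
downlink = queue.Queue()   # ground station -> UAV (next waypoint)

def onboard_step(pose, image):
    # Onboard: the hybrid VO produces the pose (not shown here); the pose is
    # fed back to flight control, and pose + image go to the ground station.
    uplink.put({"pose": pose, "image": image})

def ground_station_step():
    # Ground station: fold the received data into the global map (omitted)
    # and plan the next waypoint; here a toy planner just steps forward.
    msg = uplink.get()
    waypoint = [msg["pose"][0] + 1.0, msg["pose"][1], msg["pose"][2]]
    downlink.put(waypoint)

onboard_step(pose=[0.0, 0.0, 1.5], image=b"<frame bytes>")
ground_station_step()
print("next waypoint for the UAV:", downlink.get())
```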

Description

Technical field
[0001] The present invention relates to a UAV SLAM method based on a hybrid visual odometry and a multi-scale map, belonging to the field of autonomous navigation technology.
Background technique
[0002] With the development of unmanned aerial vehicle (UAV) technology, various types of UAVs are widely used and have begun to play an increasingly important role in industrial fields. A UAV is no longer a simple model aircraft. How to perceive the environment and the vehicle's own state has become one of the key technologies for UAVs, namely simultaneous localization and mapping (SLAM), a core problem studied in robotics.
[0003] Compared with ground mobile robots, a UAV's payload is limited and its real-time requirements are high. Traditional lidar sensors are often unsuitable for the UAV platform because of their weight, so visual sensors are more suitable for ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01C21/00; G01C21/20
CPC: G01C21/005; G01C21/20
Inventors: 刘阳, 王从庆, 李翰
Owner: NANJING UNIV OF AERONAUTICS & ASTRONAUTICS