
Unmanned aerial vehicle autonomous positioning method based on visual SLAM (Simultaneous Localization and Mapping)

A UAV self-positioning technology applied in the field of image processing and computer vision, which addresses the problems of a large amount of calculation and increased implementation difficulty, and achieves the effects of improved stability and reliability and good scalability.

Active Publication Date: 2019-01-15
TIANJIN UNIV
10 Cites · 25 Cited by

AI Technical Summary

Problems solved by technology

[0002] With the growth of social needs, UAVs face more and more functional requirements and application scenarios, which demand stronger perception, decision-making and execution capabilities. This places high requirements on the structural and functional design of a single UAV and greatly increases the difficulty of implementation.




Embodiment Construction

[0036] The functions and characteristics of the present invention are as follows:

[0037] (1) The present invention uses a binocular camera to collect images in front of the drone, together with an IMU for reading acceleration and angular velocity information;

[0038] (2) The present invention processes the images collected by the camera with computer vision, visual SLAM and related techniques, screens out key frames through a selection mechanism, and performs feature matching after feature extraction to obtain an initial pose of the drone. The IMU information is then processed and fused with the visual estimate, and precise position information of the drone is finally obtained through a series of optimizations.
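The keyframe-screening mechanism mentioned in item (2) is not detailed in this summary. A common heuristic in visual SLAM is to promote a frame to a keyframe when the motion since the last keyframe is large or when too few features remain tracked; the sketch below is an illustrative assumption, not the patent's specific mechanism, and the threshold values are arbitrary placeholders.

```python
import math

def is_keyframe(translation_m, rotation_rad, tracked_ratio,
                t_thresh=0.2, r_thresh=math.radians(10), ratio_thresh=0.6):
    """Promote the current frame to a keyframe when motion since the last
    keyframe is large, or when the fraction of still-tracked features drops.

    All thresholds are illustrative placeholders, not values from the patent.
    """
    return (translation_m > t_thresh        # moved far enough
            or rotation_rad > r_thresh      # rotated far enough
            or tracked_ratio < ratio_thresh)  # tracking is degrading

# A frame that moved 0.5 m since the last keyframe qualifies:
print(is_keyframe(0.5, 0.0, 1.0))
```

Any of the three conditions alone is sufficient; this keeps keyframes sparse during slow motion while still inserting one promptly when tracking quality falls.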

[0039] (3) A method combining the direct method and the feature point method is adopted; it combines the advantages of both approaches and greatly improves efficiency.

[0040] (4) When extracting features, the positioning method in the present invention divi...
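The two approaches combined in item (3) minimize different error terms: the feature point method minimizes a geometric reprojection error between matched features, while the direct method minimizes a photometric (brightness) error without explicit matching. The following is a minimal sketch of those two residuals for intuition only; it is not the patent's optimization pipeline.

```python
import numpy as np

def reprojection_error(observed_px, projected_px):
    """Feature-point method residual: pixel distance between a detected
    feature and the reprojection of its associated 3D point."""
    return float(np.linalg.norm(np.asarray(observed_px, dtype=float)
                                - np.asarray(projected_px, dtype=float)))

def photometric_error(intensity_ref, intensity_cur):
    """Direct method residual: brightness difference of the same scene
    point observed in two images, used without explicit feature matching."""
    return abs(intensity_ref - intensity_cur)

# A feature observed at (100, 200) but reprojected to (103, 204)
# gives a 5-pixel geometric error:
print(reprojection_error((100, 200), (103, 204)))
```

In hybrid systems, minimizing the photometric term over many pixels is cheap per pixel, while the geometric term over a few matched features is robust to illumination change; combining them is what yields the efficiency gain claimed above.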



Abstract

The invention relates to computer vision and image processing technology and proposes an autonomous positioning method for an unmanned aerial vehicle based on visual SLAM (Simultaneous Localization and Mapping). The method is composed of three parts: feature extraction and matching to solve motion, fusion of image and inertial measurement unit (IMU) data, and estimation of 3D point depth. In the motion-solving part, a strategy combining the feature point method and the direct method is adopted: key frames are selected, point features and line features are extracted, and an error is minimized to compute the relative pose. In the image and IMU fusion part, an error is minimized to fuse the two sources. Finally, in the 3D point depth estimation part, based on the matching of selected feature points, triangulation is performed to solve the 3D positions of the points, i.e., to obtain their depth values. The method is mainly applied in computer vision and image processing scenarios.
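The triangulation step in the abstract, recovering a 3D point from two matched observations, is commonly implemented by the linear (DLT) method: stack the projection constraints into a homogeneous system and take the SVD null vector. The sketch below is a generic illustration of that standard technique, not code from the patent; the camera matrices and point are toy values.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: given two 3x4 projection matrices and
    normalized pixel observations (x, y) in each view, solve A X = 0 for
    the homogeneous 3D point X via SVD and dehomogenize."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                     # right singular vector of smallest value
    return X[:3] / X[3]

# Toy rig: identity left camera; right camera shifted 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
x1 = X_true[:2] / X_true[2]                               # left projection
x2 = (X_true + np.array([-1.0, 0.0, 0.0]))[:2] / X_true[2]  # right projection
print(triangulate(P1, P2, x1, x2))   # recovers X_true, so depth = 4.0
```

The recovered third coordinate is exactly the depth value the abstract refers to; in practice the observations are noisy and the SVD solution is the least-squares estimate.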

Description

Technical field

[0001] The invention relates to the fields of computer vision and image processing, and solves the problem of positioning a drone in an unknown environment when no GPS signal is available.

Background technique

[0002] With the growth of social needs, UAVs face more and more functional requirements and application scenarios, which demand stronger perception, decision-making and execution capabilities. This places high requirements on the structural and functional design of a single UAV and greatly increases the difficulty of implementation. UAVs are highly flexible and autonomous and can perform tasks with little or no human intervention, helping humans complete dangerous or repetitive labor. Because of the diversity and complexity of the environments in which UAVs operate, a UAV must be able to determine its own position before it can perceive its surroundings. This is the premise an...

Claims


Application Information

IPC(8): G01C21/20; G01C21/16
CPC: G01C21/165; G01C21/20
Inventor 宗群, 刘彤, 窦立谦, 韩天瑞, 霍新友
Owner TIANJIN UNIV