
Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation

A technology for the autonomous landing of unmanned aerial vehicles, applied in the field of integrated navigation. It addresses the problems of error accumulation, poor vertical positioning, and the inability to use inertial navigation alone, and achieves the effects of improved real-time performance and high precision.

Active Publication Date: 2017-05-24
NO 20 RES INST OF CHINA ELECTRONICS TECH GRP

AI Technical Summary

Problems solved by technology

However, inertial navigation equipment is a time-integration system: its errors accumulate rapidly over time and its vertical positioning performance is poor, so it cannot be used alone. Another navigation source must therefore be introduced to correct the long-term accumulated error of the inertial navigation.




Embodiment Construction

[0042] The present invention will be further described below in conjunction with the accompanying drawings and embodiments; the present invention includes, but is not limited to, the following embodiments.

[0043] The integrated navigation method of the present invention operates cyclically according to the following steps 1) to 4):

[0044] Step 1) Use the visual navigation algorithm to solve for the position and pose of the UAV, comprising steps 1.1) to 1.8);

[0045] 1.1) Use the UAV's onboard camera to acquire real-time images, then go to step 1.2);

[0046] 1.2) Judge whether the acquired real-time image is the first frame; if so, go to step 1.3), otherwise go to step 1.4);

[0047] 1.3) The first frame image is processed as follows:

[0048] 1.3.1) Preprocess the real-time image and extract its SURF feature points (a sketch of this extraction and matching step is given after this list), then go to step 1.3.2);

[0049] 1.3.2) carry out SURF matching with the SURF feature points extr...
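
Steps 1.3.1) and 1.3.2) describe SURF feature extraction on the real-time image and SURF matching against a reference image. The following is a minimal sketch of such an extraction-and-matching step, assuming Python with an OpenCV build that includes the non-free xfeatures2d contrib module; the function name match_to_reference, the Hessian threshold, and the ratio-test value are illustrative assumptions and are not taken from the patent.

```python
# Minimal sketch of SURF extraction and matching (cf. steps 1.3.1-1.3.2).
# Assumes an OpenCV build with the non-free xfeatures2d module enabled.
import cv2

def match_to_reference(realtime_gray, reference_gray,
                       hessian_threshold=400, ratio=0.7):
    """Extract SURF features from both images and return ratio-test-filtered matches."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    kp_rt, des_rt = surf.detectAndCompute(realtime_gray, None)
    kp_ref, des_ref = surf.detectAndCompute(reference_gray, None)

    # Brute-force L2 matching with Lowe's ratio test to discard ambiguous pairs.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_rt, des_ref, k=2)
    good = [m for m, n in knn if m.distance < ratio * n.distance]
    return kp_rt, kp_ref, good
```

The surviving match set would still contain outliers; as the abstract notes, the patent removes falsely matched point pairs with the help of the inertial navigation parameters from the adjacent period.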



Abstract

The invention provides an autonomous landing method for an unmanned aerial vehicle based on vision/inertial navigation. The method comprises the following steps: first, a visual navigation algorithm is used to solve for the position and pose of the unmanned aerial vehicle; then, the pose solved by visual navigation is used as the initial value of the inertial navigation, and the inertial navigation parameters begin to be solved; the parameters obtained from the adjacent period of inertial navigation are used to remove falsely matched point pairs between the real-time image and the reference image after SURF matching; finally, an unscented Kalman filter is used to combine the navigation parameters and adjust the pose of the unmanned aerial vehicle in real time to guide the landing. The method has the advantages that the real-time performance of the visual navigation algorithm is improved and the vision system maintains high precision over a long time; the problem that inertial navigation cannot be used alone because its error diverges is solved; and carrier navigation parameters can still be provided even if the visual navigation solution fails.
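
The abstract describes an unscented Kalman filter that combines the inertial solution with the vision-derived pose. Below is a minimal sketch of such vision/inertial fusion, assuming Python with the third-party filterpy library and a simple constant-velocity position/velocity state; the state layout, noise values, and the helper name fuse are illustrative assumptions rather than the patented design.

```python
# Minimal sketch of unscented-Kalman-filter fusion of inertial and vision data.
# Assumes a 6-state constant-velocity model (position, velocity); values and
# names below are illustrative only.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 0.02  # assumed filter period

def fx(x, dt):
    # State propagation: constant-velocity model. A full implementation would
    # integrate the inertial specific-force and angular-rate measurements here.
    F = np.eye(6)
    F[0:3, 3:6] = dt * np.eye(3)
    return F @ x

def hx(x):
    # Measurement model: the vision solution observes position only in this sketch.
    return x[0:3]

points = MerweScaledSigmaPoints(n=6, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=6, dim_z=3, dt=dt, fx=fx, hx=hx, points=points)
ukf.x = np.zeros(6)       # initialized from the first vision pose solution
ukf.Q = np.eye(6) * 1e-3  # assumed process noise (inertial drift)
ukf.R = np.eye(3) * 0.25  # assumed vision position-measurement noise

def fuse(vision_position):
    """One filter cycle: propagate the state, then correct it with the vision fix."""
    ukf.predict()
    ukf.update(np.asarray(vision_position, dtype=float))
    return ukf.x[0:3]     # corrected position used to adjust the UAV pose for landing
```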

Description

technical field

[0001] The invention relates to an autonomous landing method for an unmanned aerial vehicle, which belongs to the field of integrated navigation.

Background technique

[0002] Visual navigation technology uses digital image processing to process and analyze aerial images acquired by airborne image sensors, and finally obtains the pose parameters required for UAV navigation control. The large amount of information to be processed in the images makes it difficult for visual navigation and positioning to meet the UAV's real-time requirements, so a fast real-time image matching algorithm is needed to solve this problem.

[0003] Inertial navigation technology is a traditional autonomous navigation technology that does not receive external radio signals and does not radiate energy to the outside. Inertial navigation equipment has all-weather, all-time and all-airspace working capabilities and good concealment, and can provide short-term hig...

Claims


Application Information

IPC(8): G05D1/06, G01C11/00, G01C11/06, G01C21/16
CPC: G01C11/00, G01C11/06, G01C21/165, G05D1/0676
Inventors: 高嘉瑜, 李江, 武云云, 陈佳, 原彬
Owner: NO 20 RES INST OF CHINA ELECTRONICS TECH GRP