
Scene matching/visual odometry-based inertial integrated navigation method

A technology combining scene matching and inertial navigation, applied in navigation, navigation by speed/acceleration measurement, surveying and navigation, etc. It addresses problems such as reliability and accuracy that fail to meet high-precision navigation requirements, integrated navigation system errors, and weak GPS signal power, achieving improved autonomous flight capability, strong anti-interference, and high positioning accuracy.

Active Publication Date: 2014-07-30
深圳市欧诺安科技有限公司

AI Technical Summary

Problems solved by technology

[0003] At present, the most important UAV navigation approach is the INS/GPS integrated navigation system. However, in unknown and dynamically changing complex environments the GPS signal power is weak, vulnerable to electromagnetic interference, and may even stop working altogether in signal blind areas, causing large errors in the integrated navigation system and potentially unpredictable consequences. Meanwhile, China's Beidou satellite navigation system is still in a stage of continuous development, and its reliability and accuracy remain difficult to meet high-precision navigation requirements such as those of military applications.



Examples


Specific Embodiment

[0062] 1. During the flight of the UAV, the airborne downward-looking camera acquires the ground image a in real time.

[0063] The downward-looking optical or infrared camera onboard the UAV acquires the ground image sequence in real time; only the current frame and the previous frame need to be stored.

[0064] 2. Using image a and the previous frame image a', estimate the homography matrix between a and a' to determine the visual odometry displacement of the UAV.

[0065] As shown in figure 2, while the UAV is flying, the onboard camera captures two frames I_1 and I_2 in different poses; the corresponding camera coordinate systems are F and F'. Assume that a point P on the plane π is mapped to the point p in image I_1 and the point p' in image I_2, and that its corresponding vectors in F and F' are p⃗ and p⃗′. Then there exists

p⃗′ = R_c p⃗ + t⃗_c = (R_c + t⃗_c n⃗ᵀ / d) p⃗ = H p⃗,

where R_c and t⃗_c are the rotation and translation from F to F', n⃗ is the unit normal of the plane π expressed in F, d is the distance from the origin of F to π (so that n⃗ᵀ p⃗ = d), and H is the homography matrix relating the two views.
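For illustration, the following is a minimal sketch (not the patent's implementation) of this frame-to-frame step with OpenCV: keypoints are matched between the previous and current downward-looking frames, the homography H is estimated with RANSAC, and H is decomposed into candidate (R, t/d, n) motions. The ORB keypoints, the RANSAC threshold, and the camera intrinsic matrix K are assumptions not taken from the patent.

```python
# Hedged sketch of the visual-odometry step: estimate the homography
# between two consecutive downward-looking frames and recover the
# relative motion. Illustration only; ORB keypoints and the RANSAC
# threshold below are assumptions, not the patent's parameters.
import cv2
import numpy as np

def frame_to_frame_homography(prev_gray, curr_gray):
    """Estimate the homography H mapping points in prev_gray to curr_gray."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    if len(matches) < 4:
        return None  # not enough correspondences for a homography
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H

def relative_motion(H, K):
    """Decompose H = R + t*n^T/d into candidate (R, t/d, n) solutions."""
    # cv2.decomposeHomographyMat returns up to four candidate poses;
    # the physically valid one is usually selected using the known
    # downward-looking geometry (plane normal roughly along the optical axis).
    num, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
    return Rs, ts, normals
```

In a downward-looking setup, the scaled translation t/d would typically be converted to metres using the known flight height and accumulated frame by frame, matching the recursive accumulation of relative displacements described in the abstract.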



Abstract

The invention relates to a scene matching / visual odometry-based inertial integrated navigation method. The method comprises the following steps: calculating the homography matrix of the UAV's real-time aerial image sequence according to the visual odometry principle, and recursively accumulating the relative displacement between two consecutive real-time frames to obtain the present position of the UAV; introducing a FREAK feature-based scene matching algorithm for aided correction, because visual odometry navigation accumulates error over time, and performing high-precision positioning within an adaptation zone to effectively compensate the accumulated error of long-duration visual odometry navigation, scene matching having the advantages of high positioning precision, strong autonomy, and resistance to electromagnetic interference; and establishing the error model of the inertial navigation system and a visual data measurement model, performing Kalman filtering to obtain an optimal estimate, and correcting the inertial navigation system. The method effectively improves navigation precision and helps improve the autonomous flight capability of the UAV.
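As an illustration of the FREAK-based scene-matching correction mentioned above, the sketch below matches the current downward-looking frame against a geo-referenced reference image of the adaptation zone to obtain an absolute position fix. It assumes OpenCV with the opencv-contrib xfeatures2d module; the FAST detector, the RANSAC affine model, and all thresholds are illustrative choices rather than the patent's parameters.

```python
# Hedged sketch of FREAK-based scene matching: locate the real-time
# image inside a geo-referenced reference map to obtain an absolute fix
# that compensates the visual-odometry drift. Illustration only.
import cv2
import numpy as np

def scene_match_position(realtime_gray, reference_gray):
    """Return the (x, y) position of the real-time image centre in reference-map pixels."""
    detector = cv2.FastFeatureDetector_create(threshold=20)
    freak = cv2.xfeatures2d.FREAK_create()  # requires opencv-contrib

    kp1 = detector.detect(realtime_gray, None)
    kp2 = detector.detect(reference_gray, None)
    kp1, des1 = freak.compute(realtime_gray, kp1)
    kp2, des2 = freak.compute(reference_gray, kp2)
    if des1 is None or des2 is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    if len(matches) < 4:
        return None  # too few correspondences for a reliable fix

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # A similarity model is enough to read off where the image centre
    # falls in reference-map coordinates.
    M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        return None
    h, w = realtime_gray.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    ref_x = M[0, 0] * cx + M[0, 1] * cy + M[0, 2]
    ref_y = M[1, 0] * cx + M[1, 1] * cy + M[1, 2]
    return ref_x, ref_y
```

With the reference map geo-referenced, the returned pixel position converts directly into an absolute geographic fix inside the adaptation zone.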

Description

Technical Field

[0001] The invention belongs to the technical field of unmanned aerial vehicle (UAV) navigation and positioning, and relates to an inertial integrated navigation method based on scene matching / visual odometry.

Background Technique

[0002] High-precision, high-dynamic and high-reliability autonomous navigation is one of the key technologies for ensuring that UAVs successfully complete their various tasks, and it is of great significance for enhancing the autonomous behavior of UAVs and improving combat effectiveness. Navigation methods include satellite navigation, radio navigation, inertial navigation, and so on. Among them, inertial navigation (INS) occupies a special position in navigation technology because of its outstanding advantage of high autonomy, and existing UAV navigation systems all take inertial navigation as the core of an integrated navigation system to accomplish autonomous navigation in complex environments.

[0003] At present, the most important UAV navigation approach is the INS/GPS integrated navigation system. However, in unknown and dynamically changing complex environments the GPS signal power is weak, vulnerable to electromagnetic interference, and may even stop working in signal blind areas, causing large errors in the integrated navigation system with unpredictable consequences; and China's Beidou satellite navigation system, still under continuous development, has reliability and accuracy that remain difficult to satisfy high-precision navigation requirements such as military applications.
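The abstract and background above describe fusing the INS with the visual position fixes through a Kalman filter built on an INS error model and a visual measurement model. The following is a loosely-coupled sketch under strong simplifying assumptions: a 4-state error vector (2-D position and velocity errors), constant noise covariances, and a fixed update period, none of which are taken from the patent.

```python
# Minimal sketch of a loosely-coupled Kalman correction: the INS error
# state is propagated, and position fixes from visual odometry / scene
# matching serve as measurements of the INS position error.
# The model, noise values and time step are illustrative assumptions.
import numpy as np

dt = 0.1  # filter period [s], assumption

# State x = [dpx, dpy, dvx, dvy]^T : INS position and velocity errors.
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
Q = np.diag([0.01, 0.01, 0.1, 0.1])        # process noise (assumed)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # position error is measured
R = np.diag([4.0, 4.0])                     # measurement noise (assumed)

def kalman_step(x, P, z=None):
    """One predict/update cycle; z = ins_position - visual_position (2-vector), or None."""
    # Predict the INS error state.
    x = F @ x
    P = F @ P @ F.T + Q
    if z is not None:
        # Update with the visual measurement of the position error.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
    return x, P
```

The estimated error state would then be fed back to correct the INS output, which is the final correction step the abstract describes.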


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/16; G01C21/00
CPC: G01C21/165; G01C25/005
Inventor: 赵春晖, 王荣志, 张天武, 潘泉, 马鑫
Owner: 深圳市欧诺安科技有限公司