
Inertial vision integrated navigation method based on optical flow method

A technology combining integrated navigation with the optical flow method, applied in navigation, mapping, and navigation by speed/acceleration measurement; its stated effect is improved adaptability.

Active Publication Date: 2019-03-29
HARBIN INST OF TECH
7 Cites · 32 Cited by

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve the inaccurate navigation information of the existing inertial navigation; when the existing satellite navigation is indoors or blocked by buildings, shielding interruption or excessive dynamic error will occur; and the existing visual navigation algorithm is relatively complicated and susceptible to In view of the influence of camera pose, illumination change, image noise, etc., an inertial vision integrated navigation method based on optical flow method is proposed


Image

  • Inertial vision integrated navigation method based on optical flow method

Examples


Specific Embodiment 1

[0021] Specific Embodiment 1: an inertial-visual integrated navigation method based on the optical flow method; the specific process is:

[0022] Step 1. Define the world coordinate system O_w x_w y_w z_w, the UAV body coordinate system O_b x_b y_b z_b, the camera coordinate system O_c x_c y_c z_c, the physical imaging coordinate system O_1 xy, and the pixel image-plane coordinate system Ouv;

[0023] Step 2. Mount three sensors, an IMU, a camera, and an altimeter, on the UAV; collect images with the camera and process them with the pyramid LK (Lucas-Kanade) procedure shown in Figure 2 to obtain two-dimensional optical flow data;

[0024] The IMU consists of a gyroscope and an accelerometer;

[0025] Step 3. Convert the two-dimensional optical flow data obtained in Step 2 into three-dimensional navigation information, that is, the position of the drone in the...
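The patent text is truncated here, but a common way to turn pixel-domain optical flow into metric motion for a downward-facing camera is to scale by the altimeter height over the focal length (pinhole similar triangles). The sketch below is an illustration of that scaling, not the patent's exact formulation; the focal length, altitude, frame interval, and flow values are all assumed.

```python
# Illustrative sketch (not the patent's formulation): convert 2-D pixel
# optical flow to metric ground velocity for a downward-facing camera,
# using the pinhole model and the altimeter reading. All values are
# assumptions chosen for the example.
f_px = 500.0           # focal length in pixels (assumed camera intrinsic)
height_m = 10.0        # altitude above ground from the altimeter, meters
dt = 0.05              # time between the two frames, seconds

flow_px = (4.0, -2.0)  # mean optical flow between frames, in pixels

# Similar triangles: a ground displacement d projects to d * f_px / height_m
# pixels, so d = flow * height_m / f_px; dividing by dt gives velocity.
vx = flow_px[0] * height_m / f_px / dt   # m/s along the image x axis
vy = flow_px[1] * height_m / f_px / dt   # m/s along the image y axis
print(vx, vy)  # ≈ 1.6, ≈ -0.8
```

Note the scaling assumes the camera looks straight down at flat ground; attitude compensation (from the IMU) would be needed otherwise.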

Specific Embodiment 2

[0031] Specific Embodiment 2: this embodiment differs from Specific Embodiment 1 in that Step 1 defines the world coordinate system O_w x_w y_w z_w, the UAV body coordinate system O_b x_b y_b z_b, the camera coordinate system O_c x_c y_c z_c, the physical imaging coordinate system O_1 xy, and the pixel image-plane coordinate system Ouv; specifically:

[0032] a. World coordinate system O_w x_w y_w z_w:

[0033] The North-East-Down coordinate system is used as the world coordinate system. The origin O_w is the projection of the UAV's initial position onto the ground; the axis O_w x_w points to geographic north, O_w y_w points to geographic east, and O_w z_w is perpendicular to the Earth's surface and points downward. The world coordinate system is a fixed coordinate system;

[0034] b. UAV body coordinate system O_b x_b y_b z_b:

[0035] The origin O_b of the body coordinate system is taken at the center of mas...
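The frame definitions above fix the chain through which a camera or body measurement must be rotated into the world frame. As a hedged sketch, the snippet below rotates a body-frame vector into the NED world frame with a ZYX (yaw-pitch-roll) Euler rotation; the Euler convention and sample angles are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def body_to_world(yaw, pitch, roll):
    """Rotation matrix R such that v_world = R @ v_body (ZYX Euler, radians).

    Illustrative convention: this is one common aerospace choice, assumed
    here rather than quoted from the patent.
    """
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    return Rz @ Ry @ Rx

# With zero pitch/roll and a 90-degree yaw, the body x axis (the nose)
# should point East, i.e. along y in the NED world frame.
R = body_to_world(np.pi / 2, 0.0, 0.0)
v_world = R @ np.array([1.0, 0.0, 0.0])
print(np.round(v_world, 6))  # ≈ [0, 1, 0]
```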

Specific Embodiment 3

[0043] Specific Embodiment 3: this embodiment differs from Specific Embodiments 1 and 2 in Step 2, in which the UAV carries three sensors (IMU, camera, and altimeter), images are collected by the camera, and two-dimensional optical flow data are obtained with the pyramid LK procedure of Figure 2; the specific process is:

[0044] Step 2-1. Convert the two consecutive color frames collected by the camera into grayscale images;

[0045] Step 2-2. Use the Shi-Tomasi corner detection method to find a certain number of feature points in the previous frame, and refine their coordinates to sub-pixel accuracy;

[0046] Step 2-3. Use the pyramid-based LK algorithm to track the feature points identified in the previous frame and determine their coordinates in the next frame;

[0047] Step 2-4. Finally, according to th...
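Steps 2-1 through 2-3 describe the classic sparse Lucas-Kanade pipeline. As a minimal stand-in (single level, no pyramid, no corner detector), the sketch below solves the LK 2x2 least-squares system at one point of a synthetic image pair with a known sub-pixel translation; it illustrates the core of Step 2-3 only and is not the patent's implementation.

```python
import numpy as np

def lucas_kanade(prev, nxt, x, y, win=7):
    """Single-level LK: estimate (u, v) flow at (x, y) over a square window.

    Solves Ix*u + Iy*v = -It in least squares over the window, i.e. the
    brightness-constancy linearization used inside each pyramid level.
    """
    Iy, Ix = np.gradient(prev)          # spatial gradients (axis 0 = y)
    It = nxt - prev                     # temporal derivative between frames
    s = np.s_[y - win:y + win + 1, x - win:x + win + 1]
    A = np.stack([Ix[s].ravel(), Iy[s].ravel()], axis=1)
    b = -It[s].ravel()
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Synthetic frames: a smooth Gaussian blob translated by (0.4, 0.3) pixels.
yy, xx = np.mgrid[0:64, 0:64].astype(float)

def blob(cx, cy):
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 30.0)

prev, nxt = blob(32.0, 32.0), blob(32.4, 32.3)
u, v = lucas_kanade(prev, nxt, 32, 32)
print(u, v)  # ≈ 0.4, ≈ 0.3
```

A pyramid version would downsample both frames, run this solve coarse-to-fine, and warp between levels so that larger displacements stay within the linearization's range.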



Abstract

The invention discloses an inertial-visual integrated navigation method based on the optical flow method. It addresses three problems: the cumulative errors of existing inertial navigation over long-term operation make its navigation information inaccurate; existing satellite navigation suffers shielding interruptions or excessive dynamic error indoors or when blocked by buildings; and existing visual navigation algorithms are complex and susceptible to camera pose, illumination changes, image noise, and the like. The method comprises the following processes: 1. define the coordinate systems; 2. mount three sensors, an IMU (Inertial Measurement Unit), a camera, and an altimeter, on the drone and obtain two-dimensional optical flow data; 3. obtain the position of the drone in the world coordinate system; 4. perform inertial navigation from the IMU measurements to calculate the position and attitude of the drone in the world coordinate system; and 5. obtain the fused position and attitude of the drone in the world coordinate system. The method is used in the technical field of autonomous navigation of drones.
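The abstract's steps 4 and 5 (IMU dead reckoning, then fusion with the optical-flow position) are truncated elsewhere on this page, so the patent's exact fusion scheme is not reproducible here. As one common illustrative choice, the toy below dead-reckons a 1-D position from a biased accelerometer and corrects it each step with an optical-flow position fix via a complementary filter; all gains and values are assumptions.

```python
# Hedged sketch of steps 4-5: dead-reckon position from biased IMU
# acceleration, then blend in the optical-flow position estimate with a
# complementary filter. Illustrative stand-in only, not the patent's fusion.
dt = 0.02      # IMU sample period, seconds (assumed)
alpha = 0.95   # weight kept on the IMU prediction each step (assumed gain)
bias = 0.3     # accelerometer bias, m/s^2, causing dead-reckoning drift

vel_true, pos_true = 1.0, 0.0      # constant-velocity ground truth
vel_imu, pos_imu = 1.0, 0.0        # pure dead reckoning (drifts with bias)
vel_fused, pos_fused = 1.0, 0.0    # dead reckoning + optical-flow correction

for _ in range(500):               # 10 s of flight
    accel_meas = 0.0 + bias        # true acceleration is zero; bias remains
    pos_true += vel_true * dt

    vel_imu += accel_meas * dt     # step 4: IMU-only integration
    pos_imu += vel_imu * dt

    vel_fused += accel_meas * dt
    pos_fused += vel_fused * dt
    flow_pos = pos_true            # step 3 output: optical-flow fix (ideal here)
    pos_fused = alpha * pos_fused + (1 - alpha) * flow_pos  # step 5: blend

drift_imu = abs(pos_imu - pos_true)       # grows quadratically with time
drift_fused = abs(pos_fused - pos_true)   # bounded by the flow correction
print(round(drift_imu, 2), round(drift_fused, 2))
```

The point of the toy is the qualitative behavior: the uncorrected IMU position drifts quadratically under a constant bias, while the flow-corrected estimate stays bounded.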

Description

Technical field

[0001] The invention relates to the technical field of autonomous navigation of unmanned aerial vehicles, in particular to the autonomous navigation of small UAVs in complex, GPS-denied environments.

Background technique

[0002] At present, the commonly used navigation methods for micro-UAVs are inertial navigation, satellite navigation, and visual navigation. Once initial conditions are given, inertial navigation is completely autonomous, relying on no external signals and unaffected by interference from the external environment; however, cumulative errors build up during long-term operation and make the navigation information inaccurate. The most common satellite navigation is GPS, which offers global coverage and high precision. However, GPS is strongly affected by human factors, and shielding interruptions or excessive dynamic errors occur indoors or when buildings block the signal. ...


Application Information

IPC (8): G01C21/00; G01C21/16; G01C11/06
CPC: G01C11/06; G01C21/005; G01C21/165
Inventors: 白成超, 郭继峰, 张文苑
Owner: HARBIN INST OF TECH