Inertial vision integrated navigation method based on optical flow method

A technology combining integrated navigation and the optical flow method, applied in navigation, mapping, and navigation through speed/acceleration measurement; it has the effect of improving adaptability.

Active Publication Date: 2019-03-29
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve the following problems: existing inertial navigation accumulates errors during long-term operation and yields inaccurate navigation information; existing satellite navigation suffers shielding interruption or excessive dynamic error indoors or when blocked by buildings; and existing visual navigation algorithms are complex and susceptible to camera pose, illumination changes, image noise, and the like.


Image

  • Inertial vision integrated navigation method based on optical flow method (drawings referenced in the description)

Examples


Example Embodiment

[0021] Specific embodiment 1: The specific process of the inertial vision integrated navigation method based on the optical flow method of this embodiment is as follows:

[0022] Step 1. Define the world coordinate system O_wX_wY_wZ_w, the UAV body coordinate system O_bX_bY_bZ_b, the camera coordinate system O_cX_cY_cZ_c, the physical imaging coordinate system O_1-xy, and the pixel image plane coordinate system O-uv;

[0023] Step 2. Mount three sensors on the UAV: an IMU, a camera, and an altimeter; collect images with the camera and, following the process shown in Figure 2, perform the pyramid LK (Lucas-Kanade) calculation to obtain two-dimensional optical flow data;

[0024] The IMU consists of a gyroscope and an accelerometer;

[0025] Step 3. Convert the two-dimensional optical flow data obtained in Step 2 into three-dimensional navigation information, that is, the position of the UAV in the world coordinate system (an illustrative sketch follows after Step 4);

[0026] Step 4. Perform inertial navigation based on the IMU measurement information to calculate the position and attitude of the UAV in the world coordinate system;

[0027] Step 5. Fuse the results of Steps 3 and 4 to obtain the position and attitude of the UAV in the world coordinate system.
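Step 3 recovers metric motion by scaling the measured pixel flow with the altimeter height. The following is a minimal sketch of that idea, assuming a downward-looking camera over level ground and a pinhole model; the function names and parameters are illustrative assumptions, and attitude/rotation compensation (which the full method would need) is omitted.

```python
import numpy as np

def flow_to_velocity(mean_flow_px, dt, height_m, focal_px):
    # mean_flow_px: average (du, dv) pixel displacement from the pyramid LK step
    # dt:           time between the two frames [s]
    # height_m:     altimeter height above the ground [m]
    # focal_px:     camera focal length [pixels]
    flow_rate = np.asarray(mean_flow_px, dtype=float) / dt  # pixels per second
    return flow_rate * height_m / focal_px                  # metres per second (camera frame)

def dead_reckon(pos_w, vel_w, dt):
    # Integrate the (rotated-to-world) horizontal velocity to update the UAV position.
    return np.asarray(pos_w, dtype=float) + np.asarray(vel_w, dtype=float) * dt
```

For instance, a mean flow of 20 pixels per frame at 30 fps, a height of 2 m, and a 600-pixel focal length would give roughly 20 * 30 * 2 / 600 = 2 m/s.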

Example Embodiment

[0031] Specific embodiment 2: This embodiment differs from specific embodiment 1 in that, in Step 1, the world coordinate system O_wX_wY_wZ_w, the UAV body coordinate system O_bX_bY_bZ_b, the camera coordinate system O_cX_cY_cZ_c, the physical imaging coordinate system O_1-xy, and the pixel image plane coordinate system O-uv are defined as follows (a small frame-conversion sketch follows after item b):

[0032] a. World coordinate system O_wX_wY_wZ_w:

[0033] The North-East-Down (NED) coordinate system is used as the world coordinate system. The origin O_w of the world coordinate system is the projection of the UAV's initial position onto the ground; the axis O_wX_w points to geographic north, O_wY_w points east, and O_wZ_w is perpendicular to the ground and points downward. The world coordinate system is a fixed coordinate system;

[0034] b. UAV body coordinate system O_bX_bY_bZ_b:

[0035] The origin O_b of the body coordinate system is taken at the center of mass of the UAV, O_bX_b...
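To make the relation between these frames concrete, here is a small pinhole-model sketch of how a pixel in the O-uv plane maps to a ray in the camera frame and how a camera-frame point chains through the body frame into the world (NED) frame. The intrinsics (fx, fy, cx, cy) and the mounting/attitude rotations are hypothetical placeholders, not values from the patent.

```python
import numpy as np

def pixel_to_camera_ray(u, v, fx, fy, cx, cy):
    # Map a pixel (u, v) in the pixel image plane O-uv to a normalized
    # direction in the camera frame O_cX_cY_cZ_c (pinhole camera assumption).
    return np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

def camera_to_world(p_c, R_bc, R_wb, t_w):
    # Chain camera -> body -> world (NED): R_bc is the camera-to-body mounting
    # rotation, R_wb the body-to-world attitude from the IMU, and t_w the UAV
    # position in the world frame (all assumed known here).
    return R_wb @ (R_bc @ np.asarray(p_c, dtype=float)) + np.asarray(t_w, dtype=float)
```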

Example Embodiment

[0043] Specific embodiment 3: This embodiment differs from specific embodiments 1 or 2 in that, in Step 2, the three sensors (IMU, camera, and altimeter) are mounted on the UAV, images are collected by the camera, and the pyramid LK calculation is performed on the collected images according to the process shown in Figure 2 to obtain two-dimensional optical flow data (an illustrative sketch follows after the step list below); the specific process is:

[0044] Step 2-1. Convert the two consecutive color image frames collected by the camera (the previous frame and the next frame) into grayscale images;

[0045] Step 2-2. Use the Shi-Tomasi corner detection method to find a certain number of feature points in the previous frame, and refine the coordinates of these feature points to sub-pixel accuracy;

[0046] Step 2-3. Use the pyramid-based LK algorithm to track the feature points identified in the previous frame and determine their coordinates in the next frame;

[0047] Step 2-4. Finally, according...
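Steps 2-1 through 2-4 map closely onto standard OpenCV calls. The following sketch shows one possible realization; the corner count, termination criteria, window size, and pyramid depth are illustrative assumptions rather than the patent's prescribed settings.

```python
import cv2
import numpy as np

def pyramid_lk_flow(prev_bgr, next_bgr):
    # Step 2-1: convert both color frames to grayscale.
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_bgr, cv2.COLOR_BGR2GRAY)

    # Step 2-2: Shi-Tomasi corners in the previous frame, refined to sub-pixel accuracy.
    corners = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                      qualityLevel=0.01, minDistance=10)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01)
    corners = cv2.cornerSubPix(prev_gray, corners, (5, 5), (-1, -1), criteria)

    # Step 2-3: pyramid LK tracking of those corners into the next frame.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, corners, None,
        winSize=(21, 21), maxLevel=3)

    # Step 2-4 (sketch): pixel displacements of successfully tracked points
    # give the two-dimensional optical flow.
    good = status.ravel() == 1
    flow = (next_pts[good] - corners[good]).reshape(-1, 2)
    return flow
```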



Abstract

The invention discloses an inertial vision integrated navigation method based on an optical flow method, and relates to inertial vision integrated navigation. Its purpose is to solve the problems that existing inertial navigation accumulates errors during long-term operation and yields inaccurate navigation information, that existing satellite navigation suffers shielding interruption or excessive dynamic error indoors or when shielded by buildings, and that existing visual navigation algorithms are complex and susceptible to camera pose, illumination changes, image noise, and the like. The method comprises the following steps: 1, defining the coordinate systems; 2, mounting three sensors, an IMU (Inertial Measurement Unit), a camera, and an altimeter, on the drone to obtain two-dimensional optical flow data; 3, obtaining the position of the drone in the world coordinate system; 4, performing inertial navigation according to the IMU measurement information to calculate the position and attitude of the drone in the world coordinate system; and 5, fusing the two results to obtain the position and attitude of the drone in the world coordinate system. The method is used in the technical field of autonomous navigation of drones.
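The fusion in step 5 is not spelled out in this extract. Purely as a placeholder illustration (a fixed-weight complementary blend, whereas the actual method might use a Kalman filter or a similar estimator), it could look like the following:

```python
import numpy as np

def fuse_position(pos_ins, pos_flow, w_flow=0.3):
    # Blend the INS-propagated position with the optical-flow-derived position.
    # w_flow is an assumed constant weight; a real fusion would typically use a
    # filter whose gain reflects the two sensors' error statistics.
    pos_ins = np.asarray(pos_ins, dtype=float)
    pos_flow = np.asarray(pos_flow, dtype=float)
    return (1.0 - w_flow) * pos_ins + w_flow * pos_flow
```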

Description

Technical field

[0001] The invention relates to the technical field of autonomous navigation of unmanned aerial vehicles, and in particular to the autonomous navigation of small unmanned aerial vehicles in complex, GPS-denied environments.

Background technique

[0002] At present, the commonly used navigation methods for micro-UAVs include inertial navigation, satellite navigation, and visual navigation. After the initial conditions are given, inertial navigation can realize completely autonomous navigation without relying on external signals and without interference from the external environment, but it accumulates errors during long-term operation, which makes the navigation information inaccurate. The most common form of satellite navigation is GPS navigation, which offers global coverage and high precision. However, GPS navigation is greatly affected by human factors, and shielding interruption or excessive dynamic error will occur indoors or when buildings block the signal. ...


Application Information

IPC(8): G01C21/00, G01C21/16, G01C11/06
CPC: G01C11/06, G01C21/005, G01C21/165
Inventors: 白成超, 郭继峰, 张文苑
Owner: HARBIN INST OF TECH