Visual navigation/inertial navigation full combination method

A visual navigation and inertial navigation technology, applied in the field of fully combined visual navigation/inertial navigation, which addresses the problems that existing combination methods do not consider the prior information on the object-space coordinates of image feature points and therefore cannot achieve an optimal combination effect.

Inactive Publication Date: 2013-12-04
TONGJI UNIV

AI Technical Summary

Problems solved by technology

A fusion-type combination method can make better use of the information from both the visual system and the inertial navigation system, so that each can be corrected by the other. However, this method does not take into account the prior information on the object-space coordinates of the image feature points, so the combination effect cannot reach its optimum.


Examples


Embodiment

[0085] As shown in Figure 1, the visual navigation/inertial navigation full combination method comprises the following steps:

[0086] 1) Visual navigation solution

[0087] Proceed as follows:

[0088] (1) Initialization of the camera's exterior orientation elements. Typically, several control points are laid out around the starting position of the visual navigation, and the camera's exterior orientation elements are obtained by space resection.
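
A minimal sketch of this initialization step is given below. OpenCV's solvePnP is used as a stand-in for a rigorous photogrammetric space resection, and the control-point coordinates, camera intrinsics and ground-truth pose are made-up placeholders, not values from the patent.

```python
# Sketch of step (1): initializing the exterior orientation elements from
# control points. solvePnP is a stand-in for a rigorous space resection;
# coordinates, intrinsics and the ground-truth pose are placeholders.
import numpy as np
import cv2

# Object-space coordinates (m) of control points near the starting position.
object_points = np.array([
    [10.0, 20.0, 1.5],
    [14.0, 18.5, 3.2],
    [ 8.5, 24.0, 0.8],
    [12.0, 26.0, 4.5],
    [ 6.0, 19.0, 2.6],
    [15.5, 23.0, 5.1],
], dtype=np.float64)

# Calibrated intrinsics (focal length, principal point); distortion ignored.
K = np.array([[800.0,   0.0, 512.0],
              [  0.0, 800.0, 384.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)

# A made-up ground-truth pose, used only to synthesize consistent image
# measurements for this demonstration.
rvec_true = np.array([0.05, -0.02, 0.01])
tvec_true = np.array([-11.0, -21.0, 20.0])
image_points, _ = cv2.projectPoints(object_points, rvec_true, tvec_true, K, dist)

# Resection: recover the camera's attitude and position at the starting epoch.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)              # rotation: object frame -> camera frame
camera_position = -R.T @ tvec           # camera centre in object space
print("initial position:", camera_position.ravel())
print("initial attitude (Rodrigues vector):", rvec.ravel())
```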

[0089] (2) As the carrier moves, the stereo camera acquires images at the current epoch for feature point extraction and matching. Feature point matching involves two aspects: matching the feature points of the left and right images at the current epoch, and matching the current left image against the left image of the previous epoch.
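
The sketch below illustrates these two kinds of matching. ORB features with brute-force Hamming matching are an assumption made for the illustration, and the image file names are placeholders; the patent does not prescribe a specific detector or matcher here.

```python
# Sketch of step (2): feature extraction and the two kinds of matching.
import cv2

orb = cv2.ORB_create(nfeatures=2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def detect(image_path):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    assert img is not None, f"could not read {image_path}"
    return orb.detectAndCompute(img, None)

kp_l_prev, des_l_prev = detect("left_prev.png")    # left image, previous epoch
kp_l_curr, des_l_curr = detect("left_curr.png")    # left image, current epoch
kp_r_curr, des_r_curr = detect("right_curr.png")   # right image, current epoch

# (a) Stereo matching: left vs. right at the current epoch (provides the
#     object-space coordinates of the feature points by intersection).
stereo_matches = matcher.match(des_l_curr, des_r_curr)

# (b) Temporal matching: current left vs. previous left (carries the motion
#     of the camera between the two epochs).
temporal_matches = matcher.match(des_l_curr, des_l_prev)

# Features present in both match sets tie the previous solution, the new
# stereo geometry and the unknown exterior orientation together.
tracked = ({m.queryIdx for m in stereo_matches} &
           {m.queryIdx for m in temporal_matches})
print(len(stereo_matches), len(temporal_matches), len(tracked))
```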

[0090] (3) Based on the collinearity equations, observation equations are established for all feature points...
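
For reference, the collinearity equations on which these observation equations are built have the standard photogrammetric form

$$
x - x_0 = -f\,\frac{a_1(X - X_S) + b_1(Y - Y_S) + c_1(Z - Z_S)}{a_3(X - X_S) + b_3(Y - Y_S) + c_3(Z - Z_S)}, \qquad
y - y_0 = -f\,\frac{a_2(X - X_S) + b_2(Y - Y_S) + c_2(Z - Z_S)}{a_3(X - X_S) + b_3(Y - Y_S) + c_3(Z - Z_S)},
$$

where $(x, y)$ are the image coordinates of a feature point, $(x_0, y_0, f)$ the interior orientation elements, $(X, Y, Z)$ the object-space coordinates of the point, $(X_S, Y_S, Z_S)$ the camera position, and $a_i, b_i, c_i$ the elements of the rotation matrix built from the attitude angles. Linearizing these equations with respect to the unknowns yields the observation equations that are solved by least-squares adjustment.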



Abstract

The invention relates to a visual navigation/inertial navigation full combination method. The method comprises the following steps. First, visual navigation solution: observation equations are formed based on the collinearity equations, the carrier position and attitude parameters are obtained by least-squares adjustment, and the variance-covariance matrices of the parameters are calculated. Second, inertial navigation solution: navigation computation is carried out in the local horizontal frame, the carrier position, velocity and attitude parameters at each epoch are obtained, and the variance-covariance matrices of the parameters are calculated. Third, correction of the inertial navigation system by the visual system: through Kalman filtering, the navigation parameter errors and device errors of the inertial navigation system are estimated and then compensated and fed back, giving optimal estimates of all parameters of the inertial navigation system. Fourth, correction of the visual system by the inertial navigation system: all parameters of the visual system are corrected through sequential adjustment. Compared with the prior art, the method has the advantages of rigorous theory, stable performance, and high efficiency.
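
To make the third step more concrete, the sketch below shows a generic error-state Kalman filter in which the difference between the inertial and visual solutions drives the estimation of the INS errors. The 15-state layout, transition matrix and noise values are illustrative assumptions, not the filter design claimed by the patent.

```python
# Generic error-state Kalman filter: the position/attitude difference between
# the inertial and visual solutions is the measurement used to estimate the
# INS navigation-parameter and device errors, which are then compensated and
# fed back. F, Q, H, R and the state layout are illustrative assumptions.
import numpy as np

n = 15                       # pos (0:3), vel (3:6), att (6:9), gyro bias (9:12), accel bias (12:15) errors
x = np.zeros(n)              # error-state estimate
P = np.eye(n) * 1e-2         # error-state covariance

F = np.eye(n)                # placeholder discrete-time transition matrix
Q = np.eye(n) * 1e-6         # process noise
H = np.zeros((6, n))         # measurement maps to position and attitude errors
H[0:3, 0:3] = np.eye(3)
H[3:6, 6:9] = np.eye(3)
R = np.eye(6) * 1e-3         # measurement noise, e.g. from the visual variance-covariance matrix

def predict(x, P):
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    # z = (inertial solution) - (visual solution) for position and attitude
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(n) - K @ H) @ P
    return x, P

x, P = predict(x, P)
x, P = update(x, P, z=np.full(6, 0.01))   # dummy measurement for demonstration
# The estimated errors in x would then be removed from the INS output
# (compensation) and the error state reset (feedback correction).
print(x[:3], x[6:9])
```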

Description

technical field

[0001] The invention relates to a method for combining a visual navigation system and an inertial navigation system, and in particular to a visual navigation/inertial navigation full combination method.

Background technique

[0002] Visual navigation, also known as visual odometry (VO), works on the following basic principle: as the carrier moves forward, a stereo camera acquires images of the surrounding scene; feature points are matched between the left and right images and tracked between consecutive frames; observation equations are then established based on the collinearity equations, and the spatial position and attitude of the camera are computed by least-squares adjustment. Visual navigation is a high-precision autonomous navigation method, but as a dead-reckoning method its navigation error accumulates over time, and the stability of the algorithm has always b...
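
As a simplified illustration of the least-squares pose computation described above, the sketch below minimizes collinearity-style reprojection residuals over the six exterior orientation elements. The projection convention, intrinsics and synthetic data are assumptions made for demonstration only.

```python
# Simplified least-squares pose estimation in the spirit of visual odometry:
# reprojection residuals from a collinearity-style projection model are
# minimized over the six exterior orientation elements.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

f, px, py = 800.0, 512.0, 384.0           # interior orientation (pixels), placeholders

def project(pose, X):
    """Project object points X (N x 3) with pose = (3 attitude angles, camera position)."""
    angles, cam_pos = pose[:3], pose[3:]
    R = Rotation.from_euler("xyz", angles).as_matrix()
    Xc = (X - cam_pos) @ R.T              # object frame -> camera frame
    u = px - f * Xc[:, 0] / Xc[:, 2]
    v = py - f * Xc[:, 1] / Xc[:, 2]
    return np.column_stack([u, v])

def residuals(pose, X, obs):
    return (project(pose, X) - obs).ravel()

# Synthetic feature points and image observations generated from a known pose.
rng = np.random.default_rng(0)
X = rng.uniform([-5.0, -5.0, 10.0], [5.0, 5.0, 30.0], size=(40, 3))
true_pose = np.array([0.02, -0.01, 0.03, 0.5, -0.3, 0.1])
obs = project(true_pose, X) + rng.normal(scale=0.3, size=(40, 2))

# Least-squares adjustment starting from an approximate (here zero) pose.
sol = least_squares(residuals, x0=np.zeros(6), args=(X, obs))
print("estimated exterior orientation:", sol.x)
```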


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/16, G01C21/00
Inventors: 刘春, 周发根, 吴杭彬, 姚连壁, 简志伟
Owner: TONGJI UNIV