
Positioning method and system based on visual inertial navigation information fusion

A vision and depth vision technology, applied in the field of sensor fusion, can solve problems such as difficult operation, low algorithm robustness, and inability to handle loopbacks, etc., and achieve the effects of easy use, convenient assembly and disassembly, and cost reduction

Active Publication Date: 2018-04-03
NORTHEASTERN UNIV
Problems solved by technology

[0003] Filter-based methods, whether loosely or tightly coupled, share a common problem: they cannot effectively eliminate accumulated error and cannot handle loop closure.
[0004] For this reason, optimization-based methods have been proposed in the industry. Such methods can overcome the accumulated-error and loop-closure defects described above, but their robustness is relatively low, which hinders wide adoption.
[0005] In particular, extrinsic calibration in current optimization-based methods relies mainly on the kalibr toolbox, which supports only offline calibration and requires a dedicated calibration board; it is inconvenient to operate and cannot be used directly online.



Embodiment Construction

[0065] To better explain the present invention and facilitate understanding, the invention will be described in detail below through specific embodiments in conjunction with the accompanying drawings.

[0066] In the following description, various aspects of the present invention will be described. However, those skilled in the art may practice the invention using only some, or all, of the structures and processes described herein. For clarity of explanation, specific numbers, arrangements, and sequences are set forth, but it will be apparent that the invention may be practiced without these specific details. In other instances, well-known features have not been described in detail so as not to obscure the invention.

[0067] At present, methods based on nonlinear optimization compute the measurement residual of the inertial navigation unit and the reprojection error of the visual sensor separately, and obtain the optimal estimate of the s...
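To make the idea concrete, the sketch below shows how an optimization-based method stacks an inertial residual and a visual residual into a single least-squares problem. This is a deliberately minimal 1-D toy, not the patent's formulation: the values `imu_pred`, `vis_meas`, and the weights are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy illustration of tightly-coupled optimization-based fusion:
# jointly minimize an IMU residual and a visual residual over a 1-D "pose" x.
imu_pred = 2.0           # position predicted by IMU integration (assumed value)
vis_meas = 2.4           # position implied by a visual observation (assumed value)
w_imu, w_vis = 1.0, 2.0  # information weights (assumed values)

def residuals(x):
    # Both residual types are stacked into one least-squares problem,
    # so the optimizer balances them according to their weights.
    return np.array([w_imu * (x[0] - imu_pred),
                     w_vis * (x[0] - vis_meas)])

sol = least_squares(residuals, x0=[0.0])
# The optimum is the information-weighted average of the two measurements.
```

The closed-form optimum here is (w_imu²·imu_pred + w_vis²·vis_meas) / (w_imu² + w_vis²) = 2.32, which is what the solver converges to.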



Abstract

The invention discloses a positioning method and system based on visual-inertial navigation information fusion. The method comprises the following steps: the acquired sensor information is preprocessed, where the sensor information comprises an RGB image and depth image information from a depth vision sensor together with IMU (inertial measurement unit) data; the external parameters of the system to which the depth vision sensor and the IMU belong are acquired; the preprocessed sensor information and external parameters are processed with an IMU pre-integration model and a depth camera model to obtain pose information; and the pose information is corrected on the basis of loop detection to obtain corrected, globally consistent pose information. The method is robust during positioning, and the positioning accuracy is improved.
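The four steps in the abstract can be sketched as a data-flow skeleton. All function bodies below are placeholders (assumptions); only the ordering of the stages follows the text of the abstract.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    rgb: object      # RGB image from the depth vision sensor
    depth: object    # depth image
    imu: list        # IMU measurements between frames

def preprocess(frame):                 # step 1: preprocess sensor information
    return frame

def get_extrinsics():                  # step 2: camera-IMU external parameters
    return {"R": [[1, 0, 0], [0, 1, 0], [0, 0, 1]], "t": [0, 0, 0]}

def estimate_pose(frame, extrinsics):  # step 3: IMU pre-integration + depth camera model
    return {"position": [0, 0, 0], "orientation": [1, 0, 0, 0]}

def loop_correct(pose):                # step 4: loop-detection-based correction
    return pose

def localize(frame):
    frame = preprocess(frame)
    pose = estimate_pose(frame, get_extrinsics())
    return loop_correct(pose)
```

Keeping the loop-closure correction as a separate final stage mirrors the abstract's claim that pose estimates are first produced locally and then made globally consistent.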

Description

technical field
[0001] The invention relates to sensor fusion technology, and in particular to a positioning method and system based on visual-inertial navigation information fusion.
Background technique
[0002] At present, visual-inertial fusion technology is widely used in 3D reconstruction, positioning and navigation of unmanned ground vehicles and drones, and autonomous driving, aiming to provide real-time, robust, and accurate position and attitude. The mainstream visual-inertial fusion technology is the filter-based method, which mainly uses the Kalman filter and its variants: the kinematic model of the inertial measurement unit yields the prior distribution of the system state vector, and the Kalman gain is then applied with the observation model of the visual sensor to obtain the posterior distribution of the system state vector. In the specific processing, according to whe...
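The filter-based scheme described above can be sketched as one predict/update step: the motion (IMU kinematic) model supplies the prior, and a visual position observation updates it via the Kalman gain. This is a minimal linear Kalman filter with illustrative numbers, not the patent's (or any production VIO system's) filter design.

```python
import numpy as np

def kf_step(x, P, F, Q, z, H, R):
    # Predict: prior distribution from the motion (IMU kinematic) model.
    x_prior = F @ x
    P_prior = F @ P @ F.T + Q
    # Update: Kalman gain computed from the visual observation model.
    S = H @ P_prior @ H.T + R
    K = P_prior @ H.T @ np.linalg.inv(S)
    x_post = x_prior + K @ (z - H @ x_prior)
    P_post = (np.eye(len(x)) - K @ H) @ P_prior
    return x_post, P_post

# State [position, velocity] with dt = 1; all matrices are assumed values.
F = np.array([[1.0, 1.0], [0.0, 1.0]])  # constant-velocity kinematic model
H = np.array([[1.0, 0.0]])              # the camera observes position only
Q = 0.01 * np.eye(2)                    # process noise
R = np.array([[0.1]])                   # visual measurement noise
x, P = np.zeros(2), np.eye(2)
x, P = kf_step(x, P, F, Q, np.array([1.0]), H, R)
# The posterior position is pulled most of the way toward the observation,
# and the velocity is corrected through the cross-covariance.
```

The loose/tight coupling distinction the text alludes to is about what enters `z` and `H`: a loosely coupled filter feeds in a pose already computed by the vision front end, while a tightly coupled one observes raw visual features directly.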

Claims

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
Login to View More

Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/16, G01C21/20
CPC: G01C21/165, G01C21/20
Inventor: Liu Tengfei (刘腾飞), Zhang Peng (张鹏)
Owner: NORTHEASTERN UNIV