
Visual inertial navigation fusion SLAM method based on Runge-Kutta4 improved pre-integration

A pre-integration and visual technology, used in navigation and image analysis through velocity/acceleration measurement, which can solve problems such as poor performance, poor positioning accuracy, and poor robustness.

Pending Publication Date: 2021-01-19
XIDIAN UNIV +1
10 Cites | 23 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0007] (2) Poor positioning accuracy and robustness under extreme conditions, such as a lack of environmental information and rapid carrier movement.
The method keeps user waiting time short, and also solves the problem of poor system performance in extreme environments such as missing environmental information, lighting changes, and fast motion.




Embodiment Construction

[0091] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0092] Aiming at the problems existing in the prior art, the present invention provides a visual inertial navigation fusion SLAM method based on Runge-Kutta4 improved pre-integration. IMU pre-integration provides IMU constraints between two keyframes; the system state is then estimated by minimizing the sum of the visual reprojection error and the IMU pre-integration error, and finally the system state and map point locations are optimized within a sliding window. In order to improve the precision of the pre-integration stage, the present invention innovatively proposes using the fourth-order Runge-Kutta algorithm...
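To illustrate the pre-integration idea, here is a toy one-dimensional sketch (an assumption for exposition, not the patent's implementation): a classical fourth-order Runge-Kutta step replaces the Euler/midpoint integration commonly used to accumulate position and velocity deltas between keyframes. Real IMU pre-integration additionally propagates rotation on SO(3) and tracks bias Jacobians and covariances.

```python
import numpy as np

def rk4_step(f, y, t, dt):
    """One classical fourth-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def preintegrate_1d(acc_samples, dt):
    """Toy 1-D pre-integration: accumulate [delta_p, delta_v] from
    accelerometer samples (gravity and bias assumed already removed)."""
    state = np.zeros(2)  # [delta_position, delta_velocity]
    for a in acc_samples:
        # dp/dt = v, dv/dt = a (acceleration held constant per sample)
        deriv = lambda t, y, a=a: np.array([y[1], a])
        state = rk4_step(deriv, state, 0.0, dt)
    return state
```

For a constant acceleration of 2 m/s² over 1 s (ten 0.1 s samples), this recovers delta-v = 2 m/s and delta-p = 1 m exactly; the advantage of RK4 over lower-order schemes appears when acceleration varies within the integration interval.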


PUM

No PUM

Abstract

The invention belongs to the technical field of unmanned driving and discloses a visual inertial navigation fusion SLAM method based on Runge-Kutta4 improved pre-integration, used for solving the technical problems of low positioning precision, poor robustness, and the like that the existing visual ORB-SLAM2 method exhibits in situations such as rapid movement and sparse environmental features. The method comprises the following steps: inputting binocular image pair information; inputting IMU information; preprocessing the binocular images; pre-integrating the IMU measurements by utilizing a Runge-Kutta4 algorithm; initializing the system; estimating the joint state; locally optimizing a sliding window; and carrying out loop detection and global pose graph optimization. The method can effectively perform positioning estimation and map creation in scenes of different difficulty levels, and compared with the original visual ORB-SLAM2 method, it has higher positioning precision; it can be applied to technical fields such as unmanned system navigation and virtual reality.
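The joint state estimation step above minimizes the sum of the visual reprojection error and the IMU pre-integration error. A minimal sketch of that combined objective follows (the function name and uniform weights are illustrative assumptions; a real system weights each residual by its inverse covariance and minimizes with a nonlinear least-squares solver over the sliding window):

```python
import numpy as np

def joint_cost(reproj_residuals, imu_residuals, w_visual=1.0, w_imu=1.0):
    """Sum of squared visual reprojection residuals plus squared IMU
    pre-integration residuals, with scalar weights standing in for the
    covariance weighting used in practice."""
    e_vis = sum(float(np.dot(r, r)) for r in reproj_residuals)
    e_imu = sum(float(np.dot(r, r)) for r in imu_residuals)
    return w_visual * e_vis + w_imu * e_imu
```

For example, one 2-D reprojection residual of (3, 4) and one IMU residual of (1, 0, 0) give a cost of 25 + 1 = 26 under unit weights.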

Description

Technical field

[0001] The invention belongs to the technical field of unmanned driving, and in particular relates to a visual inertial navigation fusion SLAM method based on Runge-Kutta4 improved pre-integration.

Background technique

[0002] At present, unmanned systems such as drones and mobile robots with autonomous navigation functions are gradually being adopted in various fields. Among the many key technologies involved, Simultaneous Localization And Mapping (SLAM) is the foundation and core of realizing autonomous navigation for unmanned systems. Visual SLAM relies on the image sequence captured by the camera to estimate the pose of the unmanned system and create a map of the environment; however, when the unmanned system moves too fast or environmental features are missing, its accuracy and robustness drop sharply, sometimes even causing localization to fail. With continuous advancements in hardware design and fabrication, low-cost lightweight MEMS IMUs have become u...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/16; G01C21/20; G06F17/11; G06F17/16; G06K9/46; G06K9/62; G06T7/50; G06T7/62; G06T7/80
CPC: G01C21/165; G01C21/20; G06F17/16; G06F17/11; G06T7/80; G06T7/62; G06T7/50; G06V10/462; G06V10/757; G06F18/214
Inventors: 崔家山, 冯冬竹, 焦子涵, 付秋军, 赤丰华, 滕锐, 张舫瑞, 田淇臣
Owner: XIDIAN UNIV