Monocular robust visual-inertial tightly coupled localization method

A positioning method based on tight visual-inertial coupling, applied in navigation, instrumentation, mapping and similar fields. It addresses the problems of existing systems: the inability to achieve real-time robust positioning, low initialization accuracy, and poor positioning performance.

Active Publication Date: 2019-07-19
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

The OKVIS framework (Leutenegger S, Furgale P, Rabaud V, et al. Keyframe-based visual-inertial SLAM using nonlinear optimization [C]. Proceedings of Robotics: Science and Systems (RSS), 2013.) proposed the front end of the visual-inertial fusion positioning framework model and realizes the tight coupling of visual and inertial data, but it does not recover quantities such as the system scale and the gravitational acceleration, and it contains no pre-integration framework, so its positioning accuracy and robustness are poor. The monocular visual-inertial positioning system based on ORB_SLAM2 (Mur-Artal R, Tardós J D. Visual-inertial monocular SLAM with map reuse [J]. IEEE Robotics and Automation Letters, 2017, 2(2): 796-803.) uses a manifold-based pre-integration model; the pre-…

Method used



Examples


Embodiment 1

[0190] The method proposed by the present invention can in principle be applied to any existing traditional visual-inertial fusion positioning framework (VIO). Such a framework comprises two modules, a front-end and a back-end: the front-end estimates the camera motion between adjacent images from the IMU and image data, and the back-end receives the camera motion information estimated by the front-end at different times and performs local and global optimization to obtain a globally consistent trajectory.
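As an illustration of this two-module structure (not the patent's actual implementation), the following minimal Python sketch separates a front-end that produces relative poses from a back-end that chains and optimizes them; all class and method names here are hypothetical.

```python
import numpy as np

class FrontEnd:
    """Estimates relative camera motion between adjacent images
    from IMU and image data (placeholder logic)."""
    def process(self, image, imu_measurements):
        # A real front end would track features and integrate IMU
        # readings; here we return an identity 4x4 relative pose.
        return np.eye(4)

class BackEnd:
    """Receives relative motions at different times and maintains
    the trajectory; local/global optimization is stubbed out."""
    def __init__(self):
        self.poses = [np.eye(4)]  # absolute pose of each frame

    def add_relative_pose(self, T_rel):
        # Chain the new relative motion onto the last absolute pose.
        self.poses.append(self.poses[-1] @ T_rel)

    def optimize(self):
        # Placeholder for local bundle adjustment / global pose-graph
        # optimization yielding a globally consistent trajectory.
        return self.poses
```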

[0191] Existing VIO frameworks include OKVIS, the ORB_SLAM2-based monocular visual-inertial fusion positioning system, and VINS. The proposed method is compared against the visual-inertial system based on ORB_SLAM2 (hereinafter referred to as origin_VIO), using nine data sequences of the EuRoC dataset for testing. This dataset contains the dynamic motion of a UAV equipped with a VI-Sensor stereo visual-inertial camera in different rooms and industrial environments. The image acquis...
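For reference, accuracy on EuRoC is commonly reported as the root-mean-square absolute trajectory error (ATE) between estimated and ground-truth positions. The sketch below is a generic evaluation helper, not taken from the patent; it assumes the two trajectories are already time-aligned and expressed in a common frame.

```python
import numpy as np

def ate_rmse(estimated, ground_truth):
    """RMS absolute trajectory error over time-aligned (N, 3)
    position arrays expressed in the same coordinate frame."""
    err = estimated - ground_truth
    return float(np.sqrt(np.mean(np.sum(err**2, axis=1))))

# Hypothetical usage with aligned (N, 3) position arrays:
# rmse = ate_rmse(est_positions, gt_positions)
```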



Abstract

The invention discloses a monocular robust visual-inertial tightly coupled localization method comprising the steps of: acquiring visual data with a camera and inertial data with an IMU (Inertial Measurement Unit); carrying out IMU pre-integration to obtain an IMU prior value; substituting the IMU prior value into a visual-inertial joint initialization model to complete the initialization of the parameters; and, during the time required to initialize the parameters, calculating motion information from the transformation matrices between consecutive key frames and feeding it into the back-end module of the visual-inertial fused localization framework to implement tightly coupled localization. After the parameters are initialized, they are substituted into the visual-inertial fused localization framework, and the motion information is calculated and fed into the back-end module to implement tightly coupled localization. With the disclosed method, the initialization time can be shortened to within 10 seconds, and localization accuracy can be improved by about 30% compared with a conventional ORB_SLAM2-based monocular visual-inertial localization system.
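The IMU pre-integration step in the abstract accumulates gyroscope and accelerometer samples between two key frames into relative rotation, velocity, and position increments that serve as the IMU prior. Below is a minimal Euler-integration sketch of that idea, ignoring the bias and noise terms that the full model estimates; the variable names are ours, not the patent's.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(phi):
    """Rodrigues formula: rotation matrix from a rotation vector."""
    theta = np.linalg.norm(phi)
    if theta < 1e-8:
        return np.eye(3) + skew(phi)
    K = skew(phi / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K

def preintegrate(gyro, accel, dt):
    """Accumulate IMU samples into pre-integrated increments between
    two key frames: dR (rotation), dv (velocity), dp (position).
    gyro, accel: (N, 3) arrays in rad/s and m/s^2; dt: sample period."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt**2
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp
```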

Description

Technical field

[0001] The invention relates to a monocular robust visual-inertial tightly coupled positioning method, belonging to the field of SLAM (Simultaneous Localization and Mapping).

Background technique

[0002] With the rapid development of technologies such as autonomous flight of micro air vehicles, automatic driving, virtual reality and augmented reality, high-precision and robust positioning is an important prerequisite for a mobile agent to complete autonomous navigation tasks and to explore unknown areas. Fusing a visual sensor with an inertial measurement unit (IMU) yields a visual-inertial fusion positioning system (VIO) with higher accuracy and stronger robustness.

[0003] The traditional visual-inertial fusion positioning framework includes two modules, the front-end and the back-end. The front-end estimates the camera motion between adjacent images through the IMU a...
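The motion information computed from transformation matrices between consecutive key frames (mentioned in the abstract above) reduces, in the usual homogeneous-transform convention, to composing an inverse and a forward pose. This is a generic identity, not the patent's specific formulation:

```python
import numpy as np

def relative_transform(T_w_k, T_w_k1):
    """Relative motion between consecutive key frames k and k+1,
    where each argument is a 4x4 homogeneous transform mapping the
    key frame to the world frame: T_k_k1 = inv(T_w_k) @ T_w_k1."""
    return np.linalg.inv(T_w_k) @ T_w_k1
```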


Application Information

IPC(8): G01C21/16
CPC: G01C21/165
Inventor: 潘树国, 盛超, 曾攀, 黄砺枭, 王帅, 赵涛, 高旺
Owner: SOUTHEAST UNIV