Whole-process pose estimation method based on global map and multi-sensor information fusion

A multi-sensor pose estimation technology applied in the field of navigation. It addresses the problems of inaccurate results, unreliable result accuracy, and the sensitivity of filtering algorithms to time synchronization, and achieves high precision and robustness.

Active Publication Date: 2021-09-07
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

However, filtering algorithms require a relatively accurate initial value prediction and are sensitive to time synchronization: a late measurement can make the whole result inaccurate. Optimization-based methods, by contrast, obtain more effective and accurate results.
In 2018, the Hong Kong University of Science and Technology proposed a multi-sensor fusion framework that fuses the output of VINS-Mono (a robust and versatile monocular visual-inertial state estimator) with the outputs of sensors such as GPS and a magnetometer, and obtains the pose estimation result of the unmanned system by graph optimization. However, this method does not consider the alignment between the global coordinate system and the local coordinate system throughout the whole process, and when some sensors fail, the accuracy of the result is not reliable enough.
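The contrast between filtering and optimization-based fusion can be made concrete with a toy example. The sketch below is not from the patent; the poses, measurements, and weights are all hypothetical. It fuses relative odometry factors with sparse absolute GPS-like factors by batch nonlinear least squares, the mechanism that lets graph optimization re-linearize all measurements jointly rather than depending on a filter's initial value and strict time ordering:

```python
# Toy sketch of graph-optimization fusion (hypothetical data, not the
# patent's method): relative odometry factors + sparse absolute factors
# solved jointly by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

# 1-D trajectory with 5 poses: odometry measures +1.0 between neighbors,
# GPS-like sensor observes poses 0 and 4 directly (slightly inconsistent).
odom = [(i, i + 1, 1.0) for i in range(4)]   # (from, to, measured delta)
gps = [(0, 0.0), (4, 4.3)]                   # (pose index, absolute reading)

def residuals(x):
    r = []
    for i, j, d in odom:                     # odometry factors, weight 1.0
        r.append((x[j] - x[i]) - d)
    for i, z in gps:                         # absolute factors, weight 0.5
        r.append(0.5 * (x[i] - z))
    return r

x0 = np.zeros(5)                             # rough initial guess is enough
sol = least_squares(residuals, x0)
print(sol.x)                                 # poses pulled toward the GPS readings
```

The solver distributes the GPS/odometry disagreement over the whole trajectory instead of committing to each measurement sequentially, which is why a late or out-of-order measurement does not corrupt the estimate the way it can in a filter.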



Examples


Embodiment Construction

[0058] The present invention proposes a whole-process pose estimation method based on global map and multi-sensor information fusion. The technical solution of the present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0059] The present invention proposes a whole-process pose estimation method based on global map and multi-sensor information fusion. The overall process, as shown in Figure 1, includes the following steps:

[0060] 1) Build a UAV system including various sensors. The specific method is: select a UAV (a conventional model can be used) and install various sensors on it, including a VIO (visual-inertial odometry) system, GPS, a magnetometer, and a barometer. The VIO system consists of a camera and an IMU (inertial measurement unit); the camera lens points in the forward direction of the drone, and the VIO system, GPS, magnetometer, and barometer can be integ...
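The excerpt does not give the magnetometer and barometer measurement models. The sketch below uses two standard textbook conversions to show how these sensors yield the global heading and altitude measurements needed during initialization; the heading sign convention and the international-standard-atmosphere constants are assumptions, not details from the patent:

```python
# Standard sensor conversions (assumptions, not the patent's models):
# magnetometer -> heading, barometer -> altitude.
import math

def heading_from_magnetometer(mx, my):
    """Yaw (rad) from the horizontal magnetic field components.

    Sign/axis convention (ENU, x toward magnetic north) is an assumption;
    a real system must also apply declination and tilt compensation.
    """
    return math.atan2(-my, mx)

def altitude_from_pressure(p_pa, p0_pa=101325.0):
    """Altitude (m) from static pressure via the international standard
    atmosphere model; p0_pa is the assumed sea-level reference pressure."""
    return 44330.0 * (1.0 - (p_pa / p0_pa) ** (1.0 / 5.255))

print(heading_from_magnetometer(1.0, 0.0))   # field aligned with north
print(altitude_from_pressure(89875.0))       # roughly 1 km above sea level
```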



Abstract

The invention proposes a whole-process pose estimation method based on global map and multi-sensor information fusion, which relates to the field of navigation. The method first builds a UAV system including various sensors; calibrates the sensors, obtains the corresponding parameters of each sensor, and initializes the UAV system; uses each sensor to obtain measurement information about the current position of the carrier UAV, and uses the image information of the visual-inertial odometry (VIO) system to construct and maintain a local map; builds a multi-sensor information fusion framework based on the factor graph, and uses factor graph optimization to obtain the optimal state variables of the UAV system corresponding to each current frame of the VIO system; and updates the conversion relationship between the local coordinate system and the global coordinate system in the current frame, converting the local map into a global map. The present invention can use a global optimization method to fuse the measurements of all the sensors carried by the UAV with the global map information, so as to improve the accuracy and reliability of the pose estimation of the UAV system.
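The abstract's update of the local-to-global conversion relationship can be illustrated with the standard closed-form SVD (Horn/Umeyama) alignment between a local VIO trajectory and matched global positions. The function name and the 2-D example data below are hypothetical, and the patent may compute this transform differently (e.g. inside its factor graph):

```python
# Sketch (assumption: not necessarily the patent's estimator): rigid
# alignment of local VIO positions to global positions by SVD (Kabsch).
import numpy as np

def align_local_to_global(local_pts, global_pts):
    """Return rotation R and translation t with global ~= R @ local + t."""
    L, G = np.asarray(local_pts, float), np.asarray(global_pts, float)
    mu_l, mu_g = L.mean(axis=0), G.mean(axis=0)
    H = (L - mu_l).T @ (G - mu_g)            # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    # Force a proper rotation (det = +1), guarding against reflections.
    D = np.diag([1.0] * (H.shape[0] - 1) + [np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_g - R @ mu_l
    return R, t

# Hypothetical 2-D example: local frame rotated 90 deg CCW and shifted.
local = [[0, 0], [1, 0], [1, 1]]
glob = [[2, 3], [2, 4], [1, 4]]
R, t = align_local_to_global(local, glob)
```

Once `R` and `t` are re-estimated for the current frame, every point of the local map can be mapped into the global frame, which is how the local map is "converted into a global map."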

Description

Technical field

[0001] The invention relates to the field of navigation, in particular to a whole-process pose estimation method based on global map and multi-sensor information fusion.

Background technique

[0002] In the fields of autonomous driving, search and rescue, and reconnaissance, the demand for unmanned systems is increasing, and positioning is the basis for them. At present, there are many technologies for local pose estimation of unmanned systems using sensors such as cameras and lidars, as well as integrated navigation systems combined with an IMU (inertial measurement unit), which can achieve accurate pose estimation of unmanned systems in local areas. For example, LSD-SLAM (a large-scale monocular real-time localization and mapping method based on the direct method), proposed by the Technical University of Munich in 2014, uses the direct method to realize pose determination and map construction in a large-sc...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/70, G06T7/80
CPC: G06T7/70, G06T7/80
Inventors: 孟子阳, 郝运
Owner: TSINGHUA UNIV