
Whole-course pose estimation method based on global map and multi-sensor information fusion

A global-map and multi-sensor fusion technique in the field of navigation, addressing problems such as the sensitivity of filtering algorithms to time synchronization and unreliable, inaccurate estimation results.

Active Publication Date: 2020-01-17
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

However, filtering algorithms require a reasonably accurate initial state estimate and are sensitive to time synchronization: a measurement that arrives late can corrupt the entire result, whereas optimization-based methods yield more effective and accurate results.
In 2018, the Hong Kong University of Science and Technology proposed a multi-sensor fusion framework that fuses the output of VINS-Mono (a robust and general-purpose monocular visual-inertial state estimator) with the outputs of sensors such as GPS and a magnetometer, and obtains the pose estimate of the unmanned system by graph optimization. However, this method does not address the alignment between the global coordinate system and the local coordinate system over the whole course, and when some sensors fail the accuracy of its result is not reliable enough.
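To make the local/global alignment issue concrete, the following is a minimal sketch (not the patent's method) of estimating a 4-DoF transform, yaw plus translation, that maps positions from a drifting local VIO frame into a global ENU frame, assuming matched VIO/GPS position pairs are available; the function name and the 4-DoF simplification are illustrative assumptions.

import numpy as np

def align_local_to_global(p_local, p_global):
    """Estimate a yaw rotation R and translation t so that R @ p_local + t ~= p_global.

    p_local, p_global: (N, 3) arrays of matched VIO and GPS positions (ENU).
    """
    p_local = np.asarray(p_local, dtype=float)
    p_global = np.asarray(p_global, dtype=float)
    mu_l, mu_g = p_local.mean(axis=0), p_global.mean(axis=0)
    dl, dg = p_local - mu_l, p_global - mu_g
    # Solve for yaw only from the horizontal components: roll and pitch are
    # taken as already consistent because gravity makes them observable in VIO.
    num = np.sum(dl[:, 0] * dg[:, 1] - dl[:, 1] * dg[:, 0])
    den = np.sum(dl[:, 0] * dg[:, 0] + dl[:, 1] * dg[:, 1])
    yaw = np.arctan2(num, den)
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    t = mu_g - R @ mu_l
    return R, t

Restricting the rotation to yaw reflects the common assumption that only heading and translation remain ambiguous between a visual-inertial local frame and the global frame.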

Method used




Embodiment Construction

[0058] The present invention proposes a whole-course pose estimation method based on a global map and multi-sensor information fusion. The technical solution of the invention is further described below in conjunction with the accompanying drawings and specific embodiments.

[0059] The overall flow of the method is shown in Figure 1 and comprises the following steps:

[0060] 1) Build a UAV system carrying multiple sensors. Specifically, select a UAV (a conventional model can be used) and mount the following sensors on it: a VIO (visual-inertial odometry) system, GPS, a magnetometer, and a barometer. The VIO system comprises a camera and an IMU (inertial measurement unit); the camera lens points in the UAV's forward direction, and the VIO system, GPS, magnetometer, and barometer can be integ...
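As a reading aid only, here is a small sketch of containers for the measurements listed in this step (VIO pose, GPS fix, magnetometer, barometer); the field names and units are assumptions for illustration, not definitions from the patent.

from dataclasses import dataclass

@dataclass
class VioPose:
    t: float      # timestamp, s
    p: tuple      # position in the local VIO frame, m (x, y, z)
    q: tuple      # orientation quaternion (w, x, y, z)

@dataclass
class GpsFix:
    t: float
    lat: float    # degrees
    lon: float    # degrees
    alt: float    # meters
    acc: float    # horizontal accuracy estimate, m

@dataclass
class MagReading:
    t: float
    b: tuple      # magnetic field vector in the body frame, uT

@dataclass
class BaroReading:
    t: float
    pressure: float  # Pa; converted to relative altitude downstream

Keeping per-measurement timestamps allows each reading to be associated with the corresponding VIO frame during fusion.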



Abstract

The invention provides a whole-course pose estimation method based on a global map and multi-sensor information fusion, and relates to the field of navigation. The method comprises: first, building an unmanned aerial vehicle (UAV) system comprising multiple sensors; calibrating the sensors to obtain the corresponding parameters of each sensor, and initializing the UAV system; acquiring measurement information of the current pose of the carrier UAV with each sensor, and constructing and maintaining a local map from the image information of the visual-inertial odometry (VIO) system; and constructing a factor-graph-based multi-sensor information fusion framework, optimizing the factor graph to obtain the optimal state variables of the UAV system for each current frame of the VIO system, updating the transformation between the local coordinate system and the global coordinate system at the current frame, and converting the local map into the global map. By fusing the measurements of all sensors carried by the UAV with the global map information in a global optimization, the accuracy and reliability of the pose estimation of the UAV system are enhanced.
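The factor-graph fusion described in the abstract can be illustrated with a deliberately simplified, position-only sketch: each frame's global position is a state variable, VIO contributes relative-motion factors, GPS contributes absolute factors, and all residuals are minimized jointly. The weights, the omission of orientation states, and the helper name fuse are illustrative assumptions; this is not the patent's optimizer.

import numpy as np
from scipy.optimize import least_squares

def fuse(vio_deltas, gps_fixes, w_vio=10.0, w_gps=1.0):
    """vio_deltas: list of (i, j, dp) relative translations from frame i to frame j.
    gps_fixes:  list of (i, z) absolute positions of frame i in the global frame.
    Returns an (N, 3) array of jointly optimized per-frame positions."""
    n = max(max(j for _, j, _ in vio_deltas), max(i for i, _ in gps_fixes)) + 1

    def residuals(x):
        p = x.reshape(n, 3)
        r = []
        for i, j, dp in vio_deltas:          # odometry (local) factors
            r.append(w_vio * (p[j] - p[i] - np.asarray(dp)))
        for i, z in gps_fixes:               # absolute (global) factors
            r.append(w_gps * (p[i] - np.asarray(z)))
        return np.concatenate(r)

    sol = least_squares(residuals, np.zeros(3 * n))
    return sol.x.reshape(n, 3)

In the full method the states would also include orientation, the local-to-global transform would be refreshed from the optimized states after each solve, and magnetometer and barometer measurements would enter as additional factors in the same way.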

Description

technical field
[0001] The invention relates to the field of navigation, and in particular to a whole-course pose estimation method based on a global map and multi-sensor information fusion.
Background technique
[0002] In fields such as autonomous driving, search and rescue, and reconnaissance, the demand for unmanned systems is growing, and positioning is the foundation for such systems. At present there are many technologies that perform local pose estimation of unmanned systems with sensors such as cameras and lidar, as well as integrated navigation systems that combine them with an IMU (inertial measurement unit), which can achieve accurate pose estimation of an unmanned system within a local area. For example, LSD-SLAM (a large-scale monocular real-time localization and mapping method based on the direct method), proposed by the Technical University of Munich in 2014, realizes pose determination and map construction in a large-sc...


Application Information

IPC(8): G06T7/70, G06T7/80
CPC: G06T7/70, G06T7/80
Inventor: 孟子阳, 郝运
Owner: TSINGHUA UNIV