
Pose estimation method based on RGB-D and IMU information fusion

A pose estimation and residual technology, applied in the field of pose estimation based on RGB-D and IMU information fusion

Active Publication Date: 2019-07-09
NORTHEASTERN UNIV

AI Technical Summary

Problems solved by technology

Since this initialization method requires a certain period of time for the system scale to converge, it poses problems for real-time applications such as the positioning and navigation of drones.



Examples


Embodiment 1

[0111] The overall framework of a pose estimation method based on RGB-D and IMU information fusion is shown in Figure 1.

[0112] The pose estimation algorithm of the RGB-D and IMU information fusion pose estimation system (hereinafter referred to as the system) can be divided into four parts: data preprocessing, visual-inertial initialization, back-end optimization, and loop detection. The four parts are designed as independent modules, each of which can be improved according to requirements.
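The four-module decomposition above can be sketched as a pipeline of replaceable stages. This is a minimal illustrative skeleton, not the patent's implementation; all names (`PoseEstimator`, `step`, the stage callables) are hypothetical:

```python
class PoseEstimator:
    """Sketch of the four-module pipeline: preprocessing,
    visual-inertial initialization, back-end optimization, and loop
    detection. Each stage is an independent callable, so any one of
    them can be swapped out or improved separately."""

    def __init__(self, preprocess, initialize, optimize, detect_loop):
        self.preprocess = preprocess      # images + IMU -> features, increments
        self.initialize = initialize      # one-time visual-inertial init
        self.optimize = optimize          # back-end least-squares solve
        self.detect_loop = detect_loop    # global consistency via loop closure
        self.initialized = False

    def step(self, gray, depth, imu_samples):
        features, imu_delta = self.preprocess(gray, depth, imu_samples)
        if not self.initialized:
            # until initialization succeeds, no pose is produced
            self.initialized = self.initialize(features, imu_delta)
            return None
        pose = self.optimize(features, imu_delta)
        return self.detect_loop(pose)


# tiny demo with stub stages standing in for the real modules
est = PoseEstimator(
    preprocess=lambda g, d, imu: (["feat"], {"dv": 0.0}),
    initialize=lambda f, i: True,
    optimize=lambda f, i: (0.0, 0.0, 0.0),
    detect_loop=lambda pose: pose,
)
est.step(None, None, [])            # first call only initializes
pose = est.step(None, None, [])     # later calls return a pose estimate
```

The stub stages only demonstrate the data flow; in a real system each callable would wrap the corresponding module described in the embodiment.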

[0113] Data preprocessing part: processes the grayscale and depth images collected by the RGB-D camera, as well as the acceleration and angular velocity collected by the IMU. The inputs of this part are the grayscale image, the depth image, and the IMU acceleration and angular velocity; the outputs are the matched feature points between adjacent frames and the IMU state increments.
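The "IMU state increment" output mentioned above is typically obtained by integrating the inertial measurements between two camera frames. The following is a simplified planar toy (gravity compensation and bias estimation omitted) under assumed fixed-rate samples, not the patent's actual preintegration:

```python
import math

def preintegrate_imu(samples, dt):
    """Toy IMU state increment between two camera frames.

    `samples` is a list of (ax, ay, wz) body-frame measurements taken
    at a fixed period `dt`. Angular velocity is integrated into a yaw
    increment, and acceleration (rotated into the frame at the start
    of the interval) into velocity and position increments.
    Gravity and sensor biases are deliberately ignored in this sketch.
    """
    d_theta = 0.0          # orientation (yaw) increment
    dvx = dvy = 0.0        # velocity increment in the starting frame
    dpx = dpy = 0.0        # position increment in the starting frame
    for ax, ay, wz in samples:
        c, s = math.cos(d_theta), math.sin(d_theta)
        # rotate body-frame acceleration into the starting frame
        axw = c * ax - s * ay
        ayw = s * ax + c * ay
        # constant-acceleration update over one sample period
        dpx += dvx * dt + 0.5 * axw * dt * dt
        dpy += dvy * dt + 0.5 * ayw * dt * dt
        dvx += axw * dt
        dvy += ayw * dt
        d_theta += wz * dt
    return d_theta, (dvx, dvy), (dpx, dpy)


# demo: 1 m/s^2 forward acceleration for 1 s, no rotation
samples = [(1.0, 0.0, 0.0)] * 10
d_theta, (dvx, dvy), (dpx, dpy) = preintegrate_imu(samples, 0.1)
```

With constant forward acceleration and no rotation, the velocity increment comes out to 1.0 m/s and the position increment to 0.5 m, matching the closed-form kinematics.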

[0114] Specifically: since the camera and the IMU have two clock sources,...
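The truncated paragraph above concerns the two independent clock sources of the camera and the IMU. One common way to handle this (an assumption here, not necessarily the patent's method) is to shift camera timestamps by a known constant offset into the IMU time base and then interpolate the IMU signal at each camera timestamp:

```python
import bisect

def align_imu_to_camera(imu_ts, imu_vals, cam_ts, clock_offset=0.0):
    """Align an IMU signal to camera timestamps.

    `imu_ts` (sorted) and `imu_vals` are the IMU timestamps and sample
    values; `cam_ts` are camera timestamps; `clock_offset` is the
    assumed-known constant offset between the two clocks. Each camera
    timestamp is moved into the IMU time base and the IMU value is
    linearly interpolated there (clamped at the ends).
    """
    out = []
    for t in cam_ts:
        t = t + clock_offset                    # into the IMU time base
        i = bisect.bisect_left(imu_ts, t)
        if i == 0:
            out.append(imu_vals[0])             # before first IMU sample
        elif i == len(imu_ts):
            out.append(imu_vals[-1])            # after last IMU sample
        else:
            t0, t1 = imu_ts[i - 1], imu_ts[i]
            a = (t - t0) / (t1 - t0)
            out.append((1 - a) * imu_vals[i - 1] + a * imu_vals[i])
    return out


# demo: IMU at 1 Hz, camera frames halfway between IMU samples
aligned = align_imu_to_camera([0.0, 1.0, 2.0], [0.0, 10.0, 20.0],
                              [0.5, 1.5])
```

Real systems often also estimate the time offset online rather than assuming it; this sketch only shows the interpolation step.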



Abstract

The invention provides a pose estimation method based on RGB-D and IMU information fusion. The method comprises the following steps. S1: after time synchronization of the RGB-D camera data and the IMU data, the grayscale image and depth image acquired by the RGB-D camera and the acceleration and angular velocity collected by the IMU are preprocessed to obtain the matched feature points between adjacent frames in the world coordinate system and the IMU state increments. S2: the visual-inertial part of the system is initialized according to the extrinsic parameters of the pose estimation system. S3: according to the initialized visual-inertial information, the matched feature points between adjacent frames in the global coordinate system, and the IMU state increments, a least-squares optimization function of the system is constructed; an optimization method is used to iteratively solve for the optimal solution of the least-squares function, and the optimal solution is taken as the pose estimation state quantity. Loop detection is then performed to obtain a globally consistent pose estimation state. As a result, feature point depth estimation is more accurate and the positioning precision of the system is improved.
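The iterative least-squares solve in step S3 can be illustrated on a toy problem. The sketch below uses Gauss-Newton (one common choice for such optimizations; the patent does not specify the solver here) to recover a single rotation angle from matched 2-D points, standing in for the full pose state:

```python
import math

def gauss_newton_angle(points, targets, theta0=0.0, iters=10):
    """Toy Gauss-Newton: estimate the rotation angle theta that maps
    2-D `points` onto `targets` by minimizing the sum of squared
    residuals r_i = R(theta) p_i - q_i, a stand-in for the system's
    least-squares pose optimization."""
    theta = theta0
    for _ in range(iters):
        c, s = math.cos(theta), math.sin(theta)
        JtJ, Jtr = 0.0, 0.0
        for (px, py), (qx, qy) in zip(points, targets):
            # residual of the rotated point against its target
            rx = c * px - s * py - qx
            ry = s * px + c * py - qy
            # Jacobian of the residual w.r.t. theta
            jx = -s * px - c * py
            jy = c * px - s * py
            JtJ += jx * jx + jy * jy
            Jtr += jx * rx + jy * ry
        theta -= Jtr / JtJ    # Gauss-Newton update step
    return theta


# demo: generate targets by rotating points through 0.3 rad, then recover it
pts = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
ct, st = math.cos(0.3), math.sin(0.3)
tgt = [(ct * x - st * y, st * x + ct * y) for x, y in pts]
est = gauss_newton_angle(pts, tgt)
```

In the patent's setting the state is of course much larger (poses, velocities, biases, feature depths) and the normal equations are matrix-valued, but the iterate-linearize-solve-update loop is the same pattern.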

Description

Technical field

[0001] The invention relates to multi-sensor fusion technology, in particular to a pose estimation method based on RGB-D and IMU information fusion.

Background technique

[0002] Pose estimation based on multi-sensor information fusion refers to combining data acquired by different sensors over similar time periods and using suitable algorithms to exploit their complementary strengths, thereby obtaining more credible results. Because cameras are inexpensive and provide rich information, while inertial measurement units integrate accurately over short time spans, the fusion of cameras and inertial measurement units has gradually become a research hotspot.

[0003] Current pose estimation techniques for fusing camera and inertial measurement unit data are mainly divided into two categories: filter-based methods and optimization-based methods. According to whether the image feature information i...

Claims


Application Information

IPC(8): G06K 9/00; G06K 9/46; G06K 9/62; G06T 7/246; G06T 7/269; G06T 7/73; G01C 21/16; G01C 21/18
CPC: G06T 7/246; G06T 7/269; G06T 7/73; G01C 21/18; G01C 21/165; G06T 2207/10024; G06T 2207/10028; G06V 20/10; G06V 10/44; G06F 18/251; Y02T 10/40
Inventor: 张磊 (Zhang Lei), 张华希 (Zhang Huaxi), 罗小川 (Luo Xiaochuan), 郑国贤 (Zheng Guoxian)
Owner NORTHEASTERN UNIV