Robot pose estimation method and system based on multi-sensor feature fusion

A feature fusion and pose estimation technology applied in the field of robotics, which solves problems such as low accuracy, complex calculation, and low efficiency, and achieves accurate pose estimation.

Pending Publication Date: 2022-01-11
SHENZHEN POWER SUPPLY BUREAU


Problems solved by technology

[0007] The technical problem to be solved by the embodiments of the present invention is to provide a robot pose estimation method and system based on multi-sensor feature fusion, which can solve the low accuracy of pose estimation with a single sensor and the complex calculation and low efficiency of traditional multi-sensor fusion algorithms.




Embodiment Construction

[0029] To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings.

[0030] As shown in Figure 1, an embodiment of the present invention proposes a robot pose estimation method based on multi-sensor feature fusion, which includes the following steps:

[0031] Step S1: acquire two consecutive frames of RGB images captured while the robot is in motion, together with the inertial sensor information recorded between the two frames;

[0032] Step S2: after preprocessing the two consecutive frames of RGB images, cascade (concatenate) them and input them into a predefined image feature extraction network to obtain image features; input the inertial sensor information between the two frames into a predefined inertial feature extraction network to obtain inertial features;

[0033] Step...
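The data flow of steps S1–S2 can be sketched as follows. This is a minimal, hypothetical illustration: the patent's actual extractors are trained neural networks whose architectures are not given here, so simple hand-written projections stand in for them, and the feature dimensions are arbitrary.

```python
# Illustrative sketch of the multi-sensor feature pipeline (steps S1-S2).
# The real method uses predefined neural networks; the hand-written
# projections below are stand-ins that only demonstrate the data flow.

def extract_image_features(frame_t, frame_t1, dim=8):
    """Cascade (concatenate) two consecutive RGB frames and project them
    to a fixed-size feature vector (stand-in for the image network)."""
    cascaded = frame_t + frame_t1            # row-wise concatenation
    flat = [v for row in cascaded for v in row]
    step = max(1, len(flat) // dim)
    return [sum(flat[i:i + step]) / step for i in range(0, step * dim, step)]

def extract_inertial_features(imu_samples, dim=4):
    """Summarize the IMU readings recorded between the two frames
    (stand-in for the inertial feature extraction network)."""
    cols = list(zip(*imu_samples))           # one column per IMU axis
    means = [sum(c) / len(c) for c in cols]
    return (means + [0.0] * dim)[:dim]       # pad/trim to a fixed size

def fuse(img_feat, imu_feat):
    """Concatenate both feature vectors: this cascaded vector is what
    would be fed into the multi-sensor fusion network."""
    return img_feat + imu_feat

# Toy inputs: two 4x4 grayscale "frames" and five 6-axis IMU samples.
f0 = [[float(i + j) for j in range(4)] for i in range(4)]
f1 = [[float(i * j) for j in range(4)] for i in range(4)]
imu = [(0.1, 0.2, 0.3, 9.8, 0.0, 0.0) for _ in range(5)]

fused = fuse(extract_image_features(f0, f1), extract_inertial_features(imu))
```

The fused vector here has length 12 (8 image dimensions plus 4 inertial dimensions); in the actual method the fusion network would map such a vector to the pose estimate.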



Abstract

The invention provides a robot pose estimation method based on multi-sensor feature fusion. The method comprises the steps of: obtaining two consecutive frames of RGB images captured while a robot moves, and the inertial sensor information between the two frames; preprocessing the two frames, cascading them, and inputting them into a predefined image feature extraction network to obtain image features, while inputting the inertial sensor information between the two frames into a predefined inertial feature extraction network to obtain inertial features; and cascading the image features and inertial features, inputting them into a predefined multi-sensor fusion neural network for feature fusion to obtain fused features, and obtaining a multi-degree-of-freedom pose estimate of the robot from the fused features. By implementing the method, the low accuracy of pose estimation with a single sensor and the complex calculation and low efficiency of traditional multi-sensor fusion algorithms can be overcome.
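The pose estimate described above is a relative multi-degree-of-freedom pose between two consecutive frames; in an odometry setting such per-frame-pair estimates are chained to track the robot over time. The sketch below illustrates that chaining in 2-D (3 DoF) for brevity; the patent itself targets a higher-dimensional pose, and the motion values are invented for illustration.

```python
import math

def compose(pose, delta):
    """Chain a relative pose estimate (dx, dy, dtheta), expressed in the
    robot's body frame, onto a global pose (x, y, theta)."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

# Accumulate per-frame-pair estimates into a trajectory:
# four moves of 1 m forward with a 90-degree left turn trace a square.
pose = (0.0, 0.0, 0.0)
for delta in [(1.0, 0.0, math.pi / 2)] * 4:
    pose = compose(pose, delta)
```

After the four moves the robot returns to the origin with a full-turn heading, which is why consistent relative estimates are what make the fused odometry useful as a SLAM front end.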

Description

Technical field
[0001] The present invention relates to the fields of robot technology, computer image processing, and visual-inertial odometry, and in particular to a robot pose estimation method and system based on multi-sensor feature fusion.
Background technique
[0002] Simultaneous localization and mapping (SLAM) has developed rapidly in recent years and has received increasing attention and application in scenarios such as scientific research, industrial production, and daily life.
[0003] As the front end of SLAM, odometry can be used to estimate the pose of the robot. An excellent odometry technique therefore provides high-quality initial values for the SLAM back end and for global map construction, so that the robot can perform various tasks accurately and autonomously in complex unknown environments. Typical odometry solutions mainly utilize vision sensors to recover the robot's motion pose from a series of image stre...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06T7/246; G06T7/73; G06V10/774; G06V10/80; G06V10/82; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/246; G06T7/73; G06N3/08; G06T2207/20081; G06T2207/20084; G06T2207/30244; G06N3/045; G06F18/214; G06F18/253
Inventors: 徐曙, 陈潇, 张成巍, 王成皓
Owner: SHENZHEN POWER SUPPLY BUREAU