
Mobile robot pose estimation method and system based on multi-sensor tight coupling

A mobile robot pose estimation method and system based on multi-sensor tight coupling, classified under instruments, computing, and computer parts. It addresses the low positioning accuracy and poor robustness of existing methods, with the effect of improving both accuracy and robustness.

Active Publication Date: 2021-09-24
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0005] In view of the above defects and improvement needs of the prior art, the present invention provides a mobile robot pose estimation method and system based on multi-sensor tight coupling, thereby solving the technical problems of low precision and poor robustness that existing robot positioning technology exhibits under fast motion and in varied complex environments.

Method used


Embodiment Construction

[0050] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict.

[0051] A mobile robot pose estimation method based on multi-sensor tight coupling, where the sensors include a camera, an IMU, and a lidar. The method includes:

[0052] Perform feature matching between the current RGB-D frame collected by the camera and the previous RGB-D frame, and calculate the visual reprojection error of the matched features;
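The reprojection error in this step can be sketched as follows. This is an illustrative sketch of the standard pinhole-camera reprojection residual, not the patent's actual implementation; the intrinsic matrix `K`, the pose `(R, t)`, and the helper name `reprojection_error` are all assumptions for the example.

```python
import numpy as np

def reprojection_error(K, R, t, p_world, uv_observed):
    """Reproject a 3D point into the current frame and compare it with
    the matched 2D feature location.

    K           -- 3x3 camera intrinsic matrix (from calibration)
    R, t        -- rotation (3x3) and translation (3,) of the estimated pose
    p_world     -- 3D point recovered from the previous RGB-D frame
    uv_observed -- matched pixel coordinates in the current frame
    """
    p_cam = R @ p_world + t          # transform the point into the camera frame
    uvw = K @ p_cam                  # project with the pinhole model
    uv_projected = uvw[:2] / uvw[2]  # normalize homogeneous coordinates
    return uv_observed - uv_projected

# With an identity pose, the projected pixel matches the observation exactly,
# so the residual is zero.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
p = np.array([0.2, -0.1, 2.0])
uv = (K @ p)[:2] / p[2]
err = reprojection_error(K, np.eye(3), np.zeros(3), p, uv)
```

In a tightly coupled system, one such residual per matched feature enters the joint optimization alongside the IMU and lidar terms.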

[0053] Integrate the data measured by the IMU and construct an IMU integration residual;
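The idea behind the IMU residual can be sketched as follows. This is a simplified Euler-integration sketch (biases and rotation integration omitted for brevity), not the patent's preintegration scheme; the function names and the assumption that accelerations are already expressed in the world frame are illustrative.

```python
import numpy as np

def integrate_imu(accels, dt, v0=np.zeros(3), g=np.array([0.0, 0.0, -9.81])):
    """Euler-integrate accelerometer samples (assumed already rotated into
    the world frame) to predict velocity and position change between frames."""
    v, p = v0.copy(), np.zeros(3)
    for a in accels:
        v = v + (a + g) * dt  # gravity-compensated acceleration
        p = p + v * dt
    return p, v

def imu_residual(p_pred, v_pred, p_est, v_est):
    """Residual between the IMU-predicted motion and the motion implied by
    the estimated poses; this term enters the joint optimization."""
    return np.concatenate([p_est - p_pred, v_est - v_pred])

# A stationary robot: the accelerometer reads only the reaction to gravity,
# so the predicted motion and the residual against a zero pose change vanish.
samples = [np.array([0.0, 0.0, 9.81])] * 10
p_pred, v_pred = integrate_imu(samples, dt=0.01)
r = imu_residual(p_pred, v_pred, np.zeros(3), np.zeros(3))
```

Practical systems use on-manifold preintegration so the residual can be relinearized cheaply when bias estimates change; the sketch above only conveys the structure of the term.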


Abstract

The invention discloses a mobile robot pose estimation method and system based on multi-sensor tight coupling, belonging to the technical field of robot localization. The method comprises: performing feature matching between the current RGB-D frame collected by the camera and the previous RGB-D frame, and calculating the visual reprojection error of the matching; integrating the data measured by the IMU and constructing an IMU integration residual; extracting edge feature points and planar feature points from the point cloud collected by the lidar, calculating the distance from each edge feature point to its edge line and from each planar feature point to its plane, and constructing a lidar point cloud geometric residual; estimating the pose by jointly minimizing the visual reprojection error, the IMU integration residual, and the point cloud geometric residual to obtain a local pose; and updating the lidar point cloud map with the local pose and performing global optimization on the map to obtain a global pose. The invention improves the positioning accuracy and robustness of mobile robots under complex motion and in complex environments.
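The lidar geometric residuals described in the abstract rest on two standard distances, which can be sketched as follows. This is an illustrative sketch of the point-to-line and point-to-plane measures, not the patent's implementation; the function names and the choice of defining the line and plane by sampled map points are assumptions.

```python
import numpy as np

def point_to_line_distance(p, a, b):
    """Distance from an edge feature point p to the edge line through
    map points a and b."""
    return np.linalg.norm(np.cross(b - a, p - a)) / np.linalg.norm(b - a)

def point_to_plane_distance(p, a, b, c):
    """Distance from a planar feature point p to the plane through
    map points a, b, and c."""
    n = np.cross(b - a, c - a)      # plane normal
    n = n / np.linalg.norm(n)
    return abs(np.dot(p - a, n))

# An edge point one unit above the x-axis, and a planar point two units
# above the z = 0 plane.
d_edge = point_to_line_distance(np.array([0.0, 0.0, 1.0]),
                                np.array([0.0, 0.0, 0.0]),
                                np.array([1.0, 0.0, 0.0]))
d_plane = point_to_plane_distance(np.array([0.0, 0.0, 2.0]),
                                  np.array([0.0, 0.0, 0.0]),
                                  np.array([1.0, 0.0, 0.0]),
                                  np.array([0.0, 1.0, 0.0]))
```

Minimizing these distances over all extracted features, together with the visual and IMU terms, yields the tightly coupled local pose estimate.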

Description

technical field

[0001] The invention belongs to the technical field of robot positioning, and more particularly relates to a mobile robot pose estimation method and system based on multi-sensor tight coupling.

background technique

[0002] In the field of mobile robots, fusing one or more of visual sensors, IMUs, and lidars is the usual basis for practical SLAM algorithms. Visual SLAM methods that use a camera (monocular, stereo, or depth) as the sole sensor are strongly affected by ambient lighting and struggle to detect effective features in texture-sparse environments, leading to positioning failures. Laser SLAM methods that use a lidar (2D or 3D) as the sole sensor have a low measurement frequency, and the lidar suffers motion distortion that is difficult to correct when the motion speed changes sharply. The IMU sensor performs motion estimation by integrating acceleration and angular velocity, and can provide more accurate motion estimation...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/73; G06T5/00; G06T17/05; G06K9/46; G06K9/62; G06F17/18; G06F17/15
CPC: G06T7/73; G06T17/05; G06F17/18; G06F17/15; G06T2207/10028; G06T2207/10044; G06T5/80
Inventor: 彭刚, 陈博成, 周奕成, 彭嘉悉
Owner HUAZHONG UNIV OF SCI & TECH