Robot locating method adopting multi-sensor data fusion

A robot positioning and data fusion technology, applied to instruments, re-radiation of electromagnetic waves, measuring devices, etc. It addresses problems such as landmark features that are not sufficiently distinctive and mismatching in the scan-matching process, with the effects of reducing mismatched results, reducing the influence of accumulated errors, and eliminating inherent constraints of odometer-based localization.

Inactive Publication Date: 2016-11-16
HEFEI INSTITUTES OF PHYSICAL SCIENCE - CHINESE ACAD OF SCI +1


Problems solved by technology

However, the laser scanning matching method also has certain limitations. The scanned environment must contain mor



Embodiment Construction

[0021] The technical solution of this patent will be further described in detail below in conjunction with specific embodiments.

[0022] Referring to Figures 1-5, a robot positioning method based on multi-sensor data fusion proceeds in the following specific steps:

[0023] (1) According to the motion model of the mobile robot, calculate the robot position change (Δx_t, Δy_t), together with the yaw-angle change θ_t computed by the inertial measurement unit. From this pose change and the pose data at the previous moment, calculate the robot pose estimate at the current moment, P_odom(t) = [x(t), y(t), θ(t)]^T;
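Step (1) above can be sketched as a dead-reckoning pose update. This is a minimal illustration, not the patent's implementation; the function name, the assumption that (Δx_t, Δy_t) is expressed in the robot's body frame, and the angle-wrapping convention are all choices made here for clarity.

```python
import math

def odom_pose_update(prev_pose, dx, dy, dtheta):
    """Propagate the previous pose [x, y, theta] by the odometry
    displacement (dx, dy), assumed here to be in the robot body frame,
    and the IMU yaw change dtheta. Illustrative sketch only."""
    x, y, theta = prev_pose
    # Rotate the body-frame displacement into the world frame.
    x += dx * math.cos(theta) - dy * math.sin(theta)
    y += dx * math.sin(theta) + dy * math.cos(theta)
    # Accumulate yaw and wrap the result into [-pi, pi).
    theta = (theta + dtheta + math.pi) % (2 * math.pi) - math.pi
    return [x, y, theta]
```

Applied at each time step, this yields P_odom(t) = [x(t), y(t), θ(t)]^T from the previous pose and the latest increments.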

[0024] (2) Let q_0 = (Δx_t, Δy_t, θ_t). Given the lidar reference scan data S_{t-1} at time t-1 and the scan data S_t at time t, take q_0 as the initial estimate of the pose change and execute the laser point-cloud scan-matching algorithm, iteratively computing the pose transformation q_k; according to this result, calculate the corresponding robot pose estimate P ...
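The patent does not specify which scan-matching algorithm is used; a common choice is point-to-point ICP, sketched below as an illustrative stand-in. The brute-force nearest-neighbour search and the SVD (Kabsch) alignment are implementation choices made here, and the initial estimate q_0 from odometry and the IMU seeds the iteration, as step (2) describes.

```python
import numpy as np

def icp_2d(ref_scan, cur_scan, q0, iters=20):
    """Minimal 2D point-to-point ICP sketch. ref_scan, cur_scan:
    (N, 2) point arrays; q0 = (dx, dy, dtheta) is the initial
    pose-change estimate. Returns the refined (dx, dy, dtheta)."""
    dx, dy, th = q0
    R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    t = np.array([dx, dy])
    src = cur_scan @ R.T + t                      # apply initial guess
    for _ in range(iters):
        # Nearest-neighbour correspondences (brute force for clarity).
        d2 = ((src[:, None, :] - ref_scan[None, :, :]) ** 2).sum(-1)
        nn = ref_scan[d2.argmin(axis=1)]
        # Closed-form rigid alignment via SVD (Kabsch algorithm).
        mu_s, mu_r = src.mean(0), nn.mean(0)
        H = (src - mu_s).T @ (nn - mu_r)
        U, _, Vt = np.linalg.svd(H)
        dR = Vt.T @ U.T
        if np.linalg.det(dR) < 0:                 # guard against reflection
            Vt[-1] *= -1
            dR = Vt.T @ U.T
        dt = mu_r - dR @ mu_s
        src = src @ dR.T + dt
        R, t = dR @ R, dR @ t + dt                # accumulate the transform
    return t[0], t[1], np.arctan2(R[1, 0], R[0, 0])
```

Because q_0 already places the scans close to alignment, the nearest-neighbour correspondences are mostly correct from the first iteration, which is exactly the speed/accuracy benefit the abstract attributes to seeding scan matching with odometry and IMU data.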



Abstract

The invention discloses a robot locating method adopting multi-sensor data fusion. Building on laser scanning, the combination of odometer displacement and the yaw angle from an inertial measurement unit is taken as the initial estimate of the pose change; the robot's current pose is then computed with a scan-matching method, and finally Kalman filtering fuses the two pose results from the odometer and from scan matching. The method effectively eliminates the inherent constraints of conventional odometer-based localization and reduces the influence of accumulated errors on the locating result; it also effectively reduces mismatched results in the matching process. Because the odometer's displacement data over a short time interval and the robot yaw angle provided by the inertial measurement unit serve as the initial estimate for the laser scan-matching process, the speed and accuracy of scan matching are greatly improved, yielding a better final matching result.
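The final fusion step described in the abstract can be illustrated with a single Kalman-style update that treats the odometry pose as the prediction and the scan-matching pose as the measurement. The 3x3 covariances and variable names below are assumptions for illustration; the patent does not give the filter's exact form.

```python
import numpy as np

def fuse_poses(p_odom, P_odom, p_scan, P_scan):
    """Fuse two pose estimates [x, y, theta] via a Kalman update.
    p_odom/P_odom: odometry pose and its 3x3 covariance (prediction);
    p_scan/P_scan: scan-matching pose and covariance (measurement).
    In real use the theta residual should also be angle-wrapped."""
    # Kalman gain: K = P_odom (P_odom + P_scan)^-1
    K = P_odom @ np.linalg.inv(P_odom + P_scan)
    p_fused = p_odom + K @ (p_scan - p_odom)
    P_fused = (np.eye(3) - K) @ P_odom
    return p_fused, P_fused
```

With equal covariances the fused pose is simply the mean of the two estimates; as scan matching becomes more reliable (smaller P_scan), the fused result leans toward it, which is how the fusion suppresses the odometer's accumulated error.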

Description

Technical field

[0001] The invention relates to the technical field of mobile robot positioning, and in particular to a robot positioning method based on multi-sensor data fusion.

Background technique

[0002] For an autonomous mobile robot, one of the core capabilities it needs is the ability to perceive its surroundings with sensors. When a robot performs autonomous navigation tasks, real-time localization within the environment has long been an active research problem. The traditional positioning method uses encoders mounted on the robot's wheels, combined with an odometer model, to calculate the robot's current pose in real time. This method is simple and easy to apply, but it suffers from inherent cumulative error: wheel slip as the robot moves over the ground is inevitable, and errors introduced during system design and manufacturing also contribute. These errors accumulate continuously as the robot travels, so the method has corresponding limitations. [0003...

Claims


Application Information

IPC(8): G01C21/00, G01C21/16, G01S17/87
CPC: G01C21/005, G01C21/165, G01S17/875
Inventor 赵江海徐群山方世辉方健何锋黄海卫
Owner HEFEI INSTITUTES OF PHYSICAL SCIENCE - CHINESE ACAD OF SCI