
Mobile robot positioning method and device based on particle filter and vision assistance

A mobile robot positioning technology based on particle filtering, applied in the field of robotics and navigation. It addresses problems of the prior art such as low efficiency, the large size of the global map, and the long failure-recovery time caused by random particle distribution, and achieves the effects of reducing recovery time, improving efficiency, and increasing accuracy.

Publication Date: 2020-12-18 (status: Inactive)
SICHUAN CHANGHONG ELECTRIC CO LTD
Cites: 7 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0006] In order to solve the problems existing in the prior art, the object of the present invention is to provide a mobile robot positioning method and device based on particle filters and visual aids, which provide global positioning information based on visual sensors, and are used to solve the existing problems in the current technology. Due to factors such as large maps and difficulty in accurately distributing random particles in the real position of the robot, failure recovery takes too long and the efficiency is low



Examples


Embodiment

[0039] As shown in Figure 2, a mobile robot localization method based on particle filtering and visual assistance includes:

[0040] Step 1. Build an environmental map based on the laser sensor, and build an image key frame database based on the visual sensor. The image key frame database stores all image key frame information, and each key frame's information includes at least its global position information and depth information. The depth information may be dense or sparse.
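The excerpt only states what a key frame must contain; a minimal data-structure sketch of such a database, assuming a global image descriptor is used for retrieval, might look like the following (field and method names beyond the pose and depth are illustrative, not from the patent):

```python
from dataclasses import dataclass, field
from typing import List, Optional

import numpy as np


@dataclass
class KeyFrame:
    frame_id: int
    descriptor: np.ndarray                      # global image descriptor (e.g. a BoW vector) -- assumed
    global_pose: np.ndarray                     # (x, y, yaw) of the robot in the map frame
    dense_depth: Optional[np.ndarray] = None    # H x W depth map (dense case)
    sparse_depth: Optional[np.ndarray] = None   # N x 3 feature points with depth (sparse case)


@dataclass
class KeyFrameDatabase:
    frames: List[KeyFrame] = field(default_factory=list)

    def add(self, kf: KeyFrame) -> None:
        self.frames.append(kf)

    def best_match(self, descriptor: np.ndarray) -> KeyFrame:
        """Return the stored key frame whose descriptor is most similar to the query."""
        scores = [float(descriptor @ kf.descriptor) for kf in self.frames]
        return self.frames[int(np.argmax(scores))]
```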

[0041] To construct the environmental map from laser sensor data, algorithms such as Gmapping, HectorSLAM, or Cartographer can be used to build a two-dimensional grid map.
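All three algorithms output a two-dimensional occupancy grid. For reference, a minimal sketch of such a grid (hypothetical size and resolution; the -1/0/100 unknown/free/occupied convention used by ROS maps is assumed) could be:

```python
import numpy as np


class OccupancyGrid2D:
    """Minimal 2-D occupancy grid; -1 = unknown, 0 = free, 100 = occupied."""

    def __init__(self, width_m: float = 50.0, height_m: float = 50.0,
                 resolution: float = 0.05, origin=(0.0, 0.0)):
        self.resolution = resolution        # metres per cell
        self.origin = origin                # world coordinates of cell (0, 0)
        rows = int(height_m / resolution)
        cols = int(width_m / resolution)
        self.cells = np.full((rows, cols), -1, dtype=np.int8)

    def world_to_cell(self, x: float, y: float):
        """Convert a map-frame coordinate (metres) to a (row, col) grid index."""
        col = int((x - self.origin[0]) / self.resolution)
        row = int((y - self.origin[1]) / self.resolution)
        return row, col
```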

[0042] The method for establishing an image key frame database based on a visual sensor at least includes: fusing an IMU with a depth camera, a binocular camera, or a monocular camera to obtain the global position information and depth information of each image key frame. The usual practice is to use the SLAM al...
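The paragraph above is truncated, so the exact SLAM pipeline is not given in this excerpt. One small, generic step such a pipeline implies is expressing a key frame pose estimated in the visual (or visual-inertial) map frame in the laser grid-map frame before storing it in the database. A self-contained SE(2) sketch follows; the transform between the two frames is an assumed, externally calibrated quantity, and the numeric values are purely illustrative:

```python
import math

import numpy as np


def compose_se2(t_ab, t_bc):
    """Compose two planar poses (x, y, yaw): returns T_ab * T_bc."""
    xab, yab, thab = t_ab
    xbc, ybc, thbc = t_bc
    x = xab + math.cos(thab) * xbc - math.sin(thab) * ybc
    y = yab + math.sin(thab) * xbc + math.cos(thab) * ybc
    th = (thab + thbc + math.pi) % (2.0 * math.pi) - math.pi   # wrap to [-pi, pi)
    return np.array([x, y, th])


# Hypothetical values: alignment of the visual map frame to the laser map frame,
# and a key frame pose reported by the visual/IMU SLAM estimate.
T_grid_from_visual = (0.5, -0.2, math.radians(3.0))
keyframe_pose_visual = (12.0, 4.5, math.radians(90.0))
keyframe_pose_grid = compose_se2(T_grid_from_visual, keyframe_pose_visual)
```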


Abstract

The invention discloses a mobile robot positioning method based on a particle filter and visual assistance, and the method comprises the steps: building an environment map based on a laser sensor, building an image key frame database based on a visual sensor, and enabling the image key frame database to store all image key frame information; calculating the positioning quality of the particle filter, and judging whether visual assistance is started or not; when visual assistance is started, calculating global position information of the mobile robot according to the current image frame and the image key frame database, and setting an area near the global position of the mobile robot as a visual assistance scatter area; and iteratively updating the particle filter. The invention further discloses a mobile robot positioning device based on the particle filter and visual assistance. Global positioning information is provided by the visual sensor, the accuracy of point scattering in the particle updating process is improved, the time for failure recovery is shortened, and the efficiency of failure recovery is improved.
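Read as an algorithm, the abstract describes: (1) monitor the particle filter's positioning quality, (2) when it degrades, query the key frame database with the current image to obtain a global pose, (3) scatter new particles only in an area around that pose, and (4) continue the normal particle-filter iteration. A condensed sketch under stated assumptions follows; the quality measure shown is an effective-sample-size proxy and `query_global_pose` is a hypothetical placeholder for the image-matching step, since neither is specified in this excerpt:

```python
import numpy as np

rng = np.random.default_rng(0)


def positioning_quality(weights: np.ndarray) -> float:
    """Effective sample size of the particle weights (one common quality proxy)."""
    w = weights / weights.sum()
    return 1.0 / float(np.sum(w ** 2))


def scatter_near(pose, n: int = 500, pos_std: float = 0.5, yaw_std: float = 0.2):
    """Scatter particles only in a small area around the vision-derived global pose."""
    x, y, yaw = pose
    return np.column_stack([
        rng.normal(x, pos_std, n),
        rng.normal(y, pos_std, n),
        rng.normal(yaw, yaw_std, n),
    ])


def localization_step(particles, weights, current_image, keyframe_db,
                      quality_threshold: float = 50.0):
    if positioning_quality(weights) < quality_threshold:
        # Visual assistance: match the current image against the key frame
        # database to recover a coarse global pose (matching itself omitted;
        # `query_global_pose` is a hypothetical helper).
        global_pose = keyframe_db.query_global_pose(current_image)
        particles = scatter_near(global_pose)
        weights = np.full(len(particles), 1.0 / len(particles))
    # ...followed by the usual predict / weight / resample update of the filter.
    return particles, weights
```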

Description

Technical field

[0001] The invention relates to the technical field of robotics and navigation, and in particular to a mobile robot positioning method and device based on particle filtering and visual assistance.

Background technique

[0002] With the rapid development of robot-related technologies, demand for robots, and in particular for autonomous navigation, keeps growing. An automatic navigation system must first load the grid map generated by the mapping system, and then realize the robot's self-positioning and navigation on that map. Therefore, whether the automatic navigation system is being started for the first time or restarted after a failure, the robot needs to locate its initial position quickly so that navigation can take effect quickly.

[0003] Laser-based robot self-localization algorithms usually use particle filter algorithms, and the corresponding module in the robot operating system ROS (Robot Opera...
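For context, the particle-filter approach referred to here is Monte Carlo localization, which ROS provides in its amcl package. Each iteration predicts particles with odometry, weights them against the laser measurement, and resamples. A generic, illustrative iteration is sketched below; it is not the AMCL implementation, and `measurement_likelihood` is a placeholder for scan matching against the grid map:

```python
import numpy as np

rng = np.random.default_rng(1)


def mcl_step(particles, odom_delta, measurement_likelihood,
             motion_noise=(0.02, 0.02, 0.01)):
    """One Monte Carlo localization iteration over particles of shape (N, 3)."""
    dx, dy, dyaw = odom_delta

    # 1. Predict: apply the odometry increment in each particle's own frame, plus noise.
    cos_t, sin_t = np.cos(particles[:, 2]), np.sin(particles[:, 2])
    motion = np.column_stack([
        cos_t * dx - sin_t * dy,
        sin_t * dx + cos_t * dy,
        np.full(len(particles), dyaw),
    ])
    particles = particles + motion + rng.normal(0.0, motion_noise, size=particles.shape)

    # 2. Weight: score every particle against the current laser scan.
    weights = np.array([measurement_likelihood(p) for p in particles], dtype=float)
    weights /= weights.sum()

    # 3. Resample: draw particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```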


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T 7/73; G06T 7/50; G06F 16/583; G06N 3/00
CPC: G06N 3/006; G06T 7/50; G06T 7/73; G06F 16/583
Inventor: 刘孟红
Owner: SICHUAN CHANGHONG ELECTRIC CO LTD