
Laser and vision-based hybrid location method for mobile robot

A mobile-robot hybrid positioning technology, applied in the field of navigation computing tools, that addresses problems such as limited ultrasonic accuracy, low environmental recognition, and the high maintenance cost of topological maps. It ensures particle diversity, has a wide application range, and compensates for the instability of visual positioning.

Active Publication Date: 2016-08-17
SHEN ZHEN 3IROBOTICS CO LTD

AI Technical Summary

Problems solved by technology

[0003] Probabilistic models are limited by memory and CPU. Absolute positioning models require extra equipment, such as artificial beacons, or particular environments; indoors, for example, GPS is unsuitable. Grid maps suffer from discretization errors, and topological maps have a high maintenance cost and are difficult to recover once corrupted. Ultrasonic sensing has accuracy problems. The current mainstream laser radar is two-dimensional, which is low-dimensional and offers limited recognition of the environment. Visual positioning mainly includes feature-based positioning, positioning based on three-dimensional map matching, and positioning based on three-dimensional point clusters; at present, visual positioning algorithms are not yet mature.

Method used




Embodiment Construction

[0023] The solution of the present invention is described in further detail below in conjunction with the drawings and specific embodiments.

[0024] As shown in Figure 2, the mobile robot in this embodiment includes an odometer, a gyroscope, a lidar, a camera (vision sensor), a drive system, and a core control board (MCU). Specifically, the drive system in this embodiment consists of left and right wheels, each driven by its own motor. It should be understood that the mobile robot may also include other parts (such as a dust-collection system, a sensor system, an alarm system, etc.); since these are not related to the technical solution of the present invention, they are not described here.
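The left/right wheel arrangement described above is a differential drive, for which the odometer and gyroscope give a pose prediction by dead reckoning. The sketch below is only an illustration of that standard motion model; the wheel-base parameter and the gyro-fusion rule are assumptions, as the patent text only lists the sensors.

```python
import math

def predict_pose(x, y, theta, d_left, d_right, wheel_base, gyro_dtheta=None):
    """Differential-drive dead reckoning (illustrative sketch, not the patented method).

    d_left / d_right are wheel travel distances from the odometer;
    wheel_base is the distance between the wheels (assumed parameter).
    """
    d_center = (d_left + d_right) / 2.0           # distance moved by robot center
    d_theta = (d_right - d_left) / wheel_base     # heading change from odometry
    if gyro_dtheta is not None:
        d_theta = gyro_dtheta                     # prefer the gyroscope heading change
    # Advance along the average heading during the step
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta
```

With equal wheel travel the robot moves straight ahead: `predict_pose(0.0, 0.0, 0.0, 1.0, 1.0, 0.3)` returns `(1.0, 0.0, 0.0)`.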

[0025] As shown in Figure 1, the positioning method of the present invention, based on a particle filter algorithm for the mobile robot, includes the following steps:

[0026] S1: Initialize the position of each particle and the grid map and visual feature map of each particle, and estab...
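A minimal sketch of this initialization step, under the assumption (stated in the text) that each particle carries its own pose estimate plus a grid map and a visual feature map; the class layout and map representations below are illustrative, not the patent's.

```python
import random

class Particle:
    """One pose hypothesis with its own maps (hypothetical structure)."""
    def __init__(self, x, y, theta, weight):
        self.x, self.y, self.theta = x, y, theta
        self.weight = weight
        self.grid_map = {}     # occupancy grid: (cell_x, cell_y) -> occupancy prob
        self.feature_map = []  # visual feature map: list of landmark descriptors

def initialize_particles(n, x0=0.0, y0=0.0, theta0=0.0, spread=0.05):
    """S1: spawn n particles around the initial pose with uniform weights."""
    return [
        Particle(
            x0 + random.gauss(0.0, spread),
            y0 + random.gauss(0.0, spread),
            theta0 + random.gauss(0.0, spread),
            weight=1.0 / n,
        )
        for _ in range(n)
    ]
```

Uniform initial weights mean no particle is preferred before any sensor data arrives.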



Abstract

The present invention discloses a laser and vision-based hybrid location method for a mobile robot. The mobile robot comprises a laser radar and a vision sensor. According to the technical scheme of the invention, the weight of each particle at a predicted position is updated based on data collected by the laser radar and data collected by the vision sensor. Particles with higher weights are then resampled, so that the true location distribution of the mobile robot at time t is obtained. Compared with the prior art, this scheme combines the high accuracy of the laser radar with the information completeness of the vision sensor, giving it a wider application range while overcoming the instability of visual location. In addition, in one embodiment of the present invention, the conventional particle-filter sampling model is improved so that the diversity of particles is ensured.
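The weight update and resampling described in the abstract can be sketched as below. The blending factor `alpha` and the additive fusion rule are assumptions (the abstract does not specify how the two likelihoods are combined), and low-variance (systematic) resampling is shown as one standard way to keep high-weight particles while preserving diversity.

```python
import random

def update_weights(particles, laser_scores, vision_scores, alpha=0.7):
    """Fuse lidar and vision likelihoods into one particle weight.

    particles: list of dicts with a "weight" key; alpha is an assumed
    blending factor, not a value from the patent.
    """
    for p, ls, vs in zip(particles, laser_scores, vision_scores):
        p["weight"] = alpha * ls + (1.0 - alpha) * vs
    total = sum(p["weight"] for p in particles)  # normalize (assumes total > 0)
    for p in particles:
        p["weight"] /= total
    return particles

def low_variance_resample(particles):
    """Systematic resampling: high-weight particles are drawn more often."""
    n = len(particles)
    step = 1.0 / n
    r = random.uniform(0.0, step)  # single random offset for all draws
    c = particles[0]["weight"]
    i = 0
    out = []
    for m in range(n):
        u = r + m * step
        while u > c:               # walk the cumulative weight until u is covered
            i += 1
            c += particles[i]["weight"]
        out.append(dict(particles[i], weight=1.0 / n))
    return out
```

Because every draw uses the same random offset, systematic resampling duplicates particles far less aggressively than independent multinomial draws, which is one common way to address the particle-diversity concern the abstract raises.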

Description

Technical field

[0001] The invention relates to a positioning method for a mobile robot, and in particular to a laser and vision-based hybrid positioning method for a mobile robot.

Background technique

[0002] With the continuous development of mobile robots, simultaneous localization and mapping (SLAM) has become a major proposition in their development. There are many approaches to this problem. By model, they divide into probabilistic positioning models (such as particle filters) and absolute positioning models (such as GPS); by map, into grid maps, topological maps, and so on; by sensor, into ultrasonic positioning, lidar positioning, visual positioning, and so on.

[0003] Probabilistic models are limited by memory and CPU; absolute positioning models require extra equipment, such as artificial beacons, or particular environments (indoors, for example, GPS is unsuitable); grid maps suffer from grid errors, and the maintena...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C21/20
CPC: G01C21/20
Inventors: 邓龙, 李崇国, 杨勇, 宫海涛
Owner: SHEN ZHEN 3IROBOTICS CO LTD