Mobile robot positioning method and system based on three-dimensional point cloud and vision fusion

A mobile robot and three-dimensional point cloud technology, applied in the fields of radio-wave measurement systems, instruments, electromagnetic-wave re-radiation, etc. It addresses the problem of difficult positioning in outdoor scenes with sparse features and achieves high-precision mobile robot positioning.

Active Publication Date: 2020-07-17
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

However, the existing point cloud map solutions need to store a large amount of point cloud data and match the real-time point cloud with the entire map.
However, visual SLAM construction maps are difficult to achieve accurate positioning in outdoor scenes with sparse features.



Embodiment Construction

[0055] The present invention will be described in detail below in conjunction with specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but do not limit it in any form. It should be noted that those skilled in the art can make several changes and improvements without departing from the concept of the present invention; these all fall within its protection scope.

[0056] Aiming at the defects in the prior art, the present invention combines laser point cloud features and image features into joint features, which are matched against a feature grid map for positioning. The method involves establishing an environment map, fusing point cloud and visual features, and map matching. The environment map is a feature grid map of the environment; each grid in the feature grid map stores a point set consisting of feature ...
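The feature grid map described above might be organized like the following minimal sketch. All class and method names here are hypothetical illustrations, not taken from the patent; it assumes each cell stores joint feature points (position, intensity, surface normal) and derives the per-cell height, intensity, and normal-vector projection values mentioned in the abstract.

```python
import numpy as np

class GridCell:
    """One cell of the feature grid map: stores joint feature points."""
    def __init__(self):
        self.points = []  # list of (xyz, intensity, normal) tuples

    def add_point(self, xyz, intensity, normal):
        self.points.append((np.asarray(xyz, float), float(intensity),
                            np.asarray(normal, float)))

    def features(self):
        """Per-cell feature vector: mean height, mean intensity,
        mean projection of the normals onto the vertical axis."""
        if not self.points:
            return np.zeros(3)
        heights = [p[0][2] for p in self.points]
        intensities = [p[1] for p in self.points]
        projections = [abs(p[2][2]) for p in self.points]
        return np.array([np.mean(heights), np.mean(intensities),
                         np.mean(projections)])

class FeatureGridMap:
    """Sparse 2-D grid over the x-y plane; cells are created on demand."""
    def __init__(self, resolution=0.5):
        self.resolution = resolution
        self.cells = {}  # (ix, iy) -> GridCell

    def insert(self, xyz, intensity, normal):
        key = (int(xyz[0] // self.resolution),
               int(xyz[1] // self.resolution))
        self.cells.setdefault(key, GridCell()).add_point(xyz, intensity, normal)
```

A sparse dictionary keyed by cell index avoids allocating the full map extent, which matters when only a small fraction of grid cells contain points.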



Abstract

The invention provides a mobile robot positioning method and system based on three-dimensional point cloud and visual fusion. The method comprises the steps of building an environment map, fusing point cloud and visual features, matching against the map, and performing positioning. The environment map is a feature grid map of the environment; each grid in the feature grid map stores a point set composed of feature points extracted from the point cloud and feature points of the visual image, from which a height value, an intensity value, and a normal-vector projection value are extracted. The point cloud and visual feature fusion projects feature points extracted from the image into the point cloud space to form joint feature points with the point cloud features. For map matching and positioning, the joint feature points are projected onto a two-dimensional grid, feature vectors are extracted, the feature grid is matched with the map, a histogram filter is adopted to determine the posterior probability of each candidate pose, and the position of the robot in the map is determined based on the posterior probabilities.
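The histogram-filter step above can be sketched as a discrete Bayes update over candidate poses: each pose is scored by how well the live scan's feature vector matches the map's, and the posterior is the normalized product of prior and likelihood. The similarity kernel and function names below are illustrative assumptions, not the patent's actual formulas.

```python
import numpy as np

def match_score(map_feats, scan_feats):
    """Illustrative likelihood: Gaussian kernel on the L2 distance
    between the map's and the scan's feature vectors."""
    d = np.linalg.norm(np.asarray(map_feats, float) - np.asarray(scan_feats, float))
    return float(np.exp(-0.5 * d * d))

def histogram_filter_update(prior, likelihoods):
    """One Bayes update over a discrete pose grid: posterior ∝ prior × likelihood."""
    posterior = np.asarray(prior, float) * np.asarray(likelihoods, float)
    s = posterior.sum()
    return posterior / s if s > 0 else np.full_like(posterior, 1.0 / posterior.size)

# Three candidate poses; the scan matches the first pose's map cell best.
scan = [1.0, 0.5]
prior = np.full(3, 1.0 / 3.0)
likelihoods = np.array([match_score([1.0, 0.5], scan),
                        match_score([0.0, 0.0], scan),
                        match_score([2.0, 2.0], scan)])
posterior = histogram_filter_update(prior, likelihoods)
best_pose = int(np.argmax(posterior))
```

In a full system the candidate poses would be a grid of (x, y, yaw) offsets around the predicted pose, and the prior would come from the previous posterior propagated through odometry.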

Description

technical field

[0001] The present invention relates to the technical field of mobile robot positioning and navigation, in particular to a mobile robot positioning method and system based on three-dimensional point cloud and vision fusion, and especially to a feature grid map matching positioning method for mobile robots based on multi-sensor fusion.

Background technique

[0002] Mobile robots usually have the function of autonomous positioning and navigation: they need to construct an environmental map and then achieve high-precision positioning within it. The positioning problem is a key issue in the field of robotics.

[0003] Localization systems play a pivotal role in autonomous vehicles. Other modules, such as perception and path planning, depend to varying degrees on the results produced by the positioning system. Positioning accuracy is one of the keys that directly affects the success ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/05, G06T19/00, G06T7/73, G06T3/00, G01S17/89
CPC: G06T17/05, G06T3/005, G06T7/73, G01S17/89, G06T19/006, G06T2207/10028
Inventors: 王贺升 (Wang Hesheng), 赵小文 (Zhao Xiaowen)
Owner: SHANGHAI JIAO TONG UNIV