
Automatic mapping robot mapping and finishing method based on visual marks

A visual-marker and robotics technology applied in the field of navigation. It addresses problems such as map offset, distortion, and low positioning accuracy, and achieves reduced mapping and positioning errors, flexible and convenient marker layout, and easy implementation.

Inactive Publication Date: 2019-04-02
TONGJI UNIV
Cites: 10 | Cited by: 17

AI Technical Summary

Problems solved by technology

In many indoor and outdoor scenarios the GPS signal is weak, while WiFi and UWB positioning is expensive to install and offers limited accuracy, so the resulting map shifts and distorts.
Current positioning schemes are constrained by drift, cumulative error, and high cost. The error-elimination methods used by traditional techniques, such as loop-closure detection and map matching, place demands on computing power and on the presence of environmental features; the requirement for precise positioning and error correction makes image acquisition expensive and computation complex, and such methods struggle in feature-poor scenes (such as white walls).




Embodiment Construction

[0018] The present invention will be further described below in conjunction with the embodiments shown in the accompanying drawings.

[0019] The present invention proposes a visual-marker-based method for building and repairing the map of an automatic mapping robot. During mapping, the robot repeatedly obtains precise positioning information from artificial visual markers placed at set intervals: it reads the location information encoded in each marker, calculates the relative position between the marker and the robot, and from these derives an error correction for its GPS positioning and dead reckoning. The method generally consists of three parts:

[0020] Reading the artificial visual markers: a forward-looking visual sensor captures and identifies the artificial visual markers (such as two-dimensional codes) arranged at the positioning points (visual marker points), and...
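As an illustration only (not the implementation disclosed in this application), the following Python sketch shows what the marker-reading step could look like with off-the-shelf tools: a two-dimensional code whose payload is assumed to encode the marker's surveyed coordinates is detected with OpenCV, and the robot's position relative to the marker is recovered from the corner points with a PnP solve. The marker size, camera intrinsics, and "x,y,heading" payload format are assumptions introduced for the example.

```python
# Hypothetical sketch of the marker-reading step, not the patent's own code.
# Assumes a calibrated forward-looking camera, square printed markers of known
# side length, and a payload encoding the marker's surveyed "x,y,heading".
import cv2
import numpy as np

MARKER_SIDE_M = 0.30                    # assumed physical side length of the marker
K = np.array([[800.0,   0.0, 640.0],    # assumed camera intrinsics
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
DIST = np.zeros(5)                      # assumed negligible lens distortion

# Marker corners in the marker's own frame (z = 0 plane), ordered to match the
# detector's corner order (top-left, top-right, bottom-right, bottom-left).
s = MARKER_SIDE_M / 2.0
MARKER_CORNERS_3D = np.array([[-s,  s, 0.0],
                              [ s,  s, 0.0],
                              [ s, -s, 0.0],
                              [-s, -s, 0.0]])

def read_marker(frame):
    """Decode a marker and estimate the robot's position relative to it."""
    payload, corners, _ = cv2.QRCodeDetector().detectAndDecode(frame)
    if not payload or corners is None:
        return None
    ok, rvec, tvec = cv2.solvePnP(MARKER_CORNERS_3D,
                                  corners.reshape(4, 2).astype(np.float64),
                                  K, DIST)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)                   # marker orientation in camera frame
    robot_in_marker = (-R.T @ tvec).ravel()      # camera position in marker frame
    mx, my, mheading = (float(v) for v in payload.split(","))
    return {"marker_xy": (mx, my),
            "marker_heading": mheading,
            "robot_in_marker_frame": robot_in_marker}
```

The essential output of this step is the decoded marker location plus the robot's pose relative to the marker, which the later correction step consumes.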



Abstract

The invention provides an automatic mapping robot mapping and finishing method based on visual marks and belongs to the technical field of navigation. The method comprises the following steps: (1) detecting visual marks in a to-be-detected environment; reading localization information of corresponding visual marking points contained in the visual marks; based on angle point information of the visual marks, calculating the relative position and course of the automatic mapping robot relative to the visual marking points at the moment; and (2) obtaining the coordinates and the course of the automatic mapping robot according to the localization information of the visual marking points and the relative position of the automatic mapping robot relative to the visual marking points, and correcting and optimizing the mapping data having deviation and errors between the visual marking points. The method is low in cost and easy to implement, and can realize real-time accurate correction of the localization errors in the high-precision map acquisition process and real-time finishing and off-line optimization of the acquired map information, thereby omitting the requirement for loop detection in a traditional simultaneous localization and mapping process.
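As a minimal, hypothetical sketch of step (2), assuming each marker encodes its surveyed global pose (x, y, heading) and that the robot's relative pose in the marker frame has already been projected onto the ground plane, the corrected robot pose can be obtained by composing the two poses, and the accumulated error can then be redistributed over the odometry trajectory recorded since the previous marker point. The function names and the simple linear error distribution below are illustrative choices, not details taken from the application.

```python
# Illustrative sketch of the correction step; names and the linear error
# redistribution are assumptions, not the application's own scheme.
import math

def robot_pose_from_marker(marker_pose, robot_in_marker):
    """Compose the marker's surveyed global pose with the robot's pose
    expressed in the marker frame (both as 2-D x, y, heading)."""
    mx, my, mth = marker_pose
    rx, ry, rth = robot_in_marker
    gx = mx + math.cos(mth) * rx - math.sin(mth) * ry
    gy = my + math.sin(mth) * rx + math.cos(mth) * ry
    gth = (mth + rth + math.pi) % (2.0 * math.pi) - math.pi   # wrap to [-pi, pi)
    return gx, gy, gth

def correct_trajectory(poses, corrected_last):
    """Spread the position error observed at a marker point linearly over the
    poses recorded since the previous marker (headings left unchanged here)."""
    ex = corrected_last[0] - poses[-1][0]
    ey = corrected_last[1] - poses[-1][1]
    n = len(poses) - 1
    return [(x + ex * i / n, y + ey * i / n, th) if n else (x, y, th)
            for i, (x, y, th) in enumerate(poses)]
```

In a real system the heading error would also be distributed and, as the abstract notes, the accumulated map data could additionally be optimized off-line rather than only corrected in real time.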

Description

Technical field

[0001] The invention belongs to the technical field of navigation and relates to a method for constructing and trimming a map, in particular to a map construction and trimming method for an automatic map-building robot.

Background technique

[0002] In recent years, with the gradual application of self-driving cars in some fields, the demand for high-precision map collection is increasing. In many indoor and outdoor scenarios the GPS signal is weak, while WiFi and UWB positioning is expensive to install and offers limited accuracy, so the resulting map shifts and distorts. Current positioning schemes are constrained by drift, cumulative error, and high cost; the error-elimination methods used by traditional techniques, such as loop-closure detection and map matching, place demands on computing power and on the presence of environmental features. The requirement for accurate positioning and error correction makes the image acquisit...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/32; G01S17/89
CPC: G01C21/32; G01S17/89
Inventor: 陈广王法陈凯余卓平瞿三清葛艺忻卢凡
Owner: TONGJI UNIV