
Cooperative localization method based on multi-modal map

A cooperative localization technology based on a multi-modal map, applicable to electromagnetic wave re-radiation, measuring devices, instruments, etc. It addresses problems such as large matching error, low precision, and strong dependence on lighting, and achieves the effect of precise positioning.

Pending Publication Date: 2022-01-14
HANGZHOU DIANZI UNIV


Problems solved by technology

Among them, two-dimensional lidar is prone to large matching errors and low precision in open environments with few structural features and insufficient point cloud information; vision-based positioning depends strongly on lighting and can become unavailable when lighting is poor; UWB requires the base stations to be calibrated in advance, which makes it complicated to use, and its positioning accuracy is strongly affected by the accuracy of the base station positions.




Embodiment Construction

[0027] In order to achieve more accurate positioning and navigation of unmanned vehicles, the present invention fuses the different environmental information obtained by laser radar and visual camera scanning, combining the rapid matching of the laser radar with the accurate matching of visual marks, and obtains a global map containing multiple types of observation data, thereby achieving precise positioning and navigation of the unmanned vehicle in the environment.
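
To make the fusion idea concrete, here is a minimal, hypothetical Python sketch (not taken from the patent text): a map object that keeps a laser geometric layer alongside a visual texture landmark layer, and a simple fusion step that refines a coarse lidar pose estimate with the position implied by a recognized visual mark. The class and function names, the grid representation, and the weighted fusion rule are all illustrative assumptions.

```python
# Hypothetical illustration of a multi-modal map and coarse-to-fine pose fusion.
# Names, the weighting rule, and the 2D simplification are assumptions, not the
# patent's actual method.
from dataclasses import dataclass, field
from typing import Dict, Tuple

import numpy as np


@dataclass
class MultiModalMap:
    geometric_grid: np.ndarray                        # laser geometric layer: 2D occupancy grid
    resolution: float                                 # metres per grid cell
    visual_landmarks: Dict[int, Tuple[float, float]] = field(default_factory=dict)
    # visual texture layer: marker id -> known world (x, y) of the mark


def fuse_pose(laser_pose: np.ndarray,
              marker_id: int,
              marker_offset: np.ndarray,
              mmap: MultiModalMap,
              laser_weight: float = 0.3) -> np.ndarray:
    """Blend a coarse lidar pose (x, y) with the pose implied by observing a
    known visual mark at a camera-measured offset (assumed fixed weighting)."""
    if marker_id not in mmap.visual_landmarks:
        return laser_pose                             # no visual fix available
    marker_xy = np.asarray(mmap.visual_landmarks[marker_id])
    visual_pose = marker_xy - marker_offset           # vehicle position implied by the mark
    return laser_weight * laser_pose + (1.0 - laser_weight) * visual_pose


# Usage sketch: a coarse laser estimate refined by one marker observation.
mm = MultiModalMap(geometric_grid=np.zeros((100, 100)), resolution=0.05,
                   visual_landmarks={7: (2.0, 3.0)})
coarse = np.array([1.6, 2.7])                         # from fast lidar scan matching
refined = fuse_pose(coarse, marker_id=7, marker_offset=np.array([0.5, 0.4]), mmap=mm)
print(refined)                                        # pulled toward the marker-implied position
```

In a real system the coarse pose would come from scan matching against the geometric layer, and the blending would typically be done by a filter or graph optimizer rather than a fixed weight; the sketch only shows how the two map layers can be stored and queried together.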

[0028] The implementation flow is shown in Figure 1. The cooperative localization method based on a multi-modal map comprises the following steps:

[0029] Step 1: Obtain scene information through lidar scanning and construct the scene geometric-mode map;

[0030] The unmanned vehicle, equipped with a lidar, moves through the site; the whole scene is scanned by the lidar to obtain the environmental information at the unmanned vehicle's current location, and the geometric information of the surrounding corners, planes...
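
As a rough illustration of Step 1, the sketch below converts a single lidar scan (ranges and bearing angles) into a simple 2D occupancy grid that could serve as the scene geometric-mode layer. The grid size, resolution, and pose handling are assumptions made for illustration; the patent's actual map construction, including how corner and plane features are extracted, may differ.

```python
# Hypothetical sketch: one lidar scan -> simple 2D occupancy grid ("geometric mode" layer).
import numpy as np


def scan_to_grid(ranges: np.ndarray,
                 angles: np.ndarray,
                 pose: tuple,                          # (x, y, yaw) of the vehicle in the world frame
                 grid_size: int = 200,
                 resolution: float = 0.05) -> np.ndarray:
    """Mark grid cells hit by lidar returns as occupied (1); all other cells stay 0."""
    grid = np.zeros((grid_size, grid_size), dtype=np.uint8)
    x0, y0, yaw = pose
    # Transform polar lidar returns into world coordinates.
    xs = x0 + ranges * np.cos(angles + yaw)
    ys = y0 + ranges * np.sin(angles + yaw)
    # World coordinates -> grid indices (grid centred on the world origin).
    cols = np.clip((xs / resolution + grid_size // 2).astype(int), 0, grid_size - 1)
    rows = np.clip((ys / resolution + grid_size // 2).astype(int), 0, grid_size - 1)
    grid[rows, cols] = 1
    return grid


# Usage sketch with synthetic data: obstacles 3 m away across a 180-degree arc.
angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
ranges = np.full_like(angles, 3.0)
occupancy = scan_to_grid(ranges, angles, pose=(0.0, 0.0, 0.0))
print(occupancy.sum(), "cells marked occupied")
```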



Abstract

The invention discloses a cooperative localization method based on a multi-modal map. Different environmental information obtained by scanning with a laser radar and a visual camera is fused to form a multi-modal map that integrates a laser geometric mode and a visual texture mode, achieving accurate positioning and navigation of an unmanned vehicle in the environment. Unlike traditional SLAM methods, the method combines the rapid matching of the laser geometric mode with the accurate matching of visual texture marks; using the newly built multi-modal map and the combined information, it can sense the surrounding environment more accurately, and positioning is more precise than with existing schemes.

Description

Technical field

[0001] The invention relates to a positioning method for a mobile robot in an enclosed indoor environment, and in particular to a cooperative positioning method for a mobile robot based on a laser geometric mode and a visual texture mode.

Background technique

[0002] With the advancement of technology and the rapid development of mobile robot research, unmanned ground vehicles (Unmanned Ground Vehicle, UGV) play an increasingly important role in social life, and simultaneous localization and mapping (SLAM) has likewise become a hot research topic. An unmanned ground vehicle is a ground mobile platform that can drive autonomously or be operated by remote control, can be used once or multiple times, and can carry a certain load. Owing to their automatic control and high degree of intelligence, unmanned ground vehicles can often reach areas that are difficult for manned vehicles to reach or that are very dangerous for humans, and com...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C21/20; G01S17/86; G01S17/931
CPC: G01C21/20; G01S17/86; G01S17/931; Y02T10/40
Inventor: 陈若琳, 颜成钢, 许成浩, 朱尊杰, 孙垚棋, 张继勇, 李宗鹏
Owner: HANGZHOU DIANZI UNIV