
Multi-robot co-localization and fusion mapping method under multi-view in open space

A multi-robot co-localization technology in the field of computer vision. It addresses the limited viewing angle of a single robot during mapping and its inability to obtain a global map, and achieves the effect of reducing mapping errors and positioning drift.

Active Publication Date: 2019-04-05
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0006] Aiming at the problem that a single robot in a complex environment has a limited viewing angle and cannot obtain a global map, the present invention provides an air-ground multi-viewpoint collaborative mapping and positioning method in which aerial robots and ground robots build the map cooperatively, so as to solve the problems of map sharing, map fusion, and localization among robots efficiently and robustly.




Detailed Description of the Embodiments

[0036] The present invention is further described below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention and are not intended to limit it. The drawings are for illustrative purposes only and should not be construed as limiting the patent. To better illustrate the embodiments, some components in the drawings may be omitted, enlarged, or reduced, and do not represent the size of the actual product. Those skilled in the art will understand that some well-known structures and their descriptions may be omitted from the drawings. The positional relationships described are for illustrative purposes only and should not be construed as limiting this patent.

[0037] Based on aerial robots and ground robots, multi-robot collaborative positioning and fusion mapping under the multi-view...



Abstract

The invention discloses a multi-robot co-localization and fusion mapping method under multiple views in an open space. The method comprises the following steps: completely covering a detection scene with multiple observations from the air and the ground; integrating the scene image data collected by aerial robots and ground robots; and positioning the robots through visual constraints while recovering three-dimensional scene information. The three-dimensional point cloud map of the mapping and positioning system and the six-degree-of-freedom poses of the robots are then optimized using specific visual features attached to the robots. The pose optimization and map fusion algorithm based on these visual features significantly improves reconstruction and positioning accuracy and corrects map scale, so that the local map of each robot can be shared among multiple heterogeneous robots. This enlarges the coverage of the three-dimensional reconstruction and quickly provides reliable environmental information for mission planning, search and rescue in disaster environments, and military counter-terrorism.
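The abstract attributes the scale correction and pose optimization to "specific visual features attached to the robots", and the listed CPC class G06K 7/1417 points at 2D-code-style markers. As an illustrative sketch only, the Python snippet below assumes ArUco fiducials and the OpenCV 4.7+ aruco API (neither is named in the patent): it recovers a ground robot's 6-DoF pose in an aerial camera's frame, which is the kind of cross-robot visual constraint such a pipeline can feed into pose optimization.

import cv2
import numpy as np

MARKER_SIZE = 0.20  # assumed marker edge length in metres

def marker_pose(image, camera_matrix, dist_coeffs):
    """Detect one ArUco marker (a stand-in for the patent's 'specific visual
    feature') and return its rotation matrix and translation vector in the
    observing camera's frame, or None if no marker is found."""
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(aruco_dict)
    corners, ids, _ = detector.detectMarkers(image)
    if ids is None:
        return None
    # Marker corners in the marker's own frame (z = 0 plane), in the
    # top-left, top-right, bottom-right, bottom-left order that
    # SOLVEPNP_IPPE_SQUARE expects.
    h = MARKER_SIZE / 2.0
    obj_pts = np.array([[-h,  h, 0], [ h,  h, 0],
                        [ h, -h, 0], [-h, -h, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec

Each such detection ties one robot's pose to another robot's camera at a known metric scale (the physical marker size), which is what allows the fusion step to correct the scale of monocular local maps.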

Description

Technical field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a multi-robot collaborative positioning and fusion mapping method under multiple perspectives in open space. It is applicable to the construction of three-dimensional point cloud maps of complex scenes, multi-robot collaborative positioning, automatic task planning, UAV surveying and mapping, and other technical fields.

Background technique

[0002] Simultaneous localization and mapping (SLAM) technology has been researched extensively and deeply in recent years, but the problem of multi-robot, cross-field-of-view co-localization and mapping has not been well solved. In collaborative positioning, relative pose correction and mutual information perception between multiple robots remain difficult. During map building, the robot has no prior information about the structure of the scene, and the map construction is mai...
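The background identifies relative pose correction between robots as the hard part, and the abstract states that map scales are revised during fusion. One standard way to do both in a single step is to align two local point-cloud maps with a similarity transform (rotation R, translation t, scale s); the closed-form Umeyama solution sketched below is a common choice, not the patent's stated algorithm, and umeyama_sim3 is a hypothetical helper name.

import numpy as np

def umeyama_sim3(src, dst):
    """Least-squares Sim(3) fit so that dst ≈ s * R @ src + t.
    src, dst: (N, 3) arrays of corresponding points seen in both maps."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    # Cross-covariance between the centred point sets.
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0  # guard against a reflection solution
    R = U @ S @ Vt
    # Scale = trace(D @ S) divided by the total variance of the source points.
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_d - s * R @ mu_s
    return s, R, t

Fitting this transform to a handful of points observed in both local maps, for example the triangulated positions of the markers each robot carries, expresses one robot's map in the other's frame with a corrected scale, after which the point clouds can simply be concatenated.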


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C 21/20; G06K 7/14
CPC: G01C 21/20; G06K 7/1417
Inventors: 刘盛, 柯正昊, 陈一彬, 戴丰绩, 陈胜勇
Owner: ZHEJIANG UNIV OF TECH