
Multi-robot relative positioning method based on multi-sensor fusion

A multi-sensor-fusion, multi-robot technology applied in the field of UAV perception and positioning. It addresses problems such as the large distances between UAVs, and achieves the effects of improved relative positioning accuracy, improved visual-positioning precision and computation speed, and a faster observation update frequency.

Pending Publication Date: 2021-11-05
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

It can be seen that high-precision, high-robustness positioning schemes for a single vehicle are relatively mature, but a large gap remains in high-precision relative positioning among multiple vehicles.
UAVs move freely in three-dimensional Euclidean space and can be far apart (beyond the working range of typical lidar), which places additional restrictions on the selection and fusion of sensors.




Embodiment Construction

[0036] The present invention is further described below with reference to the accompanying drawings:

[0037] The object of the present invention is to design a mutual positioning method based on multi-sensor fusion, intended to realize configuration sensing for a cooperative multi-UAV transport system and to provide pose information to the control module.

[0038] In order to achieve the above object, the technical solution adopted by the present invention comprises the following steps:

[0039] 1) Each UAV visually estimates the poses of the other UAVs within its visual range;

[0040] 2) IMU information is fused with the visual estimates and the UWB ranging information;

[0041] 3) The positioning results are fused among the multiple UAVs.

[0042] Step 1), visual pose estimation by a single UAV of the other UAVs:

[0043] 1.1) Setting of the cooperation marker: an AprilTag, printed in black and white with an outer frame of 0.5 m, is used as the cooperation marker; the relative pose is recovered from this marker as illustrated in the sketch after this list.

[0044] 1.2) Determine the system topology: The performance of the airb...
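As a rough illustration of step 1.1 (not the method claimed by the invention), the sketch below shows how the relative pose of a neighbouring UAV could be recovered from its detected AprilTag corners with OpenCV's solvePnP, using the 0.5 m outer frame as the known tag size. The camera intrinsics, distortion coefficients, and corner ordering are placeholder assumptions for illustration.

```python
# Hedged sketch: relative pose of a neighbouring UAV from its AprilTag marker.
# Assumes the tag corners have already been detected by an AprilTag detector
# and that the camera intrinsics below are placeholders.
import cv2
import numpy as np

TAG_SIZE = 0.5  # outer frame of the cooperation marker, in metres (step 1.1)

# 3D corner coordinates of the tag in its own frame (z = 0 plane), in the
# ordering required by cv2.SOLVEPNP_IPPE_SQUARE.
half = TAG_SIZE / 2.0
object_points = np.array([
    [-half,  half, 0.0],   # top-left
    [ half,  half, 0.0],   # top-right
    [ half, -half, 0.0],   # bottom-right
    [-half, -half, 0.0],   # bottom-left
], dtype=np.float64)

def tag_relative_pose(corner_pixels, camera_matrix, dist_coeffs):
    """Return (R, t): rotation and translation of the tag (i.e. the observed
    UAV) expressed in the observing camera's frame."""
    image_points = np.asarray(corner_pixels, dtype=np.float64).reshape(4, 2)
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        raise RuntimeError("PnP failed for this observation")
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec.ravel()

# Placeholder intrinsics for illustration only.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0,   0.0,   1.0]])
dist = np.zeros(5)
```

The resulting rotation and translation give the observed UAV's pose in the observer's camera frame; in the described pipeline this visual estimate would then feed the fusion of step 2.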


Abstract

The invention relates to a multi-robot relative positioning method based on multi-sensor fusion. It aims to realize configuration sensing during cooperative transportation by multiple unmanned aerial vehicles and to provide pose information for formation keeping to the control module. The technical scheme comprises the following steps: each UAV estimates the visual poses of the other UAVs within its visual range; IMU information is fused with the visual estimates and the UWB ranging information; and the positioning results are fused among the multiple UAVs. Because a cooperation marker of known size is adopted, the robustness, precision, and computation speed of the visual positioning are all improved and the observation update frequency is increased. The combined multi-sensor solution of IMU prediction, UWB distance measurement, and visual image observation further improves the positioning precision among the robots, and the relative configuration of the system can be maintained without relying on GPS information. A ring network topology is adopted for the formation estimate, so the system can still operate when some observations fail and thus has a degree of redundancy.
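As a hedged sketch of the fusion idea summarized above (not the patented filter itself), the code below runs a minimal EKF on the relative position of one neighbouring UAV: the prediction step integrates an IMU-derived relative acceleration, and the update steps correct the state with the UWB range and with the visual position obtained from the cooperation marker. The state layout, noise values, and class and method names are assumptions for illustration only.

```python
# Minimal EKF sketch for the relative position of one neighbouring UAV.
# IMU-derived relative acceleration drives the prediction; UWB range and the
# visual position observation drive the corrections. All noise values are
# placeholders, not values from the invention.
import numpy as np

class RelativePositionEKF:
    def __init__(self, dt=0.01):
        self.dt = dt
        self.x = np.zeros(6)             # [relative position (3), relative velocity (3)]
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)  # constant-velocity transition
        self.Q = 1e-3 * np.eye(6)        # process noise (placeholder)

    def predict(self, rel_accel):
        """IMU step: integrate the relative acceleration of the two UAVs."""
        B = np.vstack([0.5 * self.dt**2 * np.eye(3), self.dt * np.eye(3)])
        self.x = self.F @ self.x + B @ np.asarray(rel_accel)
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update_uwb(self, rng, sigma=0.1):
        """UWB step: scalar range measurement between the two UAVs."""
        p = self.x[:3]
        pred = np.linalg.norm(p)
        H = np.zeros((1, 6))
        H[0, :3] = p / max(pred, 1e-6)   # Jacobian of ||p|| w.r.t. p
        self._kalman_update(np.array([rng - pred]), H, np.array([[sigma**2]]))

    def update_vision(self, rel_pos, sigma=0.05):
        """Vision step: relative position from the cooperation-marker observation."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        R = sigma**2 * np.eye(3)
        self._kalman_update(np.asarray(rel_pos) - self.x[:3], H, R)

    def _kalman_update(self, innovation, H, R):
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(6) - K @ H) @ self.P
```

In the full scheme the per-pair estimates would additionally be exchanged and fused over the ring network topology mentioned above, which this single-pair sketch does not cover.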

Description

Technical field [0001] The present invention belongs to the field of UAV perception and positioning and relates to a multi-sensor-fusion multi-robot relative positioning method, specifically a mutual positioning method for a multi-sensor-fusion multi-UAV cooperative transport system.
Background technique [0002] Multi-robot cooperation has the characteristics of convenient composition, flexibility, and redundancy, and has developed rapidly in recent years. In aerial transport operations a multi-UAV system has the advantages of low cost and convenient maintenance, and its load capacity can be increased by increasing the number of UAV units. The multi-UAV system is composed of UAV units, flexible tethers, a load connecting device, and a ground station. Each UAV unit is equipped with a camera, an inertial sensor, an ultra-wideband (UWB) beacon, a computing unit, and a control system, and the units communicate with each other in real time. At wor...

Claims


Application Information

IPC(8): G05D1/12
CPC: G05D1/12; Y02D30/70
Inventors: 黄攀峰, 杨立, 张帆, 张夷斋
Owner: NORTHWESTERN POLYTECHNICAL UNIV