
A multi-UAV collaborative navigation method and device based on inertial and binocular vision

A multi-UAV collaborative navigation technology, applied to navigation and surveying devices, which addresses the problems of navigation methods that are inapplicable in certain environments and of low-accuracy navigation data.

Active Publication Date: 2021-08-10
NAT UNIV OF DEFENSE TECH

AI Technical Summary

Problems solved by technology

At present, cooperative navigation technology for UAV clusters has the following characteristics. First, most work is limited to specific application scenarios such as cooperative aerial refueling and ground-target tracking; the number of UAVs is usually two, and attention is focused mainly on the relative navigation information between them. Second, UAV swarms still depend on satellite navigation, and some even realize cooperative navigation on the basis of satellite navigation functions, which makes them unsuitable for working environments without satellite signals. Third, the relative observation between UAVs is a distance or relative-azimuth measurement: the common ranging sensor is an ultra-wideband sensor and the common relative-azimuth sensor is a monocular camera, both of which provide navigation data of low accuracy.




Detailed Description of Embodiments

[0049] In order to make the purpose, technical solution and advantages of the present application clearer, the present application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present application, and are not intended to limit the present application.

[0050] In one embodiment, as shown in figure 1, a multi-UAV cooperative navigation method based on inertial and binocular vision is provided, including the following steps:

[0051] Step 102: establish a multi-UAV cooperative navigation system state model based on the inertial navigation error model of each UAV.

[0052] Specifically, the inertial navigation error model of each UAV can be obtained by any of various existing modeling methods in a selected coordinate system, according to the inertial navigation system with which that UAV is equipped. The inertial navigation error...
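The patent text is truncated here, but the usual structure of such a cooperative state model can be sketched as follows. This is an illustrative assumption, not taken from the patent: each UAV is given the common 15-element inertial error state (attitude error, velocity error, position error, gyro bias, accelerometer bias), and the joint system state simply stacks the per-UAV states, so UAVs with inertial systems of different grades keep their own error dynamics.

```python
import numpy as np

PER_UAV_DIM = 15  # assumed per-UAV error-state size: att(3), vel(3), pos(3), gyro bias(3), accel bias(3)

def joint_transition(F_list):
    """Build the joint state-transition matrix from per-UAV matrices.

    F_list: list of (15, 15) transition matrices, one per UAV, each
    derived from that UAV's own inertial navigation error model.
    Returns the (15n, 15n) block-diagonal joint matrix: the UAVs'
    error dynamics are independent until a relative observation
    couples them in the measurement update.
    """
    n = len(F_list)
    F = np.zeros((PER_UAV_DIM * n, PER_UAV_DIM * n))
    for i, F_i in enumerate(F_list):
        s = i * PER_UAV_DIM
        F[s:s + PER_UAV_DIM, s:s + PER_UAV_DIM] = F_i
    return F
```

Because the joint model is block-diagonal, adding a UAV with a different inertial grade only means appending its own 15x15 block; this matches the abstract's claim that the method suits UAVs with differing inertial accuracies.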



Abstract

This application relates to a multi-UAV cooperative navigation method and device based on inertial and binocular vision. The method includes: establishing a multi-UAV cooperative navigation system state model based on the inertial navigation error model of each UAV; predicting the multi-UAV system state from the inertial navigation solution and the cooperative navigation state model; updating the system state with binocular-vision relative-position observations between UAVs; obtaining an estimate of each UAV's inertial navigation error; and correcting the inertial navigation solution to realize multi-UAV cooperative navigation. The method corrects each UAV's inertial navigation system error from binocular-vision relative-position observations. Because a binocular-vision position observation contains both the distance and the relative orientation between UAVs, it achieves higher cooperative navigation accuracy than distance-only or relative-orientation-only measurement. In addition, because the inertial navigation error model is established separately for each UAV, the method is applicable when the inertial navigation systems of the UAVs have different accuracies.
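The predict-correct cycle described in the abstract is a Kalman-filter measurement update driven by the relative-position observation. A minimal sketch, under the same assumed 15-state-per-UAV layout as above (with the position-error block at indices 6:9 -- an illustrative convention, not specified by the patent):

```python
import numpy as np

PER_UAV_DIM = 15  # assumed per-UAV error-state size (see state-model sketch)

def relative_position_H(n_uav, i, j):
    """Observation matrix for one binocular relative-position measurement.

    The stereo camera on UAV i observes UAV j, yielding p_j - p_i.
    In error-state form the innovation depends on dp_j - dp_i, so H
    has +I at UAV j's position-error block and -I at UAV i's.
    """
    H = np.zeros((3, PER_UAV_DIM * n_uav))
    H[:, j * PER_UAV_DIM + 6 : j * PER_UAV_DIM + 9] = np.eye(3)
    H[:, i * PER_UAV_DIM + 6 : i * PER_UAV_DIM + 9] = -np.eye(3)
    return H

def ekf_update(x, P, z, z_pred, H, R):
    """Standard Kalman measurement update on the joint error state."""
    y = z - z_pred                       # innovation: measured minus predicted relative position
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(P.shape[0]) - K @ H) @ P
    return x, P
```

Because the 3-vector `z` carries both range and bearing between the two UAVs, a single binocular observation constrains more of the joint state than a scalar range or a bearing-only measurement would, which is the accuracy advantage the abstract claims.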

Description

technical field

[0001] The present application relates to the technical field of platform cooperative navigation, and in particular to a multi-UAV cooperative navigation method and device based on inertial and binocular vision.

Background technique

[0002] At present, UAVs generally use inertial/satellite integrated navigation technology. However, satellite navigation signals can be blocked in environments such as urban streets and canyons, and are also easily jammed. Therefore, UAVs cannot rely solely on satellite navigation; they need to achieve autonomous navigation based on technologies such as inertial navigation in environments where satellite signals cannot be obtained, so as to adapt to various complex working environments.

[0003] For UAV clusters, cooperative navigation technology based on relative observations between UAVs can be used to improve the inertial navigation positioning accuracy of each UAV....

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01C21/16, G01C21/20, G06T7/73
CPC: G01C21/165, G01C21/20, G06T7/75
Inventor: 穆华, 谢嘉, 潘献飞
Owner: NAT UNIV OF DEFENSE TECH