
An end-to-end estimation method for UAV landing space position and attitude

A technology relating to spatial position estimation for unmanned aerial vehicles, applied in neural learning methods, navigation calculation tools, biological neural network models, etc., and addressing the influence of factors such as changes in viewing angle.

Active Publication Date: 2021-03-02
NAT UNIV OF DEFENSE TECH
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

At present, traditional methods such as binocular ranging and PnP problem solving estimate the spatial position and attitude of a target from two-dimensional images. These methods typically require multiple independent serial stages (UAV target detection, point/line/surface feature extraction, and pose calculation), and their accuracy is easily affected by factors such as illumination, background, and viewing-angle changes, so their robustness is deficient.
At present, although existing methods that estimate UAV pose end-to-end with deep neural networks have improved environmental robustness over traditional methods, they use only a single frame of the UAV image and ignore the time-domain dependence contained in the UAV landing image sequence, so their pose-estimation accuracy remains limited.
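To illustrate why the time-domain dependence matters, here is a minimal sketch (not the patented network; the `ema_poses` helper, the smoothing factor, and the noisy pose values are all hypothetical): even a simple exponential moving average over per-frame position estimates suppresses the frame-to-frame jitter that single-frame methods are prone to.

```python
def ema_poses(pose_sequence, alpha=0.6):
    """Exponential moving average over per-frame pose vectors.

    pose_sequence: list of (x, y, z) position estimates, one per frame.
    alpha: weight of the newest frame; lower values smooth more strongly.
    """
    smoothed = [pose_sequence[0]]
    for pose in pose_sequence[1:]:
        prev = smoothed[-1]
        smoothed.append(tuple(alpha * p + (1 - alpha) * q
                              for p, q in zip(pose, prev)))
    return smoothed

# Hypothetical noisy single-frame estimates of a UAV descending along z:
raw = [(0.0, 0.0, 10.2), (0.1, -0.1, 9.6), (-0.1, 0.1, 9.3), (0.0, 0.0, 8.8)]
smooth = ema_poses(raw)
```

A learned temporal model (as the invention proposes) goes further than such a fixed filter by letting the network itself discover how consecutive frames constrain the landing trajectory.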

Method used


Image

  • An end-to-end estimation method for UAV landing space position and attitude

Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0041] The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0042] It should be noted that all directional indications (such as up, down, left, right, front, back...) in the embodiments of the present invention are only used to explain the relative positional relationship and movement of components in a specific posture (as shown in the accompanying drawings); if that posture changes, the directional indication changes accordingly.

[0043]In addition, the descriptions related to "first...


PUM

No PUM

Abstract

The invention discloses a method for end-to-end estimation of the spatial position and attitude of a landing UAV, comprising the following steps. Step 1: obtain the sequence of real-time images of the UAV landing captured by a ground-based camera and, exploiting the time-domain correlation between the frames of this sequence, estimate the pose of the UAV relative to the ground-based camera. Step 2: based on the transformation matrix between the ground-based camera and the world coordinate system, convert the pose of the UAV relative to the camera into its spatial pose in the world coordinate system. Relying on machine learning theory and drawing on the achievements of deep learning in computer vision, the method designs a deep neural network for the relative spatial pose of the UAV target, realizing a direct mapping from images to the UAV's spatial pose during landing. Compared with the traditional multi-module serial approach, this end-to-end direct estimation greatly improves the adaptability of pose estimation to environmental lighting, visual background, and observation angle during the landing process.
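Step 2 is a standard rigid-body coordinate transform. A minimal sketch, assuming a known 4x4 homogeneous camera-to-world matrix (here called `T_wc`; the extrinsic values below are illustrative, not taken from the patent):

```python
import numpy as np

def camera_pose_to_world(T_wc, p_c):
    """Map a UAV position from the camera frame to the world frame.

    T_wc: 4x4 homogeneous transform (camera frame -> world frame),
          i.e. the extrinsic calibration of the ground-based camera.
    p_c:  UAV position (x, y, z) expressed in the camera frame.
    """
    p_h = np.array([*p_c, 1.0])   # homogeneous coordinates
    return (T_wc @ p_h)[:3]

# Illustrative extrinsics: camera 1.5 m above the world origin,
# axes aligned with the world frame (identity rotation).
T_wc = np.eye(4)
T_wc[:3, 3] = [0.0, 0.0, 1.5]

p_world = camera_pose_to_world(T_wc, (2.0, 0.0, 10.0))
# p_world is approximately (2.0, 0.0, 11.5)
```

The full attitude transform works the same way: the UAV's rotation relative to the camera is left-multiplied by the rotation part of `T_wc` to express it in world coordinates.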

Description

Technical field

[0001] The invention relates to the technical field of autonomous landing of unmanned aerial vehicles (UAVs), in particular to a method for end-to-end estimation of a UAV's spatial position and attitude from ground-based visual images during landing.

Background technique

[0002] During autonomous take-off and landing, real-time acquisition of the UAV's own position and attitude from the global satellite positioning system and the inertial navigation system is currently the main means by which a UAV achieves autonomous positioning and attitude determination while landing. Considering that environmental factors such as magnetic fields and temperature can easily interfere with the airborne positioning system, relying on the airborne positioning system alone throughout the landing process cannot guarantee stable and accurate pose informa...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G01C21/20; G06N3/04; G06N3/08
CPC: G01C21/20; G06N3/049; G06N3/08; G06N3/045
Inventor: 唐邓清, 相晓嘉, 周晗, 常远, 闫超, 周勇, 黄依新, 兰珍, 李贞屹, 李子杏
Owner NAT UNIV OF DEFENSE TECH