Unmanned aerial vehicle autonomous vision landing method based on deep hybrid camera array

A camera-array and depth-camera technology applied in the field of unmanned aerial vehicles. It addresses the problems that, at medium and high altitudes, the ground QR-code image becomes too small for the UAV to match reliably, and that poor attitude and position calibration degrades the accuracy of pose estimation, so that existing methods cannot meet the requirements of high-precision UAV landing. The achieved effects are high-precision autonomous visual landing, improved visual positioning accuracy, and improved overall accuracy.

Pending Publication Date: 2022-04-29
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

This class of methods uses the airborne camera to recognize ground information carriers such as QR codes in order to locate the landing position and land. However, at medium and high altitudes the QR-code image captured by the UAV is too small to be matched reliably, and the attitude and position calibration is poor, which degrades the accuracy of pose estimation; as a result, such methods cannot meet the high-precision landing requirements of UAVs.



Examples


Embodiment 1

[0026] 1. Build a deep hybrid camera array

[0027] Build the deep hybrid camera array. In the present invention, two RGB-D depth cameras are combined to form a camera array, and their two (left and right) color cameras together form a binocular camera for stereo-matching distance measurement. Figure 1 is a schematic structural diagram of the entire deep hybrid camera array. The device can acquire two channels of active-light depth maps and two channels of left and right color images. The selected depth cameras support hard-triggered synchronous acquisition, which ensures that the captured images are synchronized and consistent. The whole device is mounted on a 3-axis self-stabilizing gimbal for stability, so that the cameras face vertically downward relative to the ground. The invention selects DJI Jingwei (Matrice) series UAVs as the flight platform; they have a high payload capacity and excellent flight performance. They adopt ...
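To illustrate the stereo-matching distance measurement performed by the left and right color cameras, the following is a minimal sketch using OpenCV's semi-global block matching. It assumes a rectified image pair and known calibration (focal length fx in pixels, baseline in meters); all parameter values and the function name are illustrative choices, not values specified by the patent.

```python
# Hedged sketch: depth from the left/right color pair via stereo matching.
# Assumes rectified images and known calibration; parameters are illustrative.
import cv2
import numpy as np

def stereo_depth(left_bgr, right_bgr, fx, baseline_m):
    """Estimate a depth map (meters) from a rectified color stereo pair."""
    left = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,      # search range in pixels, divisible by 16
        blockSize=5,
        P1=8 * 5 * 5,            # smoothness penalties for small/large
        P2=32 * 5 * 5,           # disparity changes
        uniquenessRatio=10,
        speckleWindowSize=100,
        speckleRange=2,
    )
    # OpenCV returns fixed-point disparity scaled by 16.
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = fx * baseline_m / disparity[valid]  # Z = f * B / d
    return depth
```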



Abstract

The invention relates to an autonomous visual landing method for unmanned aerial vehicles based on a deep hybrid camera array, and belongs to the technical field of unmanned aerial vehicles. The method combines the complementary advantages of the RGB-D depth camera, which is accurate in short-range depth estimation and works well in low-texture scenes, and the binocular color camera, whose stereo depth calculation covers a longer range. It thereby solves the problem of measuring height with a vision sensor at different altitudes and overcomes the limitations of a single camera. The invention further provides a virtual marker construction method, which solves the problem that a UAV at high altitude cannot be positioned because it cannot visually detect a tiny marker on the ground. A fusion map of multi-depth feature information is constructed from the three-dimensional points recovered from the monocular depth maps and the three-dimensional point cloud calculated by stereo matching of the binocular color images, and at medium and low altitudes the height information measured by the camera array is fused in, so that high-precision, all-altitude autonomous landing of the UAV using only vision sensors is realized.
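The abstract describes fusing the RGB-D depth, reliable at short range, with the stereo-matched depth, which covers a longer range, into one point cloud. Below is a hedged sketch of such range-dependent fusion; the range threshold, intrinsics layout, and helper names are assumptions for illustration and are not taken from the patent.

```python
# Hedged sketch of range-dependent depth fusion: trust the RGB-D depth map
# at close range, the stereo-matched depth beyond it, and merge the
# back-projected 3D points into one cloud. Threshold is an assumption.
import numpy as np

RGBD_MAX_RANGE_M = 8.0  # assumed reliable active-light range, not from the patent

def backproject(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) to an N x 3 point cloud, camera frame."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0
    x = (u[valid] - cx) * depth[valid] / fx
    y = (v[valid] - cy) * depth[valid] / fy
    return np.stack([x, y, depth[valid]], axis=-1)

def fuse_clouds(rgbd_depth, stereo_depth, intrinsics):
    """Keep RGB-D points inside the reliable range, stereo points beyond it."""
    near = np.where(rgbd_depth < RGBD_MAX_RANGE_M, rgbd_depth, 0.0)
    far = np.where(stereo_depth >= RGBD_MAX_RANGE_M, stereo_depth, 0.0)
    return np.vstack([backproject(near, *intrinsics),
                      backproject(far, *intrinsics)])
```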

Description

Technical Field

[0001] The invention relates to an autonomous visual-navigation landing method for unmanned aerial vehicles based on a deep hybrid camera array, and belongs to the technical field of unmanned aerial vehicles. It uses a camera array composed of RGB-D depth cameras to build a hybrid map of virtual marker points, achieving long-range, high-precision visual landing of drones.

Background Technique

[0002] The position and attitude information of the UAV is the core of its autonomous landing. For the UAV to land smoothly, it must sense the landing area from a long distance, adjust in real time based on its current position and attitude during descent, and finally complete the landing in the designated area. Commonly used landing schemes include: 1) manual control, in which an operator remotely controls the drone to land; this method generally requires the operator to be familiar with ...


Application Information

IPC(8): G05D1/06
CPC: G05D1/0676
Inventors: 李东东 (Li Dongdong), 杨涛 (Yang Tao), 王志军 (Wang Zhijun), 李烨 (Li Ye), 范婧慧 (Fan Jinghui), 宁雅佳 (Ning Yajia), 秦海栋 (Qin Haidong), 张芳冰 (Zhang Fangbing)
Owner: NORTHWESTERN POLYTECHNICAL UNIV