Unmanned aerial vehicle autonomous landing method based on visual identification

A visual recognition technology for drones, applied in the field of UAVs. It addresses problems such as the inability to adaptively determine a threshold value, the lack of positioning algorithms designed separately for long and short ranges, and the inability to guarantee positioning accuracy, so as to reduce attitude error, achieve higher positioning accuracy, and ensure reliable positioning.

Status: Inactive
Publication Date: 2021-03-12
Applicant: 山东力聚机器人科技股份有限公司

AI Technical Summary

Problems solved by technology

[0006] Aiming at the deficiencies of the prior art, the present invention provides a method for the autonomous landing of UAVs based on visual recognition. The method sets fewer calibration parameters, which simplifies the calibration process while ensuring positioning accuracy; it can adaptively determine the segmentation threshold; and it designs two positioning algorithms, one for long range and one for short range, so that positioning accuracy is higher. It thereby solves the problems of existing visual-recognition-based UAV autonomous landing methods, which cannot simplify the calibration process, cannot adaptively determine the threshold value, do not design separate long-range and short-range positioning algorithms, and cannot guarantee positioning accuracy.


Image

  • Unmanned aerial vehicle autonomous landing method based on visual identification


Embodiment Construction

[0029]The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0030] Referring to Figure 1, the autonomous landing method for UAVs based on visual recognition includes binocular camera correction and calibration (1), landing landmark selection (2), close-range landing landmark positioning (3), and long-distance landing landmark positioning (4). The specific steps are as follows:

[0031] Binocular camera correction and calibration (1): by correcting and calibrating the cameras, any point in the world coordinate system can be converted to the pixel coor...
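The excerpt above is cut off, but the world-to-pixel conversion it refers to is the standard pinhole projection obtained from camera calibration. Below is a minimal sketch of that conversion, assuming illustrative intrinsic and extrinsic parameters; the matrix values, the function name, and the NumPy-based formulation are assumptions for illustration, not details taken from the patent.

```python
# Minimal sketch of the world-to-pixel conversion that calibration provides:
# p_pixel ~ K [R | t] P_world. All parameter values below are illustrative
# placeholders; in practice they come from calibrating the binocular cameras
# (e.g. with a chessboard target and OpenCV's calibrateCamera / stereoCalibrate).
import numpy as np

K = np.array([[800.0,   0.0, 320.0],   # assumed intrinsics: fx, fy, cx, cy
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                           # assumed rotation: camera axes aligned with world
t = np.zeros((3, 1))                    # assumed translation: camera at the world origin

def world_to_pixel(P_world):
    """Project a 3-D world point (in metres) to pixel coordinates (u, v)."""
    P = np.asarray(P_world, dtype=float).reshape(3, 1)
    p_cam = R @ P + t                   # world frame -> camera frame
    p_img = K @ p_cam                   # camera frame -> homogeneous image coordinates
    return p_img[0, 0] / p_img[2, 0], p_img[1, 0] / p_img[2, 0]

# e.g. a landmark corner 2 m in front of the camera
print(world_to_pixel([0.1, 0.05, 2.0]))   # -> (360.0, 260.0)
```

In a binocular setup the same relation holds for each camera, and the known baseline between the two calibrated cameras is what allows the landmark's depth, and hence its 3-D position, to be recovered by triangulation.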



Abstract

The invention relates to the field of unmanned aerial vehicles, and discloses an unmanned aerial vehicle autonomous landing method based on visual identification. The method comprises the steps of binocular camera correction and calibration, landing landmark selection, close-range landing landmark positioning and long-distance landing landmark positioning. With this method, the calibration process is simplified, the calibration precision is improved, and the positioning precision is guaranteed. Threshold segmentation is performed on the image using the maximum between-cluster variance method, so the threshold can be determined adaptively and the calculation is straightforward. Two algorithms, close-range landing landmark positioning and long-distance landing landmark positioning, are designed according to the distance between the unmanned aerial vehicle and the landing landmark, so the positioning precision during autonomous landing is higher and the risk of a crash caused by inaccurate positioning is reduced.
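The "maximum between-cluster variance method" named in the abstract is commonly known as Otsu's method. The sketch below shows how such an adaptive threshold could be obtained with OpenCV, assuming a grayscale image of the landing landmark; the file name and the contour-extraction follow-up are illustrative assumptions, not steps taken from the patent text.

```python
# Adaptive threshold segmentation of a landing-landmark image using the
# maximum between-cluster variance (Otsu) method named in the abstract.
# OpenCV chooses the threshold automatically when THRESH_OTSU is set.
import cv2

gray = cv2.imread("landing_landmark.png", cv2.IMREAD_GRAYSCALE)  # assumed input image

# The threshold argument (0) is ignored; Otsu picks the value that maximises
# the between-class variance of the foreground/background intensity histogram.
otsu_t, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# The binary mask can then feed whichever positioning branch applies
# (close-range or long-distance), e.g. by extracting candidate landmark regions.
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print(f"Otsu threshold: {otsu_t}, candidate regions: {len(contours)}")
```

Because the threshold is recomputed from each frame's intensity histogram, the segmentation adapts to lighting changes during descent, which is consistent with the abstract's claim that the threshold is determined adaptively.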

Description

technical field

[0001] The invention relates to the field of unmanned aerial vehicles, and in particular to a method for the autonomous landing of an unmanned aerial vehicle based on visual recognition.

Background technique

[0002] An unmanned aircraft is commonly referred to as an "unmanned aerial vehicle", abbreviated in English as "UAV". Compared with manned aircraft, drones are often better suited to tasks that are too "dull, dirty or dangerous". Military drones are divided into reconnaissance aircraft and target drones. For civilian use, "drone + industry application" is where the real demand lies. Drones are currently used in aerial photography, agriculture, plant protection, miniature selfies, express delivery, disaster relief, wildlife observation, infectious disease monitoring, surveying and mapping, news reporting, power line inspection, film and television shooting, romantic displays, and similar fields, and these applications have greatly expand...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G05D1/10
CPC: G05D1/101
Inventor: 赵越
Owner: 山东力聚机器人科技股份有限公司