
Autonomous landing method and system for unmanned aerial vehicle based on monocular vision and electronic device

A monocular-vision autonomous landing technology, applied in the field of visual navigation, addressing the problem of low positioning accuracy.

Inactive Publication Date: 2017-09-22
WUHAN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0006] Embodiments of the present invention provide a method, system and electronic device for autonomous landing of a UAV based on monocular vision, to solve the technical problem of low positioning accuracy in prior-art UAV landing methods.



Examples


Embodiment 1

[0023] This embodiment provides a method for autonomous landing of a UAV based on monocular vision. Referring to Figure 1, the method includes:

[0024] Step S101: Obtain a Gaussian pyramid image according to the pre-acquired aerial image;
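Step S101 can be sketched as standard Gaussian pyramid construction: blur, then subsample by 2 at each level. This is a minimal NumPy illustration, not the patent's implementation; the 5-tap kernel and the level count are assumptions.

```python
import numpy as np

# Classic 5-tap binomial kernel (sums to 1), applied separably.
KERNEL = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0

def gaussian_blur(img):
    """Separable Gaussian blur with edge padding; output keeps the input shape."""
    pad = len(KERNEL) // 2
    padded = np.pad(np.asarray(img, dtype=float), pad, mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, KERNEL, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, KERNEL, mode="valid"), 0, rows)

def gaussian_pyramid(img, levels=3):
    """Blur and subsample by 2 per level to build the coarse-to-fine image stack."""
    pyramid = [np.asarray(img, dtype=float)]
    for _ in range(levels - 1):
        pyramid.append(gaussian_blur(pyramid[-1])[::2, ::2])
    return pyramid
```

Processing the coarse levels first is what makes the coarse-scale Hough transform of step S103 cheap.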

[0025] Step S102: Perform edge detection on the Gaussian pyramid image to obtain an edge image;
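The excerpt does not say which edge detector step S102 uses; a Sobel gradient-magnitude map with a relative threshold is a simple stand-in (Canny would be the more typical choice in practice):

```python
import numpy as np

def sobel_edges(img, rel_thresh=0.5):
    """Binary edge map from Sobel gradient magnitude.
    A stand-in for whatever detector the patent actually uses."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    padded = np.pad(np.asarray(img, dtype=float), 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):                       # correlate with the 3x3 kernels
        for j in range(3):
            patch = padded[i:i + h, j:j + w]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    mag = np.hypot(gx, gy)
    peak = mag.max()
    return mag > rel_thresh * peak if peak > 0 else np.zeros_like(mag, dtype=bool)
```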

[0026] Step S103: Obtain a first straight line and a second straight line from the edge image by using a coarse-scale Hough transform method;
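Step S103's coarse-scale Hough transform can be sketched as rho-theta voting with deliberately coarse bins (2 degrees by 2 pixels here, assumed values — the patent does not give them); votes from the edge map concentrate on the dominant straight edges:

```python
import numpy as np

def hough_lines(edge_points, img_shape, n_theta=90, rho_res=2.0, n_lines=2):
    """Coarse rho-theta Hough voting. edge_points is an iterable of (y, x)
    pixel coordinates; lines are returned as (rho, theta) with
    x*cos(theta) + y*sin(theta) = rho."""
    h, w = img_shape
    diag = float(np.hypot(h, w))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    acc = np.zeros((int(2 * diag / rho_res) + 2, n_theta), dtype=int)
    for y, x in edge_points:
        rhos = x * cos_t + y * sin_t                 # signed distance for each theta
        bins = np.round((rhos + diag) / rho_res).astype(int)
        acc[bins, np.arange(n_theta)] += 1
    # Take the n_lines strongest bins (no non-maximum suppression in this sketch).
    order = np.argsort(acc.ravel())[::-1][:n_lines]
    return [(r * rho_res - diag, thetas[t]) for r, t in
            (divmod(idx, n_theta) for idx in order)]
```

Because the bins are coarse, the recovered lines are only approximate — which is exactly why step S104 refines them afterwards.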

[0027] Step S104: Obtain a third straight line and a fourth straight line by applying a selective iterative random sampling consensus algorithm to the first straight line and the second straight line;
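The "selective iterative" variant of random sampling consensus is not specified in this excerpt; a plain RANSAC line fit (sample two points, count inliers, refit on the best consensus set) shows the shape of step S104:

```python
import numpy as np

def ransac_line(points, n_iters=200, tol=1.0, seed=0):
    """Plain RANSAC line fit; the patent's 'selective iterative' refinements
    (how samples and inliers are chosen per iteration) are not reproduced here."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best = np.zeros(len(pts), dtype=bool)
    for _ in range(n_iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        d = pts[j] - pts[i]
        norm = np.hypot(d[0], d[1])
        if norm < 1e-9:
            continue
        u = d / norm
        diff = pts - pts[i]
        dist = np.abs(diff[:, 0] * u[1] - diff[:, 1] * u[0])  # perpendicular distance
        inliers = dist < tol
        if inliers.sum() > best.sum():
            best = inliers
    # Refit the winning consensus set by total least squares (PCA via SVD).
    sel = pts[best]
    centroid = sel.mean(axis=0)
    _, _, vt = np.linalg.svd(sel - centroid)
    return centroid, vt[0], best     # point on line, unit direction, inlier mask
```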

[0028] Step S105: Obtain the landing parameters of the UAV according to the third straight line and the fourth straight line, so that the UAV can land autonomously.
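The excerpt does not enumerate the "landing parameters" of step S105; a plausible pair is heading error and lateral offset relative to the runway centerline. This sketch assumes the two refined lines arrive in (rho, theta) form with similar theta (roughly parallel runway edges; no angle wrap-around handling):

```python
import numpy as np

def landing_parameters(edge1, edge2, img_center):
    """Hypothetical landing parameters from two runway-edge lines in
    (rho, theta) form, with x*cos(theta) + y*sin(theta) = rho.
    Assumes both edges have similar theta (no wrap-around handling)."""
    (r1, t1), (r2, t2) = edge1, edge2
    t_c = 0.5 * (t1 + t2)          # centerline direction
    r_c = 0.5 * (r1 + r2)          # centerline offset
    heading_error = t_c            # 0 rad when the runway runs straight up the image
    cx, cy = img_center
    # Signed distance (pixels) of the camera axis from the centerline.
    lateral_offset = cx * np.cos(t_c) + cy * np.sin(t_c) - r_c
    return heading_error, lateral_offset
```

Feeding such quantities to the flight controller as yaw and lateral-velocity corrections is one common scheme; the patent's actual parameter set may differ.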

[0029] It should be noted that, in this application, Gaussian pyramid images are first obtained based on pre-acquired aerial i...

Embodiment 2

[0070] Based on the same inventive concept as Embodiment 1, Embodiment 2 of the present invention provides a system for autonomous landing of a UAV based on monocular vision. Referring to Figure 2, the system includes:

[0071] The first obtaining module 201 is used to obtain a Gaussian pyramid image according to a pre-acquired aerial image;

[0072] The second obtaining module 202 is configured to perform edge detection on the Gaussian pyramid image to obtain an edge image;

[0073] A third obtaining module 203, configured to obtain the first straight line and the second straight line from the edge image by using a coarse-scale Hough transform method;

[0074] A fourth obtaining module 204, configured to obtain a third straight line and a fourth straight line by applying a selective iterative random sampling consensus algorithm to the first straight line and the second straight line;

[0075] The fifth obtaining module 205 is configured to obtain the landing parameters o...
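The five modules (201–205) mirror steps S101–S105 one-to-one; a minimal way to express that structure is a pipeline that chains one callable per module (the class and interfaces below are illustrative assumptions, not from the patent):

```python
class MonocularLandingSystem:
    """Chains the five modules (201-205) into one aerial-image -> parameters
    pipeline. Each stage is any callable; real stages would wrap the
    image-processing steps of Embodiment 1."""

    def __init__(self, pyramid, edge, hough, ransac, params):
        self.stages = [pyramid, edge, hough, ransac, params]

    def run(self, aerial_image):
        data = aerial_image
        for stage in self.stages:
            data = stage(data)          # output of each module feeds the next
        return data                     # the landing parameters
```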

Embodiment 3

[0100] Based on the same inventive concept as Embodiment 1, Embodiment 3 of the present invention provides an electronic device. Referring to Figure 6, the device comprises a memory 301, a processor 302, and a computer program stored in the memory 301 and executable on the processor 302; when executing the program, the processor 302 implements the following steps:

[0101] Obtain a Gaussian pyramid image based on the pre-acquired aerial image;



Abstract

The invention discloses an autonomous landing method and a system for an unmanned aerial vehicle based on monocular vision, and an electronic device. The method comprises the steps: according to an aerial image acquired in advance, a Gaussian pyramid image is acquired; edge detection is carried out on the Gaussian pyramid image to acquire an edge image; a coarse-scale Hough transform method is adopted to acquire a first straight line and a second straight line from the edge image; a selective iterative random sampling consensus algorithm is applied to the first straight line and the second straight line to acquire a third straight line and a fourth straight line; and according to the third straight line and the fourth straight line, landing parameters of the unmanned aerial vehicle are acquired to enable the unmanned aerial vehicle to perform autonomous landing. The method solves the technical problem of low positioning accuracy in prior-art unmanned aerial vehicle landing methods.

Description

Technical Field

[0001] The present invention relates to the technical field of visual navigation, in particular to a monocular vision-based autonomous landing method, system and electronic device for an unmanned aerial vehicle.

Background Technique

[0002] In the field of military aviation, UAVs with autonomous landing capabilities are currently a research hotspot. Autonomous landing of UAVs places high requirements on the accuracy, speed and reliability of navigation. At present, autonomous landing navigation of UAVs is mainly divided into two categories: navigation based on satellite GPS and navigation based on vision. GPS-based navigation is easy to use, but it can become completely ineffective in wartime; vision-based autonomous navigation reduces dependence on external signals during the landing process, making UAVs more autonomous in landing.

[0003] In the prior art, the UAV autonomous landing technology is mainly based on the detection ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T 7/13; G01C 11/04; G01P 13/02
CPC: G06T 7/13; G01C 11/04; G01P 13/025; G06T 2207/10004; G06T 2207/20061
Inventors: 张俊勇, 伍世虔, 宋运莲, 陈鹏, 张琴
Owner: WUHAN UNIV OF SCI & TECH