
Vision-based autonomous unmanned plane landing guidance device and method

A technology for an autonomous landing guidance device, applied in the fields of measurement devices, navigation, and surveying and mapping. It addresses the problems of long-range detection of weak, small infrared targets and the inability of existing systems to obtain the attitude, speed, and acceleration of the UAV.

Active Publication Date: 2014-12-17
NORTHWESTERN POLYTECHNICAL UNIV
Cites: 5 · Cited by: 45

AI Technical Summary

Problems solved by technology

However, this system mainly uses a single-frame target-weight calculation method to detect and track moving targets against complex backgrounds. Although the target weights it computes on color or grayscale images broadly agree with human vision, it is limited by the infrared LED's detection range of no more than 200 meters, so long-range detection of weak, small infrared targets remains a serious problem. Moreover, although the system can accurately obtain the aircraft's position relative to the runway in real time, it cannot obtain the UAV's attitude, speed, or acceleration, and therefore cannot accurately guide a vision-based autonomous landing of the UAV.

Method used



Examples


Embodiment Construction

[0038] The present invention is now further described in conjunction with an embodiment and the accompanying drawings:

[0039] A vision-based autonomous landing guidance device for unmanned aerial vehicles, comprising measurement cameras, a visible-light flashlight, a total station, a cooperative marker light, a tripod, a prism, and a computer. There are four measurement cameras, each a PointGrey Flea3-FW-03S1C/M-C high-frame-rate measurement camera with a 1/4-inch CCD sensor, a frame rate of up to 120 Hz, a resolution of 640×480, a camera body of 3 cm × 3 cm × 6 cm, and a camera base of 1 cm × 9 cm × 11 cm. Their installation locations are shown in Figures 1-2: two of the measurement cameras are fitted with 12 mm long-focus lenses for long-range detection and positioning of the airborne UAV target, and two are fitted with 8 mm short-focus lenses for precise taxiing positioning after the UAV enters the runway area. The four measurement cameras are divided into two groups, each group including a long-focus measurement camera...
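As the abstract states, each calibrated camera pair recovers the beacon's three-dimensional position by binocular stereo vision measurement. The following is a minimal Python sketch of the standard linear (DLT) triangulation step under that assumption; the function name, its arguments, and the use of NumPy are illustrative choices, not details taken from the patent.

```python
import numpy as np

def triangulate(P_left, P_right, uv_left, uv_right):
    """Linear (DLT) triangulation of one beacon observation.

    P_left, P_right : 3x4 projection matrices of an off-line calibrated camera pair
    uv_left, uv_right : (u, v) pixel coordinates of the beacon in each image
    Returns the 3-D point in the common world (runway) frame.
    """
    (u1, v1), (u2, v2) = uv_left, uv_right
    # Each observation contributes two rows to the homogeneous system A X = 0.
    A = np.vstack([
        u1 * P_left[2]  - P_left[0],
        v1 * P_left[2]  - P_left[1],
        u2 * P_right[2] - P_right[0],
        v2 * P_right[2] - P_right[1],
    ])
    # Least-squares solution: right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenise to metric coordinates
```

In this layout, the long-focus pair's projection matrices would be used during the approach and the short-focus pair's after the UAV enters the runway area, each pair having been calibrated off-line over the large-scale scene.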



Abstract

The invention relates to a vision-based autonomous unmanned plane landing guidance device and method. The method comprises the following steps: measurement cameras that have been calibrated off-line and installed on both sides of the runway detect, in real time, a strong-light identification lamp mounted at the front of the unmanned plane once it enters the autonomous landing guidance runway; the four cameras, precisely calibrated over the large-scale scene, acquire the three-dimensional spatial position of the unmanned plane by binocular stereo vision measurement, so that the plane is tracked and positioned and its position, speed, and other flight parameters are obtained in real time; the flight parameters are then transmitted to the flight control system over a wireless data link, and the flight control system adjusts them according to the current status of the unmanned plane to keep its flight stable, thereby accurately realizing autonomous landing of the unmanned plane.
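The processing chain summarized in the abstract (detect the strong-light lamp in each image, triangulate, track, derive speed, hand the parameters to the flight control system over the wireless link) can be sketched as below. All names, thresholds, and the finite-difference speed estimate are hypothetical simplifications, not the patent's algorithm.

```python
import numpy as np

def beacon_centroid(gray, threshold=240):
    """Centroid of the brightest pixels -- a simplified stand-in for the
    patent's strong-light identification-lamp detector."""
    ys, xs = np.nonzero(gray >= threshold)
    if xs.size == 0:
        return None                              # beacon not visible this frame
    return np.array([xs.mean(), ys.mean()])      # (u, v) in pixels

def update_track(track, new_fix, t):
    """Append a 3-D fix (from triangulation) and return (position, velocity)
    using a finite difference over the last two fixes."""
    track.append((t, np.asarray(new_fix, dtype=float)))
    if len(track) < 2:
        return track[-1][1], np.zeros(3)
    (t0, p0), (t1, p1) = track[-2], track[-1]
    return p1, (p1 - p0) / (t1 - t0)
```

The resulting position and speed would then be packaged and sent over the wireless data link; that transmission step is system-specific and is not sketched here.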

Description

Technical field

[0001] The invention relates to a vision-based autonomous landing guidance device and guidance method for UAVs, and in particular to a UAV autonomous landing guidance method based on an optimized layout of long-focus and short-focus cameras.

Background technique

[0002] UAV autonomous landing refers to the process in which a UAV uses navigation equipment and a flight control system for positioning and navigation and finally controls the UAV to descend and land. Autonomous landing places high demands on navigation and control accuracy and reliability, and is an important foundation and key technology for autonomous UAV flight. Traditional navigation technologies for UAV autonomous landing include the Inertial Navigation System (INS), the GPS navigation system, and the INS/GPS integrated navigation system. The inertial navigation system uses inertial components such as gyroscopes and accelerometers to sense the acceleration of the vehicle during motion, and then obtains nav...
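The background paragraph is truncated, but the INS principle it begins to describe, sensing acceleration and integrating it to recover velocity and position, can be illustrated with a naive dead-reckoning sketch. The function, its arguments, and the omission of attitude and gravity handling are assumptions made for brevity; the double integration also shows why INS errors drift over time, since any accelerometer bias is integrated twice.

```python
import numpy as np

def dead_reckon(accels, dt, v0=None, p0=None):
    """Twice integrate navigation-frame acceleration samples (gravity assumed
    already removed) over fixed time steps dt to obtain velocity and position."""
    v = np.zeros(3) if v0 is None else np.asarray(v0, dtype=float)
    p = np.zeros(3) if p0 is None else np.asarray(p0, dtype=float)
    for a in np.asarray(accels, dtype=float):
        v = v + a * dt        # first integration: acceleration -> velocity
        p = p + v * dt        # second integration: velocity -> position
    return p, v
```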

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/00
CPC: G01C21/00
Inventors: 张艳宁, 杨涛, 陈挺, 余瑞, 张晓强, 冉令燕, 卓涛
Owner: NORTHWESTERN POLYTECHNICAL UNIV