Unmanned aerial vehicle autonomous navigation landing visual target tracking method

An unmanned aerial vehicle visual autonomy technology, applicable to instruments, character and pattern recognition, computer components, etc., which addresses the problem of unreliable tracking of ground targets

Inactive Publication Date: 2012-10-10
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0025] During the autonomous vision-guided landing stage of an unmanned aerial vehicle, because the airborne camera moves at high speed relative to the target, the position and shape of the target in adjacent frames of the real-time video change significantly, while the original...



Examples


Example Embodiment

[0057] The present invention will be described in detail below with reference to the drawings and embodiments.

[0058] Step 1. The airborne camera collects a template image of the landing target point and performs affine illumination normalization on it, yielding the normalized template gray value I_norm(x) at each pixel, where x denotes the pixel coordinates in the template image;
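The patent does not spell out the exact normalization formula, so the following is only a minimal sketch, assuming "affine illumination normalization" means a gain-and-bias (zero-mean, unit-variance) correction of the grayscale template; the function name and the epsilon guard are illustrative.

```python
import numpy as np

def affine_normalize(template: np.ndarray) -> np.ndarray:
    """Gain-and-bias (affine) illumination normalization of a grayscale template.

    Maps I(x) to I_norm(x) = (I(x) - mean) / std, removing a global brightness
    offset and contrast scale. This is one common reading of the affine
    illumination normalization in step 1; treat it as an assumption, not the
    patent's exact formula.
    """
    img = template.astype(np.float64)
    return (img - img.mean()) / (img.std() + 1e-12)  # epsilon guards a flat patch
```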

[0059] For a video sequence, target motion can be regarded as motion through the two spatial dimensions of the image plane plus time. The gray value of the real-time input image at pixel x = (x, y) at time t is therefore denoted I(x, t). The real-time video image at some moment t_x is selected as the reference image. The target region selected in the reference image is represented by a point set {x_1, x_2, ..., x_N} containing N elements, and the corresponding gray values form the column vector I(x, t_x) = [I(x_1, t_x), I(x_2, ...
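As an illustration of how the sparse gray-value vector of paragraph [0059] can be assembled, the sketch below samples a grayscale reference image at a point set and stacks the values into a column vector; following the abstract, the point set is taken from FAST corners inside the target region. The file name, FAST threshold and helper name are assumptions made for this example only.

```python
import cv2
import numpy as np

def sparse_gray_vector(image: np.ndarray, points: np.ndarray) -> np.ndarray:
    """Stack the gray values I(x_1), ..., I(x_N) at an (N, 2) integer point set
    into the N x 1 column vector described in paragraph [0059]."""
    xs, ys = points[:, 0], points[:, 1]
    return image[ys, xs].astype(np.float64).reshape(-1, 1)

# Illustrative usage (hypothetical file name): use FAST corners of the
# selected target region as the sparse point set, as the abstract suggests.
ref = cv2.imread("reference_frame.png", cv2.IMREAD_GRAYSCALE)
fast = cv2.FastFeatureDetector_create(threshold=30)
pts = np.array([kp.pt for kp in fast.detect(ref, None)], dtype=int)
vec = sparse_gray_vector(ref, pts)  # column vector of N gray values
```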



Abstract

The invention discloses an unmanned aerial vehicle autonomous navigation and landing visual target tracking method. First, the motion amplitude of the target between two consecutive frames is divided into several resolution levels in coarse-to-fine order; prior motion at the different levels is simulated through offline training, and the corresponding prior error Jacobian matrices are calculated. Because the computation of each level's Jacobian matrix incorporates the prior knowledge gained in training, the algorithm can effectively escape local optima during the iterative target search and thus avoid tracking failure. The target is described by sparse features of the template image target region, namely the gray values at FAST corner points. Compared with the traditional Lucas-Kanade algorithm, which typically uses a dense representation built from all pixels of the target region, the complexity of the proposed algorithm is greatly reduced.
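To make the abstract's offline-training-plus-coarse-to-fine-search idea concrete, here is a deliberately simplified Python/NumPy sketch of a learned-Jacobian tracker restricted to pure 2-D translation. The per-level amplitudes, sample counts, sign conventions and function names are illustrative assumptions; the patent's inverse compositional formulation over richer motion models is not reproduced here.

```python
import numpy as np

def train_prior_jacobian(template, pts, amplitude, n_samples=500, rng=None):
    """Offline training of one level's prior 'error Jacobian' (translation-only sketch).

    Random translations up to `amplitude` pixels are applied to the sparse point
    set; a linear map A is fit so that A @ (perturbed grays - template grays)
    approximates the applied translation. This mirrors the per-level offline
    training described in the abstract, simplified to 2-D translation.
    """
    rng = rng or np.random.default_rng(0)
    h, w = template.shape
    t0 = template[pts[:, 1], pts[:, 0]].astype(np.float64)
    errs, params = [], []
    for _ in range(n_samples):
        dp = rng.uniform(-amplitude, amplitude, size=2)        # simulated prior motion
        q = (pts + dp).round().astype(int)
        q[:, 0] = q[:, 0].clip(0, w - 1)
        q[:, 1] = q[:, 1].clip(0, h - 1)
        errs.append(template[q[:, 1], q[:, 0]] - t0)           # gray-value error vector
        params.append(dp)
    X, *_ = np.linalg.lstsq(np.array(errs), np.array(params), rcond=None)
    return X.T                                                 # 2 x N prior Jacobian

def track_coarse_to_fine(frame, template, pts, jacobians, iters=5):
    """Iterative search, applying the coarsest (largest-amplitude) predictor first."""
    h, w = frame.shape
    t0 = template[pts[:, 1], pts[:, 0]].astype(np.float64)
    p = np.zeros(2)                                            # translation estimate
    for A in jacobians:                                        # coarse -> fine levels
        for _ in range(iters):
            q = (pts + p).round().astype(int)
            q[:, 0] = q[:, 0].clip(0, w - 1)
            q[:, 1] = q[:, 1].clip(0, h - 1)
            err = frame[q[:, 1], q[:, 0]] - t0
            p -= A @ err                                       # prior-Jacobian update
    return p
```

A coarse-to-fine schedule could, for instance, train one predictor per amplitude level, jacobians = [train_prior_jacobian(tpl, pts, a) for a in (30, 10, 3)], and call track_coarse_to_fine(frame, tpl, pts, jacobians) on each new frame; the precomputed per-level matrices are what let large inter-frame motion be absorbed without the iteration stalling in a local optimum. Amplitude values here are purely illustrative.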

Description

technical field

[0001] The invention relates to an inverse compositional target tracking method based on a multi-resolution motion prior, which is especially suitable for stable tracking of targets during visual autonomous guidance and landing of unmanned aerial vehicles, and belongs to the field of digital image processing.

Background technique

[0002] Visual autonomous landing of UAVs is a hot topic in UAV control research. It uses digital image processing technology to obtain positioning parameters and offers many advantages, such as simple equipment, low cost and a large amount of acquired information; compared with GPS and inertial navigation, it is completely autonomous and passive. Rapid and stable matching and tracking between the template image of the scheduled landing site and the airborne real-time image is the prerequisite for precise landing control. During the landing process of the UAV, the image to be matched often exhibits rotation, scale and viewing...

Claims


Application Information

IPC(8): G06K9/00, G06K9/60
Inventors: 郑智辉, 汪渤, 高志峰, 周志强, 董明杰, 石永生, 沈军, 李笋, 王海螺
Owner: BEIJING INSTITUTE OF TECHNOLOGY