Aerial photography target tracking method fusing target saliency and online learning interference factors

A technique for online learning of interference factors and target saliency, applied in the field of computer vision, to achieve effective and reliable adaptive matching tracking and to speed up tracking

Inactive Publication Date: 2021-05-25
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

A recently successful model is the fully convolutional twin (Siamese) network tracking algorithm. Although it achieves good tracking accuracy and satisfies real-time requirements, it lacks an effective online update model to capture temporal variation of the target, background, or imaging conditions in the aerial scene.

Method used




Embodiment Construction

[0058] In this embodiment, an aerial photography target tracking method that fuses target saliency and online learning of interference factors, as shown in figure 1, proceeds in the following steps:

[0059] Step 1. Pre-train the general features of the fully convolutional twin network;

[0060] In this embodiment, a fully convolutional twin network is used, comprising 5 convolutional layers and 2 pooling layers, with each convolutional layer followed by a normalization layer and an activation function layer;
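The backbone described above (5 convolutional layers, 2 pooling layers, normalization and activation after each convolution) can be sketched as follows. The exact channel widths, kernel sizes, and strides are not given in this excerpt; the values below are assumptions following the AlexNet-like backbone common in SiamFC-style trackers, and the cross-correlation head is the standard twin-network matching step rather than this patent's specific formulation.

```python
import torch
import torch.nn as nn

class TwinBackbone(nn.Module):
    """Fully convolutional backbone: 5 conv layers and 2 pooling layers,
    each conv followed by batch normalization and ReLU, matching the layer
    counts stated in the embodiment. Channel widths and kernel sizes are
    assumed (AlexNet-like, as in SiamFC)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 96, 11, stride=2), nn.BatchNorm2d(96), nn.ReLU(),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(96, 256, 5), nn.BatchNorm2d(256), nn.ReLU(),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(256, 384, 3), nn.BatchNorm2d(384), nn.ReLU(),
            nn.Conv2d(384, 384, 3), nn.BatchNorm2d(384), nn.ReLU(),
            nn.Conv2d(384, 256, 3), nn.BatchNorm2d(256), nn.ReLU(),
        )

    def forward(self, x):
        return self.features(x)

def response_map(backbone, exemplar, search):
    """The two branches share weights: the same backbone embeds both the
    exemplar (target) patch and the search-region patch, and a
    cross-correlation of the two embeddings yields the response map."""
    z = backbone(exemplar)              # exemplar embedding, e.g. (1, 256, 6, 6)
    x = backbone(search)                # search embedding, e.g. (1, 256, 22, 22)
    return nn.functional.conv2d(x, z)   # correlation implemented as convolution
```

With SiamFC-style 127x127 exemplar and 255x255 search crops, this sketch produces a 17x17 response map whose peak locates the target in the search region.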

[0061] Step 1.1. Obtain a tagged aerial data set containing multiple video sequences, each of which contains multiple frames. From a selected video sequence, extract any i-th frame image and pair it with any image among the adjacent T frames to form a sample pair; the randomly selected images in the current video sequence thus form several sample pairs, which together constitute the training data set;
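Step 1.1 can be sketched as below. The frame window T and the number of pairs per sequence are not specified in this excerpt, so the defaults here are placeholders; `video` stands for one annotated sequence (a list of frames or frame identifiers).

```python
import random

def build_sample_pairs(video, T=100, num_pairs=50):
    """Form training sample pairs from one video sequence: pick a frame i
    at random, then pick any frame j within the adjacent T frames, so each
    pair shows the same target under nearby temporal conditions.
    T and num_pairs are assumed values."""
    pairs = []
    n = len(video)
    for _ in range(num_pairs):
        i = random.randrange(n)
        lo, hi = max(0, i - T), min(n - 1, i + T)   # clamp window to sequence
        j = random.randrange(lo, hi + 1)
        pairs.append((video[i], video[j]))
    return pairs
```

Repeating this over every sequence in the aerial data set yields the full training set of exemplar/search pairs.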

[0062] Step 1.2, using...



Abstract

The invention discloses an aerial photography target tracking method fusing target saliency and online learning of interference factors. The method comprises the steps of: selecting the most effective feature channels from a pre-trained convolutional network according to the back-propagated gradient to generate aerial intelligent sensing features, which highlights the features of the aerial target while increasing tracking speed by greatly reducing the number of feature channels; and making full use of the abundant context information of the continuous video to guide the target appearance model of the current frame to learn the interference factors of the dynamic target online, so that the influence of significant appearance changes in aerial imagery is suppressed and reliable adaptive matching tracking is realized. The gap between the pre-trained classification depth model and target tracking in a specific aerial photography scene is thereby reduced, and the online adaptive capacity of the model is improved, so that the real-time tracking requirement of aerial video is met.
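The abstract's channel-selection step (keep only the feature channels that receive the strongest back-propagated gradient) can be sketched as follows. The patent does not give the scoring rule in this excerpt; scoring a channel by the mean absolute gradient over its spatial positions is an assumed choice, and the loss, layer, and `k` are placeholders.

```python
import torch

def select_channels(feature_map, loss, k=96):
    """Rank the channels of `feature_map` (shape N, C, H, W, with
    requires_grad=True) by the magnitude of the loss gradient flowing back
    to them, and keep the top-k. Mean-absolute-gradient scoring per channel
    is an assumption, not the patent's stated rule."""
    grad, = torch.autograd.grad(loss, feature_map, retain_graph=True)
    scores = grad.abs().mean(dim=(0, 2, 3))   # one score per channel
    keep = torch.topk(scores, k).indices      # indices of the best channels
    return feature_map[:, keep], keep
```

Keeping only k of the original C channels makes the later cross-correlation proportionally cheaper, which is how the abstract's claimed speed-up arises.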

Description

technical field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to an aerial photography target tracking method based on a twin network fusing saliency and online learning of interference. 

Background technique

[0002] With the rapid development of drones and computer vision, intelligent target tracking systems based on drones are widely used in fields such as target monitoring and military anti-terrorism reconnaissance. Aerial video is characterized by a large amount of information, complex backgrounds, an uncertain field of view, and small tracking targets. However, existing target tracking algorithms have not been fully designed and optimized for these characteristics. Therefore, achieving robust, real-time tracking in aerial video remains a huge challenge. 

[0003] The existing mainstream target tracking algorithms are all based on deep learning, and they are mainly divided into two categories: t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/04, G06N3/08, G06V20/13, G06V20/41, G06V20/46, G06V20/48, G06F18/241, G06F18/214
Inventor: 孙锐, 方林凤, 梁启丽, 张旭东
Owner: HEFEI UNIV OF TECH