
Moving target tracking method based on template matching and deep classification network

A template matching and deep classification technique in the field of image processing. It addresses the problems that the tracking model cannot continue to track the target accurately, that long-term accurate tracking cannot be achieved, and that the classifier's recognition ability is not strong enough, achieving the effect of accurately tracking the target.

Active Publication Date: 2019-07-19
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

The disadvantage of this method is that the weighted result of the target template and the background template is used as the confidence value, which fails to reflect fluctuations in the response of the target to be tracked; moreover, the recognition ability of the trained classifier is not strong enough, so long-term accurate tracking cannot be achieved when the target moves quickly.
The disadvantage of this method is that the sample type of each local region of the image must be determined by setting a threshold. When the target to be tracked is heavily occluded, target samples or background samples will be misclassified, so the updated model cannot continue to track the target accurately.

Method used




Embodiment Construction

[0036] The embodiments and effects of the present invention will be further described below in conjunction with the accompanying drawings.

[0037] Referring to Figure 1, the concrete steps of the present invention are as follows.

[0038] Step 1, build a dual residual deep classification network model.

[0039] 1.1) Set up the front-end network:

[0040] Adjust the input-layer parameters of two existing deep residual neural networks (ResNet50): set the input layer of the first network to 224×224×3 and the input layer of the second network to 448×448×3, leaving the parameters of all other layers unchanged. These two deep residual neural networks serve as the front-end network of the dual residual deep classification network model;

[0041] 1.2) Set up the backend network:

[0042] Build two three-layer fully connected networks as the back-end network of the dual residual deep classification network model;
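To make the back-end concrete, here is a minimal numpy sketch of a three-layer fully connected network operating on pooled front-end features. The feature dimension (2048, the pooled output width of ResNet50), the hidden sizes, and the two-class output (target vs. background) are assumptions for illustration; the patent does not specify these values.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def make_fc_layer(rng, n_in, n_out):
    # He initialisation, a common choice for ReLU layers
    w = rng.standard_normal((n_in, n_out)) * np.sqrt(2.0 / n_in)
    b = np.zeros(n_out)
    return w, b

def fc_backend_forward(x, layers):
    """Forward pass of a three-layer fully connected back-end."""
    for i, (w, b) in enumerate(layers):
        x = x @ w + b
        if i < len(layers) - 1:  # no ReLU after the final (classification) layer
            x = relu(x)
    return x

rng = np.random.default_rng(0)
# Hypothetical layer widths: 2048-d pooled ResNet50 features in,
# two logits (target / background) out.
sizes = [2048, 512, 64, 2]
layers = [make_fc_layer(rng, a, b) for a, b in zip(sizes[:-1], sizes[1:])]

features = rng.standard_normal((1, 2048))  # one pooled feature vector
logits = fc_backend_forward(features, layers)
print(logits.shape)  # (1, 2)
```

In the dual-network setup described above, two such back-ends would be attached, one behind each ResNet50 front-end.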



Abstract

The invention discloses a moving target tracking method based on template matching and a deep classification network, mainly solving the prior-art problems of low target detection speed and inaccurate tracking when the target undergoes appearance deformation or occlusion. The implementation scheme comprises the following steps: 1) establish a dual-residual deep classification network and train it; 2) extract a template network and a detection network from the dual-residual deep classification network; 3) extract template features using the template network; 4) extract detection features using the detection network; 5) perform template matching of the template features over the detection features to obtain a template matching map; 6) determine the target position from the template matching map; 7) update the template features according to the target position; 8) judge whether the current frame is the last frame: if yes, end target tracking; otherwise, take the updated template features as the template features for the next frame and return to step 4). The method has high tracking speed and high accuracy, and is used for tracking video targets with severe deformation and illumination change.
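Steps 5) and 6) above amount to sliding the template feature map over the detection feature map and taking the peak of the response. A minimal numpy sketch, using normalised cross-correlation as the matching score (the patent does not specify the exact score; the feature-map sizes here are arbitrary illustrations):

```python
import numpy as np

def template_match(det_feat, tmpl_feat):
    """Slide the template over the detection feature map; return a response
    map of cosine similarities between the template and each patch."""
    H, W, C = det_feat.shape
    h, w, _ = tmpl_feat.shape
    t = tmpl_feat / (np.linalg.norm(tmpl_feat) + 1e-12)
    resp = np.zeros((H - h + 1, W - w + 1))
    for i in range(resp.shape[0]):
        for j in range(resp.shape[1]):
            patch = det_feat[i:i + h, j:j + w, :]
            resp[i, j] = np.sum(patch * t) / (np.linalg.norm(patch) + 1e-12)
    return resp

rng = np.random.default_rng(1)
det = rng.standard_normal((14, 14, 8))   # detection feature map (H, W, C)
tmpl = det[4:8, 6:10, :].copy()          # plant the template inside the map
resp = template_match(det, tmpl)
i, j = np.unravel_index(np.argmax(resp), resp.shape)
print(i, j)  # peak at the planted location (4, 6)
```

The target position in the frame is then recovered by mapping the peak coordinates of the response map back through the network's spatial stride.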

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and further relates to a moving target tracking method that can be used for tracking video targets exhibiting severe deformation, camera shake, scale change, and illumination change.

Background Technique

[0002] The main task of moving target tracking is to learn a tracker when only the initial-frame information of the target to be tracked is known, so that the tracker can accurately predict the position of the target in the subsequent frames of the video sequence. With the continuous deepening of understanding in the field of computer vision, moving target tracking has been widely applied and developed. As deep learning has been applied to image classification and image segmentation, deep learning methods have gradually been applied to target tracking as well. Compared with manual feature extraction methods ...

Claims


Application Information

IPC(8): G06T7/20
CPC: G06T7/20; G06T2207/10016; G06T2207/20081; G06T2207/20084
Inventor: 田小林, 李芳, 李帅, 李娇娇, 荀亮, 贾楠
Owner XIDIAN UNIV