Residual depth feature target tracking method for drift detection

A deep feature and target tracking technology, applied in the fields of image processing and computer vision, which can solve problems such as tracking speed that cannot satisfy real-time tracking, real-time tracking performance that does not meet requirements well, and other conditions unfavorable to target tracking.

Active Publication Date: 2018-12-07
NANJING UNIV OF INFORMATION SCI & TECH
AI Technical Summary

Problems solved by technology

In recent years, the most representative work in target tracking has been the kernelized correlation filter algorithm proposed by J. F. Henriques et al. (Henriques J F, Caseiro R, Martins P, et al. High-Speed Tracking with Kernelized Correlation Filters [J]. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2014, 37(3): 583-596). Its very high tracking speed showed that real-time tracking is possible, but the traditional hand-crafted features it uses lead to poor tracking performance.
In the same year, the deep convolutional neural network proposed by K. Simonyan et al. (Simonyan K, Zisserman A. Very Deep Convolutional Networks for Large-Scale Image Recognition [J]. arXiv preprint arXiv:1409.1556, 2014) achieved brilliant results in ILSVRC-2014, and deep learning began to play a major role in computer vision. In 2015, the residual network proposed by K. He et al. (He K, Zhang X, Ren S, et al. Deep Residual Learning for Image Recognition [J]. 2015: 770-778) introduced the residual structure, which provides a way to train much deeper networks; however, the real-time performance of tracking still cannot meet requirements well.
In subsequent studies, deep networks were used for end-to-end tracking. For example, H. Nam et al. (Nam H, Han B. Learning Multi-Domain Convolutional Neural Networks for Visual Tracking [C] // Computer Vision and Pattern Recognition. IEEE, 2016: 4293-4302) proposed a multi-domain neural network that takes the original image as input, directly outputs the tracking result, and uses the deep network...

Embodiment Construction

[0064] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in detail below with reference to the accompanying drawings and specific embodiments.

[0065] Figure 4 is a framework diagram of the method of the present invention; the specific steps are as follows:

[0066] (1) Training the residual deep feature network

[0067] The overall network structure is shown in Figure 1; it contains 4 convolutional layers, 2 fully connected layers, and 1 residual structure. Conv1 consists of a convolutional layer → BN (Batch Normalization) layer → pooling layer. The convolutional layer contains multiple convolution kernels that extract features from different aspects, and these features can most essentially distinguish the target; the BN layer, acting as the normalization layer of the network, normalizes the output of the convolutional layer, which speeds up network training and prevents...
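For illustration only, the following is a minimal PyTorch sketch, not the patented implementation, of a feature network with 4 convolutional layers, 2 fully connected layers, and one residual connection that fuses shallow and deep features; Conv1 follows the convolution → BN → pooling order described above. The channel counts, kernel sizes, and output dimension are assumptions, since the excerpt does not specify them.

```python
# Sketch of a residual deep feature network, assuming illustrative layer sizes.
import torch
import torch.nn as nn


class ResidualDeepFeatureNet(nn.Module):
    def __init__(self, num_outputs=2):
        super().__init__()
        # Conv1: convolution -> BN -> pooling, as in the text
        self.conv1 = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32),
            nn.MaxPool2d(2),
        )
        self.conv2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU())
        self.conv3 = nn.Sequential(nn.Conv2d(64, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU())
        self.conv4 = nn.Sequential(nn.Conv2d(64, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU())
        # 1x1 projection so the shallow feature matches the deep one for fusion
        self.project = nn.Conv2d(32, 64, kernel_size=1)
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, 128),           # fully connected layer 1
            nn.ReLU(),
            nn.Linear(128, num_outputs),  # fully connected layer 2
        )

    def forward(self, x):
        shallow = self.conv1(x)                    # shallow, high-resolution features
        deep = self.conv4(self.conv3(self.conv2(shallow)))
        fused = deep + self.project(shallow)       # residual-style shallow/deep fusion
        return self.fc(fused)


# Example: extract a feature vector for one 64x64 image patch
net = ResidualDeepFeatureNet()
features = net(torch.randn(1, 3, 64, 64))
```

The skip connection stands in for the residual structure that connects different layers, so the fusion of shallow and deep features happens inside the network rather than being designed by hand.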


Abstract

The invention discloses a residual depth feature target tracking method with drift detection. Hierarchical features are extracted with a convolutional neural network, and a residual structure added to the network connects different layers so that shallow and deep features are fused automatically, without manually designing a feature fusion scheme. The deep features distinguish the target from the background and offer better discrimination than traditional features. When predicting the target position in the current frame, a model drift detection strategy is proposed: a response strength reduction counter is updated by comparing the response strengths of adjacent frames, and the counter value determines whether model drift has occurred, in which case a corresponding model update scheme is applied as a remedy to maintain accurate tracking.
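As a rough sketch of the drift-detection bookkeeping described in this abstract, assuming Python and hypothetical threshold values (the excerpt names the response strength reduction counter but not its exact parameters):

```python
# Hypothetical drift detector: counts consecutive drops in peak response strength.
class DriftDetector:
    def __init__(self, drop_ratio=0.85, max_drops=3):
        self.prev_response = None
        self.counter = 0               # response strength reduction counter
        self.drop_ratio = drop_ratio   # assumed relative-drop threshold
        self.max_drops = max_drops     # assumed counter limit

    def update(self, response_peak):
        """Compare the current frame's peak response with the previous frame's."""
        drifted = False
        if self.prev_response is not None and response_peak < self.drop_ratio * self.prev_response:
            self.counter += 1          # response strength dropped again
        else:
            self.counter = 0           # response recovered, reset the counter
        if self.counter >= self.max_drops:
            drifted = True             # drift detected: apply the remedial model update
            self.counter = 0
        self.prev_response = response_peak
        return drifted
```

In a tracker loop, the detector would be called once per frame with the peak of the response map; when it reports drift, the remedial model update scheme mentioned above would be applied.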

Description

technical field

[0001] The invention relates to the technical fields of image processing and computer vision, and in particular to a residual depth feature target tracking method with drift detection.

Background technique

[0002] Surveillance cameras are installed on every traffic section. They observe passing vehicles at all times, confirming and tracking the identity information of offending vehicles. The target tracking technology they rely on is one of the core research topics in computer vision. It has a wide range of applications in real life: it plays an important role not only in traffic monitoring but also in smart phones, intelligent robots, autonomous driving, the military, and other fields.

[0003] Traditional target tracking algorithms cannot achieve good tracking results when faced with difficulties such as target deformation, illumination changes, and background clutter, and so cannot meet practical needs. With the vigorous development of deep learning...


Application Information

IPC(8): G06T7/246; G06K9/62
CPC: G06T7/248; G06T7/251; G06T2207/20084; G06T2207/20081; G06T2207/10016; G06T2207/30232; G06F18/253
Inventor: 胡昭华, 郑伟, 钱坤
Owner: NANJING UNIV OF INFORMATION SCI & TECH