
Target tracking method based on deep convolution feature adaptive integration

A deep convolution and target tracking technology, applied in the fields of image processing and computer vision, which solves the problems of inaccurate target positions, unstable tracking, and a tracker's inability to make full use of the target features, achieving the effect of enhanced accuracy and reliability.

Active Publication Date: 2020-12-08
XIDIAN UNIV


Problems solved by technology

[0005] The purpose of the present invention is to address the shortcomings of the above-mentioned prior art by proposing a target tracking method based on adaptive integration of deep convolution features. The method solves the problem that, when similar distractors appear around the target, the tracker cannot make full use of the information contained in the target features of different channels, so that tracking is not stable enough and the obtained target position is not accurate enough.



Examples


Embodiment Construction

[0034] The embodiments and effects of the present invention are described further below with reference to the accompanying drawings.

[0035] Referring to Figure 1, the steps for carrying out the invention are described further.

[0036] Step 1: extract deep convolution features.

[0037] Select a not-yet-processed image from the video image sequence containing the target to be tracked as the current frame.

[0038] Input all pixels contained within the target region of the current frame into the convolutional neural network VGG-19, and concatenate the channels of the features output by the network's 10th, 28th, and 37th layers to form the multi-channel deep convolution feature of the target region.
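As a concrete illustration of this step, the sketch below extracts feature maps from three layers of a pretrained VGG-19 and concatenates them along the channel axis. It is a minimal sketch, not the patent's implementation: the torchvision layer indices (10, 28, 36 in 0-based numbering, approximating the 10th, 28th, and 37th layers named above), the bilinear resizing to a common spatial size, and the handling of input normalization are all assumptions.

```python
# Minimal sketch: multi-channel deep feature extraction with VGG-19.
# Assumption: torchvision's 0-based layer indices approximate the
# patent's 10th/28th/37th layers; ImageNet normalization of the input
# patch is assumed to have been done by the caller.
import torch
import torch.nn.functional as F
from torchvision import models

vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1).features.eval()

def extract_features(patch, layer_ids=(10, 28, 36)):
    """patch: (1, 3, H, W) tensor of target-region pixels.
    Returns a (1, C_total, h, w) multi-channel deep convolution feature."""
    feats, x = [], patch
    with torch.no_grad():
        for i, layer in enumerate(vgg):
            x = layer(x)
            if i in layer_ids:
                feats.append(x)
    # Resize deeper (smaller) maps to the shallowest map's spatial size,
    # then stitch all channels together into one feature tensor.
    h, w = feats[0].shape[-2:]
    feats = [F.interpolate(f, size=(h, w), mode="bilinear", align_corners=False)
             for f in feats]
    return torch.cat(feats, dim=1)
```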

[0039] Step 2: calculate the kernel correlation filter.

[0040] First, calculate the kernel correlation filter of the current frame at the current iteration according to the following formula:

[0041]

[0042] where α_j denotes the kernel correlation filter of the current frame at iteration j...
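The formula itself is not reproduced in the text above. For orientation only, the sketch below shows the standard kernelized correlation filter solution of Henriques et al. (KCF), which trackers in this family compute: a ridge regression solved in the Fourier domain, alpha_f = F(y) / (F(k_xx) + lambda). The patent's per-channel, per-iteration variant may differ, so treat this as a hedged reference, not the invention's formula.

```python
# Hedged reference: standard KCF filter training (Henriques et al.),
# alpha_f = F(y) / (F(k_xx) + lambda). The patent's exact iteration
# formula is not shown in the text above and may differ from this.
import numpy as np

def gaussian_kernel_correlation(x, z, sigma=0.5):
    """Gaussian kernel correlation of two (h, w, c) feature maps via FFT."""
    xf = np.fft.fft2(x, axes=(0, 1))
    zf = np.fft.fft2(z, axes=(0, 1))
    cross = np.fft.ifft2(np.conj(xf) * zf, axes=(0, 1)).real.sum(axis=2)
    d2 = (x ** 2).sum() + (z ** 2).sum() - 2.0 * cross
    return np.exp(-np.maximum(d2, 0.0) / (sigma ** 2 * x.size))

def train_filter(x, y, lam=1e-4):
    """x: (h, w, c) target features; y: (h, w) desired Gaussian response.
    Returns the correlation filter in the Fourier domain."""
    k_xx = gaussian_kernel_correlation(x, x)
    return np.fft.fft2(y) / (np.fft.fft2(k_xx) + lam)

def detect(alpha_f, x_model, z):
    """Correlate stored model features with candidate-region features z
    and return the peak of the response map as the new target position."""
    k_xz = gaussian_kernel_correlation(x_model, z)
    response = np.fft.ifft2(np.fft.fft2(k_xz) * alpha_f).real
    return np.unravel_index(np.argmax(response), response.shape)
```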



Abstract

The invention discloses a target tracking method based on adaptive integration of deep convolution features. The method comprises the steps of: extracting deep convolution features; calculating a kernel correlation filter; updating the integration vector of the current frame with an integration-vector update formula; predicting the target position in the current frame image with an adaptive integration formula; updating the deep convolution features of the current frame with a deep-convolution-feature update formula; and, when iteration over the video image sequence containing the target to be tracked ends, taking the target center position of the current frame as the center position of the target to be tracked. By integrating the features, the method overcomes the defect that prior-art trackers cannot fully utilize the information contained in the target features of different channels, so that the position of the target to be tracked is acquired more accurately during tracking and the accuracy and reliability of target tracking are enhanced.
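To make the integration step concrete, here is a minimal sketch of one plausible reading of the abstract: each feature channel yields its own correlation response, an integration vector weights and fuses those responses to predict the position, and both the weights and the feature model are updated online. The exponential-moving-average update rules and the learning rates eta and eta_f below are illustrative assumptions, not the patent's formulas.

```python
# Illustrative sketch of adaptive integration of per-channel responses.
# The specific update rules below are assumptions for illustration only.
import numpy as np

def predict_position(responses, weights):
    """responses: (C, h, w) per-channel correlation responses;
    weights: (C,) integration vector. Returns the fused peak location."""
    fused = np.tensordot(weights, responses, axes=1)  # weighted channel sum
    return np.unravel_index(np.argmax(fused), fused.shape)

def update_integration_vector(weights, responses, eta=0.02):
    """Shift weight toward channels whose responses peak most strongly."""
    quality = responses.max(axis=(1, 2))
    quality = quality / (quality.sum() + 1e-12)
    weights = (1.0 - eta) * weights + eta * quality
    return weights / weights.sum()

def update_feature_model(x_model, x_new, eta_f=0.01):
    """Common linear-interpolation model update (an assumption here)."""
    return (1.0 - eta_f) * x_model + eta_f * x_new
```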

Description

Technical field

[0001] The present invention belongs to the technical field of image processing, and further relates to a target tracking method based on adaptive integration of deep convolution features within the technical field of computer vision image processing. The present invention adopts a kernel correlation filter method based on adaptive integration of deep convolution features, and can be used to track moving objects in video surveillance, health care, intelligent transportation, robot navigation, human-computer interaction, and virtual reality.

Background technique

[0002] The main task of target tracking is to estimate the trajectory of an object in a video, i.e., to detect the moving target to be tracked in a sequence of video images and determine the position of the moving object in each frame. One of the most popular target tracking paradigms is tracking by detection, in which tracking is usually learned as a binary classifier that separates the target from the background based on ...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/246G06T7/262G06T7/73G06T5/00G06N3/04G06N3/08
CPCG06T7/246G06T7/262G06T7/73G06N3/08G06T2207/10016G06T2207/20056G06N3/045G06T5/70
Inventor 田小林张艺帆李娇娇高文星王露杨坤焦李成
Owner XIDIAN UNIV