Discriminative online target tracking method based on videos in dictionary learning

A target tracking and dictionary learning technology, applied in the field of image processing, that addresses problems such as the poor performance of traditional tracking methods

Inactive Publication Date: 2014-08-13
SHANGHAI JIAO TONG UNIV

Problems solved by technology

[0009] Addressing the above deficiencies of the prior art, the present invention proposes a dictionary-learning-based discriminative online target tracking method for video. It overcomes the poor performance of traditional tracking methods against complex backgrounds and achieves robust target tracking in complex scenes (severe motion, illumination changes, noise, partial occlusion, pose changes, etc.).


Examples

Embodiment 1

[0041] Step 1: Construct the template of the target through manual calibration or from existing tracking results, and at the same time obtain the state of the target to be tracked;
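As an illustration of this step only (not code from the patent), the sketch below crops a manually calibrated bounding box from a grayscale frame and turns it into the mean-subtracted, normalized d×d template described in Step 2. The helper name make_template, the box layout, the default d = 32, and the cv2 dependency are all assumptions.

```python
# Hypothetical sketch of Step 1: build the initial target template from a
# manually calibrated bounding box. Not taken from the patent.
import numpy as np
import cv2  # assumed dependency, used only for resizing

def make_template(frame_gray: np.ndarray, box: tuple, d: int = 32) -> np.ndarray:
    """Crop the calibrated target box, resize to d x d, mean-subtract, normalize."""
    x, y, w, h = box                              # assumed (x, y, width, height)
    patch = frame_gray[y:y + h, x:x + w].astype(np.float32)
    patch = cv2.resize(patch, (d, d))             # canonical d x d template
    patch -= patch.mean()                         # zero mean ("averaged")
    norm = np.linalg.norm(patch)
    return patch / norm if norm > 0 else patch    # unit L2 norm ("normalized")
```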

[0042] Step 2: Construct the target initial state matrix. Specifically, let the states of the target to be tracked from the initial time t = t_0 to t = t_{M-1} be known, forming M templates of size d×d, all of which have been mean-subtracted and normalized. At this time, the state of the target to be tracked at time t_i is the vectorized template t_i ∈ R^(d²); that is, the target initial state matrix is T_0 = [t_0, t_1, …, t_{M-1}] ∈ R^(d²×M), where t_0 is the initial state.
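Under the same assumptions as the sketch above, constructing T_0 then amounts to vectorizing the M templates and stacking them as columns; this is a minimal illustration, not the patent's implementation.

```python
# Minimal sketch of Step 2: stack M vectorized d x d templates into the
# d^2 x M initial state matrix T0. Illustrative only.
import numpy as np

def initial_state_matrix(templates: list) -> np.ndarray:
    """Column-stack vectorized templates t_0 ... t_{M-1} into T0 (d^2 x M)."""
    # e.g. for M = 5 templates of size 32 x 32, T0 has shape (1024, 5)
    return np.column_stack([t.reshape(-1) for t in templates])
```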

[0043] Step 3: Motion modeling: following the basic principle of Affine Warping, candidate samples (particles) are formed by sampling the affine parameters from a normal distribution with zero mean and predefined variance;

[0044] If the parameters of the target state at adjacent moments follow a Gaussian distribution, they satisfy p(s_t | s_{t-1}) = N(s_t; s_{t-1}, Ψ_0); sampling is then performed according to the preset covariance Ψ_0 ...
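The sketch below illustrates this sampling step. The six-parameter affine state layout and the diagonal variances placed in Ψ_0 are illustrative assumptions; the patent excerpt does not give the preset values.

```python
# Hedged sketch of Step 3: draw candidate samples (particles) around the
# previous state under p(s_t | s_{t-1}) = N(s_t; s_{t-1}, Psi_0).
import numpy as np

rng = np.random.default_rng(0)

# Assumed affine state layout: [x, y, scale, rotation, aspect ratio, skew];
# the diagonal variances below are placeholders, not the patent's presets.
PSI_0 = np.diag([4.0, 4.0, 0.01, 0.005, 0.005, 0.001])

def sample_particles(s_prev: np.ndarray, n_particles: int = 600) -> np.ndarray:
    """Sample n_particles candidate affine states around s_{t-1}."""
    noise = rng.multivariate_normal(np.zeros(6), PSI_0, size=n_particles)
    return s_prev[None, :] + noise   # shape (n_particles, 6)
```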

Abstract

The invention belongs to the field of image processing technologies and relates to a discriminative online target tracking method for video based on dictionary learning. The method includes: a first step of constructing a template of the target through manual calibration or from an existing tracking result, and obtaining the state of the target to be tracked at the same time; a second step of constructing an initial state matrix of the target, conducting motion modeling, and sampling to form candidate samples; a third step of partitioning the candidate samples and the template, sparse-coding all partitions, and obtaining the representation coefficients of the corresponding partitions; and a fourth step of feeding the representation coefficients as features into the classifier of the corresponding partition to obtain the decision confidence of each candidate sample in that partition, then randomly selecting the confidences of K partitions and summing them, traversing all possibilities and, in each, selecting the candidate sample with the maximum summed value; the candidate sample selected most frequently over all the possibilities is taken as the target tracking result of the current frame. Through the method, the accuracy and robustness of target tracking under a complicated background and changing target appearance can be improved.
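To make the fourth step's selection rule concrete, here is a hedged sketch of the K-partition confidence voting, assuming a confidence matrix conf[i, j] (candidate i, partition j) has already been produced by the per-partition classifiers from the sparse-coding coefficients; conf, k, and vote_target are illustrative names, not from the patent. Traversing all K-of-N partition subsets, summing each subset's confidences, and counting each subset's winning candidate yields the candidate selected most frequently over all possibilities.

```python
# Illustrative sketch of the partition-voting decision in step four.
from itertools import combinations
from collections import Counter
import numpy as np

def vote_target(conf: np.ndarray, k: int) -> int:
    """Return the candidate index selected most often across K-partition subsets."""
    n_candidates, n_partitions = conf.shape
    votes = Counter()
    for subset in combinations(range(n_partitions), k):
        scores = conf[:, list(subset)].sum(axis=1)   # sum the K confidences
        votes[int(np.argmax(scores))] += 1           # subset's winning candidate
    return votes.most_common(1)[0][0]
```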

Description

technical field

[0001] The invention relates to a method in the technical field of image processing, in particular to a dictionary-learning-based method for discriminative online target tracking in video.

Background technique

[0002] In recent years, video surveillance technology has been widely used. Target tracking integrates knowledge and technologies from many related fields, such as computer image processing, pattern recognition, artificial intelligence, and automatic control, and provides an important data basis for higher-level analysis and understanding of video content. Stable and accurate tracking of targets in complex environments is an urgent problem in current tracking research and applications, and traditional tracking methods often perform poorly in such settings. In recent years, a class of methods that adaptively update the relevant parameters of the tracker as the target and the environment change, called online target tracking, has received extensive attention. This ...

Application Information

IPC(8): G06T7/20
Inventors: 郑世宝 (Zheng Shibao), 薛明 (Xue Ming), 李宏波 (Li Hongbo), 丁正彦 (Ding Zhengyan), 朱文婕 (Zhu Wenjie), 陈宇航 (Chen Yuhang)
Owner SHANGHAI JIAO TONG UNIV