
Adaptive simultaneous sparse representation-based robust target tracking method

A sparse representation and target tracking technology, applied in image data processing, instruments, character and pattern recognition, etc. It addresses the problems of high feature-extraction requirements, weakened classifier performance, and inaccurate search, and achieves the effects of reducing redundant template vectors, reducing the influence of noise, and reducing dimensionality.

Active Publication Date: 2018-03-27
UNIV OF ELECTRONIC SCI & TECH OF CHINA
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Although discriminant methods and generative methods both achieve reliable tracking to a certain extent, each has its own shortcomings. First, discriminant methods place high demands on feature extraction and are therefore sensitive to noise during actual tracking, so tracking of noisy targets may fail; generative methods, in turn, cannot accurately locate regions similar to the target in a cluttered background and are thus prone to tracking failure. Second, discriminant methods require a sufficient set of training samples: good samples improve the performance of the classifier, while bad samples weaken it, so introducing bad samples into the classifier degrades the tracking result; generative methods are more sensitive to the template, and once an occluded target is mistakenly introduced into the template, tracking may fail. Therefore, neither method is sufficiently robust for tracking in real scenes.



Embodiment Construction

[0049] In the following, the robust target tracking method based on adaptive simultaneous sparse representation of the present invention will be further explained in conjunction with specific embodiments.

[0050] The robust target tracking method based on adaptive simultaneous sparse representation includes the following steps:

[0051] S1, according to the magnitude of the Laplace noise energy, adaptively establish a simultaneous sparse tracking model

[0052] Compare the Laplace noise energy ||S||₂ with the given noise energy threshold τ, and adaptively establish the simultaneous sparse tracking model according to the comparison result:

[0053] When ||S||₂ ≤ τ, the simultaneous sparse tracking model is:

[0054]

[0055] When ||S||₂ > τ, the simultaneous sparse tracking model is:

[0056]
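The model formulas themselves are not reproduced in this excerpt. Purely as a hedged sketch of what a simultaneous sparse tracking objective of this kind typically looks like, with D as defined in the next paragraph, Y an assumed matrix of candidate observations, C the joint coefficient matrix, S the Laplace noise term, and λ, γ assumed regularisation weights, the two cases could take the form:

```latex
% Assumed general forms only; not necessarily the patent's exact objectives.
% Case 1: \|S\|_2 \le \tau (no explicit noise term)
\min_{C}\; \tfrac{1}{2}\,\lVert Y - DC \rVert_F^2 \;+\; \lambda\,\lVert C \rVert_{2,1}

% Case 2: \|S\|_2 > \tau (explicit sparse Laplace noise term S)
\min_{C,\,S}\; \tfrac{1}{2}\,\lVert Y - DC - S \rVert_F^2 \;+\; \lambda\,\lVert C \rVert_{2,1} \;+\; \gamma\,\lVert S \rVert_1
```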

[0057] Here, D = [T, I] is defined as the tracking template set, where I denotes the trivial templates and T is the image collection of the given target templates:

[0058] T = [T₁, T₂, ..., Tₙ...
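As a concrete illustration of step S1 and the solve it calls for, the following Python/NumPy sketch estimates the noise energy, compares it with τ, and solves the corresponding ℓ₂,₁-regularised problem by proximal gradient descent. It is only a minimal sketch under the assumed objectives above; the parameter names (lam, gamma, tau, n_iters), the least-squares noise estimate, and the solver itself are illustrative assumptions, not the patent's prescribed algorithm.

```python
import numpy as np

def prox_l21(C, t):
    """Row-wise group soft-thresholding: the proximal operator of t * ||C||_{2,1}."""
    norms = np.linalg.norm(C, axis=1, keepdims=True)
    return np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0) * C

def solve_simultaneous_sparse(Y, T, lam=0.01, gamma=0.01, tau=0.1, n_iters=200):
    """Y: d x k candidate patches, T: d x n target templates.
    Builds the dictionary D = [T, I] and returns the coefficient matrix C
    (and the explicit noise matrix S when the high-noise model is selected)."""
    d, k = Y.shape
    D = np.hstack([T, np.eye(d)])                 # tracking template set D = [T, I]
    # Rough estimate of the noise energy: residual of a least-squares fit on T.
    C_ls, *_ = np.linalg.lstsq(T, Y, rcond=None)
    noise_energy = np.linalg.norm(Y - T @ C_ls)
    step = 1.0 / (np.linalg.norm(D, 2) ** 2)      # 1 / Lipschitz constant of the data term

    C = np.zeros((D.shape[1], k))
    if noise_energy <= tau:
        # Low-noise model: min_C 0.5*||Y - D C||_F^2 + lam*||C||_{2,1}
        for _ in range(n_iters):
            C = prox_l21(C - step * (D.T @ (D @ C - Y)), step * lam)
        return C, None
    # High-noise model with an explicit sparse noise term S:
    # min_{C,S} 0.5*||Y - D C - S||_F^2 + lam*||C||_{2,1} + gamma*||S||_1
    S = np.zeros_like(Y)
    for _ in range(n_iters):
        C = prox_l21(C - step * (D.T @ (D @ C + S - Y)), step * lam)
        R = Y - D @ C
        S = np.sign(R) * np.maximum(np.abs(R) - gamma, 0.0)   # elementwise soft-threshold
    return C, S
```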



Abstract

The invention discloses an adaptive simultaneous sparse representation-based robust target tracking method. The method comprises the following steps: S1, adaptively building a simultaneous sparse tracking model according to the magnitude of the Laplacian noise energy; S2, solving the built tracking model; and S3, updating the template. The tracking method provided by the invention offers good tracking and recognition performance and relatively strong resistance to interference, can realize relatively accurate real-time target tracking, and is relatively stable.
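To show how the three steps in the abstract fit together, here is a hedged skeleton of the frame-by-frame loop, reusing the solve_simultaneous_sparse() sketch given above. The candidate sampler is passed in as a callable, and the template-update rule for S3 shown here is an assumption for illustration, not the patent's exact update scheme.

```python
import numpy as np

def track(frames, T, sample_candidates, tau=0.1, update_thresh=0.8):
    """frames: iterable of frames; T: d x n matrix of vectorised target templates;
    sample_candidates(frame) -> d x k matrix of vectorised candidate patches."""
    d, n = T.shape
    results = []
    for frame in frames:
        Y = sample_candidates(frame)                      # candidate patches for this frame
        C, _ = solve_simultaneous_sparse(Y, T, tau=tau)   # S1 + S2: build and solve the model
        # Pick the candidate best reconstructed by the target templates alone.
        E = Y - T @ C[:n, :]
        best = int(np.argmin(np.linalg.norm(E, axis=0)))
        patch = Y[:, best]
        results.append(best)
        # S3 (assumed rule): if the winning patch is poorly explained by the current
        # templates, replace the template carrying the least coefficient energy.
        sim = 1.0 - np.linalg.norm(E[:, best]) / (np.linalg.norm(patch) + 1e-12)
        if sim < update_thresh:
            weakest = int(np.argmin(np.linalg.norm(C[:n, :], axis=1)))
            T[:, weakest] = patch / (np.linalg.norm(patch) + 1e-12)
    return results
```

In practice the returned candidate indices would be mapped back to bounding boxes; that mapping depends on how sample_candidates draws patches (e.g. by a particle filter), which is outside this sketch.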

Description

Technical field

[0001] The invention belongs to the technical field of computer image processing and relates to a target tracking method, and more specifically to a robust target tracking method based on adaptive simultaneous sparse representation.

Background technique

[0002] Target tracking occupies an important position in the field of computer vision. With the availability of high-performance computers and cameras and the growing need for automatic video analysis, interest in target tracking has increased. The main tasks of target tracking include detecting moving targets of interest, tracking them continuously from frame to frame, and analyzing the behavior of the tracked targets. Current applications of target tracking include motion recognition, video retrieval, human-computer interaction, traffic monitoring, car navigation, etc.

[0003] Although many tracking algorithms have been proposed, target tracking technology still faces many challenges. In the actual tracking process, ...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62, G06T7/246
CPC: G06T7/251, G06V20/42, G06F18/23213
Inventors: 樊庆宇, 李厚彪, 羊恺, 王梦云, 陈鑫, 李滚
Owner: UNIV OF ELECTRONIC SCI & TECH OF CHINA