
Multi-appearance model fusion target tracking method and device based on sparse representation

An appearance-model-based target tracking technology, applied in the field of target tracking. It addresses problems such as tracking failure, achieves high update efficiency, improves adaptability, and reduces the amount of computation.

Active Publication Date: 2020-01-03
ANHUI UNIVERSITY

AI Technical Summary

Problems solved by technology

[0004] In order to solve the technical problem that existing target tracking methods are prone to tracking failure in scenes where the target resembles the background, or under occlusion, illumination change, pose change, or rotation, the present invention provides a sparse representation-based multi-appearance model fusion target tracking method and device.



Examples


Embodiment 1

[0071] Referring to Figure 1, the present invention provides a multi-appearance model fusion target tracking method based on sparse representation, which is used to track a tracking target in a tracking image. The target tracking method of this embodiment includes the following steps (1)-(5).

[0072] (1) Build a particle filter framework, and determine the state of the tracking target through multiple affine parameters to build a motion model for the state transition of the tracking target. In this embodiment, the particle filter in the particle filter framework essentially implements recursive Bayesian estimation through a non-parametric Monte Carlo simulation; that is, a set of random samples with corresponding weights is used to approximate the posterior probability density of the system state. Particle filtering generally proceeds through the following steps: ① Initial state: simulate X(t) with a large number of particles, and the particles are uniformly ...
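The predict-update-resample cycle described above can be sketched as follows. This is a minimal bootstrap particle filter over an affine state vector; the transition noise model, likelihood function, and resampling threshold are illustrative assumptions, not details taken from the patent text.

```python
import numpy as np

def particle_filter_step(particles, weights, transition_std, likelihood_fn, rng):
    """One predict-update-resample cycle over affine-parameter particles."""
    n, d = particles.shape
    # 1. Predict: propagate each particle with Gaussian transition noise.
    particles = particles + rng.normal(0.0, transition_std, size=(n, d))
    # 2. Update: reweight each particle by the observation likelihood.
    weights = weights * np.array([likelihood_fn(p) for p in particles])
    weights = weights / weights.sum()
    # 3. Resample when the effective sample size degenerates.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

rng = np.random.default_rng(0)
# 600 particles over a 6-parameter affine state, matching Embodiment 2.
particles = rng.normal(0.0, 1.0, size=(600, 6))
weights = np.full(600, 1.0 / 600)
# Toy likelihood for the sketch: prefer states near the origin.
particles, weights = particle_filter_step(
    particles, weights, transition_std=0.1,
    likelihood_fn=lambda p: np.exp(-np.sum(p ** 2)), rng=rng)
```

The weighted sample set `(particles, weights)` is the Monte Carlo approximation of the posterior state density that the paragraph above refers to.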

Embodiment 2

[0122] This embodiment provides a multi-appearance model fusion target tracking method based on sparse representation. The method adds simulation experiments on the basis of Embodiment 1 (in other embodiments, the simulation experiments may be omitted, or other experimental schemes may be used to determine the relevant parameters and the target tracking performance).

[0123] In this embodiment, simulation experiments on the tracking method are carried out on an Intel Core 3.2 GHz machine with 4 GB of memory on the MATLAB 2010b platform. The tracking method runs under the particle filter framework with 600 particles; the size of each target image block is 32×32 pixels, and the local image blocks extracted from the target area are 16×16 pixels and 8×8 pixels respectively. η1 is the weight coefficient of the 16×16 local blocks, and η2 is the weight coefficient of the 8×8 local blocks. We select 10 representative videos for experim...

Embodiment 3

[0150] This embodiment provides a multi-appearance model fusion target tracking device based on sparse representation, which applies the sparse representation-based multi-appearance model fusion target tracking method of Embodiment 1 or Embodiment 2. The target tracking device of this embodiment includes a particle filter framework building module, a global sparse appearance model building module, a local sparse appearance model building module, a fusion module, and a template updating module.

[0151] The particle filter framework building module is used to build a particle filter framework and determine the state of the tracking target through multiple affine parameters, so as to build a motion model for the state transition of the tracking target.

[0152] The global sparse appearance model building module is used in the particle filter framework to first determine the target template set and candidate target set of the tracking image, and then use the target template set and th...
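The abstract states that the target template set and a trivial template are subjected to linear programming to generate a sparse coefficient solution. A hedged sketch of one common way to set up such a program, in the style of l1-tracker formulations, is shown below: each vectorized candidate `y` is coded over the target templates `T` plus positive and negative identity (trivial) templates, and the nonnegative l1 problem is solved as a linear program. The exact formulation used by the patent is not shown in this excerpt, so this setup is an assumption.

```python
import numpy as np
from scipy.optimize import linprog

def sparse_code(y, T):
    """Solve min sum(c)  s.t.  [T, I, -I] c = y,  c >= 0  (an l1 program)."""
    d = y.size
    # Dictionary: target templates plus positive/negative trivial templates.
    A = np.hstack([T, np.eye(d), -np.eye(d)])
    cost = np.ones(A.shape[1])  # l1 objective over nonnegative coefficients
    res = linprog(cost, A_eq=A, b_eq=y, bounds=(0, None), method="highs")
    coef = res.x
    k = T.shape[1]
    a = coef[:k]                       # target-template coefficients
    e = coef[k:k + d] - coef[k + d:]   # trivial-template (error) part
    return a, e

rng = np.random.default_rng(1)
T = rng.random((16, 5))                 # 5 vectorized target templates
y = T @ np.array([0.7, 0.3, 0, 0, 0])   # candidate built from two templates
a, e = sparse_code(y, T)
recon_err = np.linalg.norm(y - T @ a)   # reconstruction error for scoring
```

Candidates that are well explained by the target templates get a small reconstruction error and a sparse trivial-template part, which is what makes this representation useful for occlusion handling.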



Abstract

The invention discloses a multi-appearance model fusion target tracking method and device based on sparse representation. The method comprises the following steps: building a particle filtering framework; building a global sparse appearance model, in which a target template set and a candidate target set of the tracking image are first determined, and the target template set and a trivial template are then subjected to linear programming to generate a sparse coefficient solution; building a local sparse appearance model; fusing the global and local sparse appearance models, in which two similarity elements are first calculated, their weighted sum is taken as the fusion similarity, a reconstruction error is calculated, and the weighted sum of the fusion similarity and the reconstruction error is taken as a discrimination function; and assigning each target template in the target template set a weight proportional to its importance, and updating the target templates of the set. The method reduces time complexity, remains highly adaptive when the appearance of the target changes greatly or the occluded area is large, tracks the target accurately and robustly, handles occlusion, and updates the appearance model efficiently.
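The fusion and scoring step summarized above can be sketched as follows. The weight values, the similarity definitions, and the sign convention (the reconstruction error is assumed to enter with a negative weight, so a large error lowers a candidate's score) are placeholders, not values given in this excerpt.

```python
import numpy as np

def discriminant(sim_global, sim_local, recon_error,
                 eta1=0.5, eta2=0.5, lam=0.3):
    """Score one candidate: fused similarity combined with an error term."""
    fused = eta1 * sim_global + eta2 * sim_local  # fusion similarity
    # Assumption: the reconstruction error is penalized, so that candidates
    # poorly explained by the templates score lower.
    return fused - lam * recon_error

# Score two hypothetical candidates and pick the better one.
sim_g = np.array([0.9, 0.4])   # global-model similarities
sim_l = np.array([0.8, 0.5])   # local-model similarities
err = np.array([0.1, 0.6])     # reconstruction errors
scores = discriminant(sim_g, sim_l, err)
best = int(np.argmax(scores))
```

The candidate with the highest discriminant value is taken as the tracking result for the current frame.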

Description

technical field [0001] The present invention relates to a target tracking method in the technical field of target tracking, in particular to a sparse representation-based multi-appearance model fusion target tracking method, and also to a sparse representation-based multi-appearance model fusion target tracking device applying the tracking method. Background technique [0002] Object tracking aims to estimate the state of moving objects in video sequences, and computer-vision-based object tracking is widely used in security monitoring, unmanned vehicle navigation, human-computer interaction, behavior detection, and so on. Due to the variability of targets and the complexity of scenes, designing a target tracking method that copes with complex and dynamic environments caused by factors such as partial occlusion, illumination changes, deformation, size changes, camera movement, background interference, and angle changes remains challenging. [0...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/20
CPC: G06T7/20; G06T2207/10016
Inventors: 汪芳, 周健
Owner: ANHUI UNIVERSITY