
A sparse representation-based multi-appearance model fusion target tracking method and device

A target tracking technology based on appearance models, applied in the field of target tracking. It addresses the problem of tracking failure and achieves high update efficiency, a small amount of computation, and high adaptability.

Active Publication Date: 2022-02-01
ANHUI UNIVERSITY


Problems solved by technology

[0004] To solve the technical problem that existing target tracking methods are prone to tracking failure in scenes where the target is similar to the background, or under occlusion, illumination change, pose change, or rotation, the present invention provides a sparse representation-based multi-appearance model fusion target tracking method and device.



Examples


Embodiment 1

[0071] Referring to figure 1, the present invention provides a sparse representation-based multi-appearance model fusion target tracking method, which is used to track a tracking target in a tracking image. The target tracking method of this embodiment includes the following steps (1)-(5).

[0072] (1) Build a particle filter framework, and determine the state of the tracking target through multiple affine parameters to build a motion model for the state transition of the tracking target. In this embodiment, the particle filter in the particle filter framework essentially implements recursive Bayesian estimation through non-parametric Monte Carlo simulation; that is, a set of random samples with corresponding weights is used to approximate the posterior probability density of the system state. Particle filtering generally proceeds through the following steps: ① Initial state: simulate X(t) with a large number of particles, with the particles uniformly ...
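The propagate-weight-resample cycle described above can be sketched as follows. This is a generic sequential Monte Carlo step, not the patent's exact implementation; the Gaussian random-walk motion model on the affine parameters and the resampling threshold are common choices assumed here for illustration.

```python
import numpy as np

def particle_filter_step(particles, weights, transition_std, likelihood_fn, rng):
    """One recursive Bayesian update: propagate, re-weight, resample.

    particles: (N, d) array of affine state samples
    weights:   (N,) normalized importance weights
    """
    n = len(particles)
    # 1. Propagate each particle through the motion model (here a Gaussian
    #    random walk on the affine parameters, an illustrative assumption).
    particles = particles + rng.normal(0.0, transition_std, particles.shape)
    # 2. Re-weight each particle by the observation likelihood of its state,
    #    then renormalize so the weights approximate the posterior density.
    weights = weights * np.array([likelihood_fn(p) for p in particles])
    weights = weights / weights.sum()
    # 3. Resample when the effective sample size collapses (N/2 threshold
    #    is a conventional choice, not taken from the patent).
    ess = 1.0 / np.sum(weights ** 2)
    if ess < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights
```

The tracked state estimate at each frame is then the weighted mean (or the highest-weight particle) of the sample set.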

Embodiment 2

[0122] This embodiment provides a sparse representation-based multi-appearance model fusion target tracking method. The method carries out simulation experiments on the basis of Embodiment 1 (in other embodiments, the simulation experiments may be omitted, or other experimental schemes may be used to determine the relevant parameters and target tracking performance).

[0123] In this embodiment, simulation experiments on the tracking method are carried out on an Intel Core 3.2 GHz machine with 4 GB of memory, on the MATLAB 2010b platform. The tracking method runs under the particle filter framework with 600 particles; each target image block is 32×32 pixels, and the local image blocks extracted from the target area are 16×16 pixels and 8×8 pixels respectively. η1 is the weight coefficient corresponding to the 16×16 local blocks, and η2 is the weight coefficient corresponding to the 8×8 local blocks. We select 10 representative videos for experim...
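The two-scale local block extraction described above (16×16 and 8×8 patches from a 32×32 target region) can be sketched as below. The non-overlapping stride is an assumption for illustration; the patent does not state the sampling step in the excerpt available here.

```python
import numpy as np

def extract_local_blocks(target, block, step):
    """Slide a block x block window over the target region with the given
    step, returning each local patch flattened as one column of the output."""
    h, w = target.shape
    cols = []
    for y in range(0, h - block + 1, step):
        for x in range(0, w - block + 1, step):
            cols.append(target[y:y + block, x:x + block].ravel())
    return np.stack(cols, axis=1)

# A 32x32 target region yields 4 non-overlapping 16x16 blocks and
# 16 non-overlapping 8x8 blocks, each flattened into a column vector.
target = np.zeros((32, 32))
blocks_16 = extract_local_blocks(target, 16, 16)  # shape (256, 4)
blocks_8 = extract_local_blocks(target, 8, 8)     # shape (64, 16)
```

Each column can then be sparsely coded against a local dictionary, with the two scales weighted by η1 and η2 respectively.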

Embodiment 3

[0150] This embodiment provides a sparse representation-based multi-appearance model fusion target tracking device, which applies the sparse representation-based multi-appearance model fusion target tracking method of Embodiment 1 or Embodiment 2. The target tracking device of this embodiment includes a particle filter framework building module, a global sparse appearance model building module, a local sparse appearance model building module, a fusion module, and a template updating module.

[0151] The particle filter framework building module is used to build a particle filter framework and to determine the state of the tracking target through multiple affine parameters, so as to build a motion model for the state transition of the tracking target.

[0152] The global sparse appearance model building module is used, within the particle filter framework, to first determine the target template set and candidate target set of the tracking image, and then use the target template set and th...
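The abstract states that the sparse coefficients are obtained through linear programming. A standard way to do this is basis pursuit, minimizing the ℓ1 norm of the coefficients subject to exact reconstruction from the template dictionary; the sketch below uses the classic LP reformulation (split c = u − v with u, v ≥ 0). This is a generic formulation assumed for illustration, not the patent's exact objective, which may also include trivial (occlusion) templates.

```python
import numpy as np
from scipy.optimize import linprog

def sparse_code_lp(D, y):
    """Basis pursuit: minimize ||c||_1 subject to D c = y, solved as a
    linear program by splitting c = u - v with u, v >= 0."""
    m, n = D.shape
    obj = np.ones(2 * n)            # sum(u) + sum(v) equals ||c||_1
    A_eq = np.hstack([D, -D])       # D u - D v = y
    res = linprog(obj, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
    uv = res.x
    return uv[:n] - uv[n:]          # recover the signed coefficient vector
```

Each candidate target is coded against the template dictionary this way, and its reconstruction error under the resulting coefficients feeds the discriminant function described in the abstract.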



Abstract

The invention discloses a sparse representation-based multi-appearance model fusion target tracking method and a device thereof. The method comprises: building a particle filter framework; determining a target template set and a candidate target set of a tracking image, and solving for sparse coefficients through linear programming to build a global sparse appearance model; building a local sparse appearance model; and fusing the global sparse appearance model with the local sparse appearance model, by first calculating the two similarity terms, then taking their weighted sum as the fusion similarity, and using the weighted sum of the fusion similarity and the reconstruction error as a discriminant function. Each target template in the target template set is given a weight proportional to its importance, and the target template set is updated accordingly. The invention reduces time complexity, has high adaptability when the appearance of the target changes greatly or the occlusion area is large, tracks the target accurately and robustly, has occlusion handling capability, and updates the appearance model with high efficiency.
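The fusion step in the abstract, a weighted sum of the two model similarities combined with the reconstruction error into a discriminant score, can be sketched as follows. The weights alpha and beta, and the sign convention on the error term, are illustrative assumptions; the patent excerpt does not give their values.

```python
import numpy as np

def discriminant(sim_global, sim_local, recon_error, alpha=0.5, beta=1.0):
    """Fuse the global and local appearance-model similarities by a weighted
    sum, then combine the fused similarity with the reconstruction error into
    a single discriminant score. alpha and beta are illustrative weights."""
    fused = alpha * sim_global + (1 - alpha) * sim_local
    return fused - beta * recon_error  # higher score -> better candidate

# The candidate with the highest discriminant score is taken as the target.
scores = [discriminant(0.9, 0.8, 0.1), discriminant(0.4, 0.5, 0.3)]
best = int(np.argmax(scores))
```

In the particle filter framework, this score would serve as the observation likelihood assigned to each candidate state.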

Description

Technical field

[0001] The present invention relates to a target tracking method in the technical field of target tracking, in particular to a sparse representation-based multi-appearance model fusion target tracking method, and also relates to a sparse representation-based multi-appearance model fusion target tracking device applying the tracking method.

Background technique

[0002] Target tracking aims to estimate the state of a moving target in a video sequence. Computer vision-based target tracking is widely used in security monitoring, unmanned vehicle navigation, human-computer interaction, behavior detection, and so on. Due to the variability of the target and the complexity of the scene, designing a target tracking method that copes with complex and dynamic environments, with partial occlusion, illumination changes, deformation, size changes, camera movement, background interference, and angle changes, remains challenging. [0...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/20
CPC: G06T7/20; G06T2207/10016
Inventor: 汪芳, 周健
Owner: ANHUI UNIVERSITY