
A Sparse-Representation Target Tracking Method Based on Multi-Feature Fusion

A multi-feature fusion and sparse representation technique, applied in the field of image processing to reduce computational complexity, improve reliability, and eliminate interference

Active Publication Date: 2021-06-11
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

Each existing target tracking algorithm addresses only one or two of the above difficulties; no algorithmic framework considers all three difficulties simultaneously.




Embodiment Construction

[0049] The present invention is now described in detail in conjunction with the accompanying drawings. The structure and content of the proposed sparse representation target tracking method based on multi-feature fusion are shown in Figure 1.

[0050] The present invention operates on a video sequence. Its processing flow is: use the first frame of the video to train the kernel weights; pass the current frame through a particle filter to obtain particle observations; sparsely represent the particle observation matrix with a model based on multi-feature sparse representation; and use the sparse representation coefficients to predict the target position in the next frame. The detailed implementation steps of the whole tracking procedure are as follows; the sparse modeling flow chart is shown in Figure 2:
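The per-frame loop described above (propagate particles, score their observations, estimate the state) can be sketched as follows. This is a minimal sketch, not the patent's method: it uses a plain Gaussian likelihood on feature distance where the patent scores particles via multi-feature sparse coefficients, and all names (`track_frame`, `motion_std`, `frame_features`) are illustrative assumptions.

```python
import numpy as np

def propagate_particles(particles, motion_std, rng):
    """Propagate particle states (e.g. x, y, scale) with a Gaussian random walk."""
    return particles + rng.normal(0.0, motion_std, size=particles.shape)

def track_frame(frame_features, particles, template, motion_std=2.0, rng=None):
    """One iteration of a particle-filter tracking loop.

    frame_features: callable mapping a particle state to an observation vector.
    template: reference observation (e.g. extracted from the first frame).
    Returns the weighted state estimate and the resampled particle set.
    """
    rng = rng or np.random.default_rng(0)
    particles = propagate_particles(particles, motion_std, rng)
    # Observation likelihood: a simple Gaussian on feature distance; the
    # patent instead weighs particles by their sparse representation quality.
    obs = np.stack([frame_features(p) for p in particles])
    dist = np.linalg.norm(obs - template, axis=1)
    s = dist.std() or 1.0
    weights = np.exp(-0.5 * (dist / s) ** 2)
    weights /= weights.sum()
    estimate = weights @ particles               # weighted mean state
    # Multinomial resampling concentrates particles on likely states.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return estimate, particles[idx]
```

In a full tracker this loop would run once per frame, with `template` refreshed from the estimated target region.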

[0051] Step 1: Train the kernel weights. The weight-training model is:

[0052]

[00...
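The weight-training equation itself is not recoverable from this extract, but the abstract states that the Fisher discrimination criterion is introduced into the multi-feature kernel-weight training model. A minimal sketch of Fisher-ratio-based channel weighting under that assumption follows; the function name and the per-channel formulation are illustrative, not taken from the patent.

```python
import numpy as np

def fisher_kernel_weights(target_feats, background_feats):
    """Score each feature channel by its Fisher discriminant ratio and
    normalize the scores into kernel-fusion weights.

    target_feats, background_feats: lists of (n_samples, dim) arrays,
    one entry per feature channel (e.g. HOG, color histogram, texture).
    """
    scores = []
    for pos, neg in zip(target_feats, background_feats):
        mu_p, mu_n = pos.mean(axis=0), neg.mean(axis=0)
        # Between-class scatter over within-class scatter, summed over dims:
        # a channel that separates target from background scores high.
        between = np.sum((mu_p - mu_n) ** 2)
        within = pos.var(axis=0).sum() + neg.var(axis=0).sum()
        scores.append(between / (within + 1e-12))
    scores = np.asarray(scores)
    return scores / scores.sum()          # weights sum to one
```

A channel whose target and background samples overlap heavily receives a small weight, which matches the abstract's claim that the criterion "accurately judges the robustness of the feature vector".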



Abstract

The invention discloses a sparse representation target tracking method based on multi-feature fusion. The method comprises three parts: training kernel weights, obtaining a sparse representation of the particle observations, and a sparse reconstruction algorithm. The invention introduces the Fisher discrimination criterion into the multi-feature kernel-weight training model, which accurately judges the robustness of each feature vector and improves the reliability of multi-feature kernel fusion; an adaptive hybrid norm selects highly correlated particle observations for multi-task sparse reconstruction, which effectively eliminates interference from uncorrelated sampled particles; and the reconstruction algorithm converges quickly and is strongly robust.
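The sparse reconstruction step the abstract refers to is not spelled out in this extract. As a hedged illustration only, a standard ISTA (iterative shrinkage-thresholding) solver for an l1-regularized reconstruction can stand in for the patent's adaptive hybrid-norm multi-task algorithm:

```python
import numpy as np

def ista_l1(D, y, lam=0.1, n_iter=200):
    """ISTA for min_x 0.5*||D x - y||^2 + lam*||x||_1.

    D: template dictionary (columns are target/trivial templates),
    y: a particle observation vector. A plain single-task l1 solver,
    standing in for the patent's adaptive hybrid-norm reconstruction.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)
        z = x - grad / L
        # Soft-thresholding is the proximal operator of the l1 norm.
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x
```

In a multi-task variant, all particle observations would be reconstructed jointly, with the mixed norm coupling their coefficient supports so that uncorrelated particles are suppressed.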

Description

technical field [0001] The invention belongs to the technical field of image processing and in particular relates to a sparse representation target tracking method based on multi-feature fusion. Background technique [0002] Image signal processing has long been an active research field. In the mobile-Internet digital age, the demand for efficient and robust image processing technology has driven the field's rapid development while posing ever greater challenges to its researchers. In recent years, video tracking has integrated computer vision, pattern recognition, artificial intelligence, and other disciplines, becoming a very active branch of vision research. Moving target tracking in video processing mainly analyzes the video sequence collected by a sensor, extracts the moving targets of interest in the scene, assigns the same label to the pixel regions corresponding to the same target, and ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/246, G06K9/62
CPC: G06T7/246, G06T2207/10016, G06F18/2136, G06F18/253
Inventors: 曹雯雯, 康彬, 陈舒康, 颜俊, 朱卫平
Owner: NANJING UNIV OF POSTS & TELECOMM