
Multi-scene target tracking method based on adaptive depth feature filter

A deep-feature target tracking technology in the field of computer vision, addressing the problems that existing trackers cannot integrate depth features well, use fixed depth-feature filter weights, and cannot adapt to a variety of complex scenes.

Active Publication Date: 2019-08-16
NANJING UNIV

AI Technical Summary

Problems solved by technology

[0004] The problems to be solved by the present invention are: the boundary effect in existing video target tracking technology prevents depth features from being integrated well, the depth-feature filter weights are fixed, and the model learning rate is fixed, so the tracker cannot adapt to a variety of complex scenes.

Method used

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
View more



Embodiment Construction

[0064] The invention proposes a multi-scene target tracking method based on an adaptive depth feature filter. A target tracking system is implemented in the MATLAB programming language: given a video whose first frame has the target area marked, the system automatically marks the predicted target area in every subsequent frame.
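The first-frame-initialised, tracking-by-detection loop described above can be sketched as follows. This is a minimal skeleton in Python rather than the MATLAB implementation mentioned in the text, and the `train` and `predict` callables are hypothetical placeholders for the patent's filter-update and localisation steps:

```python
def track(frames, init_bbox, train, predict):
    """Minimal tracking-by-detection loop (hedged sketch).

    `train(frame, bbox)` fits or updates the tracking model on the given
    target region; `predict(frame, prev_bbox)` returns the new bounding box
    near the previous one. Only the first box is supplied by hand, matching
    the setup described in the embodiment.
    """
    bbox = init_bbox
    train(frames[0], bbox)          # first frame: manually marked target area
    results = [bbox]
    for frame in frames[1:]:
        bbox = predict(frame, bbox)  # locate the target in the new frame
        train(frame, bbox)           # update the model on the predicted area
        results.append(bbox)
    return results
```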

[0065] Figure 1 shows the video object tracking process of the embodiment of the present invention. The specific implementation steps are as follows:

[0066] 1. Generate training samples. The training sample of the first frame is the manually marked tracking target area, and the training samples of subsequent frames are the predicted target areas. On each training sample, a circulant matrix is used to generate positive and negative samples for training the deep feature filter;
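In correlation-filter trackers, the circulant-matrix construction mentioned in step 1 corresponds to treating all cyclic shifts of the base patch as training samples (the shifts are the rows of a circulant matrix and are normally handled implicitly in the Fourier domain). A hedged illustrative sketch that materialises a few shifts explicitly, not the patent's actual implementation:

```python
import numpy as np

def cyclic_training_samples(patch):
    """Generate dense training samples as cyclic shifts of a base patch.

    Each (dy, dx) shift is one synthetic sample; the shift at (0, 0) is the
    positive (centred) sample and large shifts act as negatives. Real
    trackers never build these explicitly, exploiting circulant structure
    via the FFT instead; this loop is for illustration only.
    """
    h, w = patch.shape
    samples = []
    for dy in range(h):
        for dx in range(w):
            samples.append(np.roll(np.roll(patch, dy, axis=0), dx, axis=1))
    return np.stack(samples)  # shape: (h * w, h, w)
```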

[0067] 2. Adaptively extract foreground objects. The target area contains a lot of background noise, which the Hamming window cannot alleviat...
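The paragraph above is truncated in the source, but the abstract describes computing a foreground probability map of the target area instead of relying on a fixed window. One common way to build such a map, shown here purely as an assumed sketch (the patent does not specify this exact scheme), is a Bayes-style ratio of foreground and background colour histograms:

```python
import numpy as np

def foreground_probability(frame, bbox, bins=8):
    """Per-pixel foreground probability from colour histograms (sketch).

    P(fg | colour) = H_fg(c) / (H_fg(c) + H_bg(c)), where H_fg is built from
    pixels inside the target box and H_bg from all remaining pixels. This is
    an assumed stand-in for the patent's foreground probability map.
    """
    x, y, w, h = bbox
    idx = (frame // (256 // bins)).astype(int)  # quantised colour per pixel
    flat = idx[..., 0] * bins * bins + idx[..., 1] * bins + idx[..., 2]
    fg = flat[y:y + h, x:x + w]
    hist_fg = np.bincount(fg.ravel(), minlength=bins ** 3).astype(float)
    hist_all = np.bincount(flat.ravel(), minlength=bins ** 3).astype(float)
    hist_bg = hist_all - hist_fg
    prob = hist_fg / np.maximum(hist_fg + hist_bg, 1e-8)
    return prob[flat]  # probability map with the frame's spatial size
```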



Abstract

The invention discloses a multi-scene target tracking method based on an adaptive depth feature filter. The method comprises the following steps: converting the target area of the original image from RGB space to a color naming space to reduce the interference of color changes; then calculating a foreground probability map of the target area and training on the features extracted from the foreground area according to this map, which alleviates the boundary effect and effectively suppresses background noise, so that the method can adaptively extract target features. The method trains each of multiple layers of depth features in its own correlation filter, and the weight of each depth feature filter is updated adaptively according to the tracking effect, filter stability, historical response and other information, guiding the tracking model to adaptively select useful depth features in different scenes, so that the target can be tracked robustly in various complex scenes. Compared with the prior art, the method has high robustness and can accurately track the target in various complex scenes.
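The adaptive per-layer weighting described in the abstract can be illustrated with a small sketch. The quality cue below (peak-to-mean ratio of each layer's response map) is an assumed stand-in for the patent's "tracking effect, filter stability, historical response" signals, and the learning-rate update rule is likewise illustrative, not the claimed method:

```python
import numpy as np

def fuse_responses(responses, weights, lr=0.02):
    """Fuse per-layer correlation responses with adaptive weights (sketch).

    Each deep-feature layer produces its own filter response map. Layers
    whose response peak is sharper (higher peak-to-mean ratio) earn larger
    weights; weights are blended with learning rate `lr` and renormalised,
    then the fused map gives the predicted target location.
    """
    quality = np.array([r.max() / (r.mean() + 1e-8) for r in responses])
    weights = (1 - lr) * weights + lr * quality / quality.sum()
    weights = weights / weights.sum()
    fused = sum(w * r for w, r in zip(weights, responses))
    loc = np.unravel_index(np.argmax(fused), fused.shape)
    return fused, weights, loc
```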

Description

Technical field

[0001] The invention belongs to the technical field of computer vision and relates to video target tracking in multimedia technology: a multi-scene target tracking method based on an adaptive depth feature filter for robust tracking of objects in multiple scenes.

Background technique

[0002] The task of video target tracking is to use the marked target area information in the first frame to automatically predict the target area information, including the position and size of the target area, in subsequent frames. Video object tracking is one of the areas that researchers focus on, and has already achieved many practical applications in real life, such as eye tracking, automatic driving, and intelligent monitoring. In general, according to the target tracking model used, target tracking algorithms can be roughly divided into generative-model-based tracking and discriminative-model-based tracking. Generally speaking, a typical generative-model-based object ...


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/62; G06K9/46
CPC: G06V10/443; G06F18/211; G06F18/214
Inventors: 武港山, 徐鹏飞
Owner: NANJING UNIV