A moving object tracking method based on multi-feature fusion

A multi-feature fusion moving-target tracking technology, applied in image data processing, instruments, character and pattern recognition, etc., which addresses problems such as tracking failure, the inability of a single feature to fully express the target, and large differences in tracking performance across scenarios.

Active Publication Date: 2019-01-29
KUNMING UNIV OF SCI & TECH

Problems solved by technology

[0004] The technical problem to be solved by the present invention is to provide a moving target tracking method based on multi-feature fusion, which overcomes the defects of existing single-feature target descriptions, namely that a single feature cannot fully express the target and that tracking performance varies greatly across scenarios, and which solves the problem that, when the filter model is fixedly updated frame by frame, erroneous information is easily added to the target model and causes tracking failure.

Examples

Embodiment 1

[0063] Embodiment 1: As shown in figure 1, the specific steps of the moving target tracking method based on multi-feature fusion are as follows:

[0064] Step1. Initialize the target and select the target area;

[0065] Step2. Extract the Histogram of Oriented Gradient (HOG) feature of the target area as one training sample, and extract the Color Name (CN) feature of the target area as another training sample. Train a separate position filter model with each of the two training samples;
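The excerpt does not spell out the HOG/CN extraction or the filter formulation, so the sketch below assumes a single-channel, MOSSE/DCF-style ridge regression trained in the Fourier domain; `feature_map` stands in for one feature channel (HOG or CN) of the selected target area, and `gaussian_label`, `train_position_filter` and the regularisation default are illustrative names, not taken from the patent.

```python
# Minimal single-channel correlation-filter training sketch (assumption:
# MOSSE/DCF-style ridge regression; the patent's exact formulation is not given).
import numpy as np

def gaussian_label(shape, sigma=2.0):
    """Desired correlation output: a Gaussian peak centred on the target patch."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = h // 2, w // 2
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))

def train_position_filter(feature_map, sigma=2.0, lam=1e-3):
    """Train one position filter from one 2-D feature map (e.g. HOG or CN energy).

    Returns the filter in the Fourier domain so the detection step can reuse it.
    """
    y = gaussian_label(feature_map.shape, sigma)
    X = np.fft.fft2(feature_map)
    Y = np.fft.fft2(y)
    # Closed-form ridge-regression solution of a single-channel correlation filter.
    return (Y * np.conj(X)) / (X * np.conj(X) + lam)

# One filter per feature, trained on the initialised target area:
# H_hog = train_position_filter(hog_map)   # hog_map / cn_map are hypothetical
# H_cn  = train_position_filter(cn_map)    # outputs of a feature extractor
```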

[0066] Step3. Extract the two kinds of features from the target area of the new frame to obtain two detection samples, and calculate the correlation score of each detection sample with the corresponding position filter trained in the previous step, that is, obtain the response map of each feature;
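Continuing the same single-channel sketch, detection in Step3 is a correlation in the Fourier domain between each feature's detection sample and its trained filter, which yields one response map per feature.

```python
# Sketch of Step3 (reuses numpy and the filters from the Step2 sketch).
import numpy as np

def detect(feature_map, H):
    """Correlation response map of a detection sample against a trained filter."""
    Z = np.fft.fft2(feature_map)
    # The location of the peak (relative to the patch centre) gives the target shift.
    return np.real(np.fft.ifft2(Z * H))

# response_hog = detect(hog_map_new_frame, H_hog)   # hypothetical variable names
# response_cn  = detect(cn_map_new_frame,  H_cn)
```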

[0067] Step4. Calculate the peak-to-sidelobe ratio (PSR) of each feature's response map, weight and fuse the two feature response maps according to their PSRs, and take the point with the largest fused response value as the current center position of the target;
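The weighting formula is not given in this excerpt; one plausible reading of Step4 normalises each feature's PSR into a fusion weight and takes the fused peak as the new target centre, as sketched below (the sidelobe exclusion window is an assumed detail).

```python
# Sketch of Step4: PSR per response map, then PSR-weighted fusion.
import numpy as np

def psr(response, exclude=5):
    """MOSSE-style peak-to-sidelobe ratio: (peak - sidelobe mean) / sidelobe std."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - exclude):py + exclude + 1,
         max(0, px - exclude):px + exclude + 1] = False   # exclude the peak region
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

def fuse_responses(response_hog, response_cn):
    """Weight the two response maps by their PSRs and locate the fused peak."""
    p_hog, p_cn = psr(response_hog), psr(response_cn)
    w_hog = p_hog / (p_hog + p_cn)
    w_cn = 1.0 - w_hog
    fused = w_hog * response_hog + w_cn * response_cn
    dy, dx = np.unravel_index(fused.argmax(), fused.shape)   # current centre offset
    return fused, (dy, dx)
```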

Abstract

The invention relates to a moving object tracking method based on multi-feature fusion, belonging to the field of computer vision. First, in the first frame image, the target area is initialized and two position filters are respectively trained using the histogram of oriented gradients and the color features. Second, detection samples of the two features are extracted around the target in the subsequent frame, and the correlation scores between the two detection samples and the position filters trained in the previous step are calculated respectively, that is, the response maps of the different features are obtained. Third, according to the peak-to-sidelobe ratios of the different feature response maps, the two feature response values are weighted and fused, and the point with the largest response value is selected as the current center position of the target. Then, a scale pyramid is constructed and a scale filter is trained using the histogram of oriented gradients feature, and the maximum response point is taken as the current scale of the target. Finally, according to the peak-to-sidelobe ratio of the final response map of each frame, it is judged whether occlusion occurs; in the case of occlusion, the position filter is not updated.
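The abstract's occlusion handling (do not update the position filter when the final response map looks unreliable) can be sketched as a simple PSR threshold on the fused response; the threshold value and the linear-interpolation update rule below are assumptions rather than values taken from the patent, and `psr` / `train_position_filter` refer to the earlier sketches.

```python
# Sketch of the PSR-gated model update described in the abstract (assumed
# threshold and learning rate; reuses psr() and train_position_filter() above).
def update_position_filter(H_old, feature_map, fused_response,
                           psr_threshold=8.0, lr=0.02, sigma=2.0, lam=1e-3):
    if psr(fused_response) < psr_threshold:
        return H_old                              # occlusion suspected: freeze model
    H_new = train_position_filter(feature_map, sigma, lam)
    return (1.0 - lr) * H_old + lr * H_new        # otherwise blend in the new frame
```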

Description

Technical field

[0001] The invention discloses a moving target tracking method based on multi-feature fusion, which belongs to the field of computer vision.

Background technique

[0002] Object tracking is a hot topic in the field of computer vision and is widely used in video surveillance, robot learning, industrial intelligence, etc. Its essence is to find the position and state of the target in a continuous sequence of video images. Although object tracking has made great progress, it remains a challenging problem due to factors such as occlusion, illumination change, and scale change.

[0003] In recent years, owing to the remarkable effect of correlation filter algorithms, many scholars have introduced correlation filters into the target tracking framework. Feature selection in correlation-filter target tracking algorithms has a great influence on tracking performance. Among them, the Minimum Output Sum of Squared Error (MOSSE) algorithm proposed by ...

Claims

Application Information

IPC(8): G06T7/246; G06K9/62
CPC: G06T7/246; G06T2207/10016; G06F18/253
Inventor: 尚振宏, 益争祝玛
Owner: KUNMING UNIV OF SCI & TECH