
Multi-scale filtering target tracking method based on adaptive feature fusion

A feature-fusion and target-tracking technology, applied in the field of target tracking, that addresses problems such as motion blur and improves the precision, accuracy, and success rate of tracking.

Active Publication Date: 2021-07-27
YANSHAN UNIV

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to solve the problems that the KCF target tracking algorithm encounters under target deformation, rotation, motion blur, and similar conditions, and to improve the accuracy and success rate of the KCF algorithm in target tracking.



Examples


Embodiment Construction

[0033] Hereinafter, embodiments of the present invention will be described with reference to the drawings.

[0034] As shown in Figure 1, the method of the present invention comprises the following steps:

[0035] Step 1. Read and process the video:

[0036] First initialize the network module, then read the network video to obtain Q image frames from the video stream. Convert each RGB image to grayscale, and extract infrared features and depth features to form a three-channel multi-modal image as the template. Then crop a search window of size n×n centered on the position estimated in the previous frame.
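The preprocessing in Step 1 can be sketched as follows. This is a minimal sketch, not the patent's implementation: the luminance weights, edge-padding strategy, and window size are illustrative assumptions.

```python
import numpy as np

def rgb_to_gray(frame):
    """Convert an RGB frame of shape (H, W, 3) to grayscale using the
    standard luminance weights (an assumed convention, not from the patent)."""
    return frame @ np.array([0.299, 0.587, 0.114])

def crop_search_window(frame, center, n):
    """Crop an n x n window centered on the previous frame's estimated
    position, padding with edge values when the window leaves the image."""
    cy, cx = center
    half = n // 2
    # Pad so the crop is always valid near image borders.
    padded = np.pad(frame, ((half, half), (half, half)), mode="edge")
    return padded[cy:cy + n, cx:cx + n]

# Toy example: a 100x100 RGB frame, a 31x31 window around position (50, 50).
frame = np.random.rand(100, 100, 3)
gray = rgb_to_gray(frame)
window = crop_search_window(gray, (50, 50), 31)
print(window.shape)  # (31, 31)
```

The edge padding keeps the crop well-defined even when the target sits near the image boundary, which matters because the search window is recentered every frame.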

[0037] Step 2. Extract features and fuse them. The specific steps are as follows:

[0038] Step 2.1 adopts the VGG19 network trained on ImageNet, shown in Figure 2, for feature extraction. Given an image frame with a search window of size n×n, a fixed spatial size is set and used to adjust the feature size of each convolutional layer. First the fully connected layers are deleted, and the out...
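The per-layer size adjustment described in Step 2.1 (resizing each convolutional layer's feature map to one common spatial size so they can be fused) can be sketched with a simple nearest-neighbor resize. The layer names, channel counts, spatial sizes, and the common size of 28×28 are illustrative assumptions; in practice bilinear interpolation over real VGG19 activations would typically be used.

```python
import numpy as np

def resize_features(feat, out_size):
    """Nearest-neighbor resize of a (C, H, W) feature map to
    (C, out_size, out_size) by sampling source rows/columns."""
    C, H, W = feat.shape
    rows = np.arange(out_size) * H // out_size
    cols = np.arange(out_size) * W // out_size
    return feat[:, rows[:, None], cols[None, :]]

# Simulated feature maps standing in for shallow and deep VGG19 layers
# (shapes are indicative of the feature pyramid, not exact).
conv2_2 = np.random.rand(128, 56, 56)   # fine, shallow layer
conv5_4 = np.random.rand(512, 7, 7)     # coarse, deep layer

common = 28  # assumed common spatial size
aligned = [resize_features(f, common) for f in (conv2_2, conv5_4)]
print([a.shape for a in aligned])  # [(128, 28, 28), (512, 28, 28)]
```

Once all layers share one spatial grid, their responses can be compared and fused position-by-position, which is what the coarse-to-fine weighting in the later steps relies on.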



Abstract

The invention provides a multi-scale filtering target tracking method based on adaptive feature fusion, and relates to the technical field of target tracking. The method comprises the following steps. First, initialize a network model and read a video to obtain a plurality of image frames. For target position prediction, select image blocks around the target, extract VGG-19 network-layer features and CN (color-naming) features, broadcast the CN features, and concatenate them with the conv2-2 layer features of the VGG-19 network. The three layers of features are then classified by a classifier, responses are output through a filtering model, the target position is estimated by weighting and fusing the responses from coarse to fine with empirical weights, and the target filter parameters are updated. The method performs filtering by combining the hierarchical features of the VGG19 network with the color-attribute CN features, outputs a response for each, and, taking the maximum response as the reference, estimates the target position by fusing the weighted responses from coarse to fine. OPE (one-pass evaluation) results show that the method is reliable and highly accurate under target deformation, illumination variation, motion blur, and in-plane rotation.
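The coarse-to-fine weighted fusion of per-layer responses described in the abstract can be sketched as follows. This is a minimal sketch under stated assumptions: the empirical weights, map sizes, and toy single-peak responses are all illustrative, and real response maps would come from correlation filters applied to each feature layer.

```python
import numpy as np

def fuse_coarse_to_fine(responses, weights):
    """Weighted sum of per-layer correlation response maps, ordered from
    coarse (deep layers) to fine (shallow layers); the peak of the fused
    map is taken as the estimated target position."""
    fused = sum(w * r for w, r in zip(weights, responses))
    return tuple(int(i) for i in np.unravel_index(np.argmax(fused), fused.shape))

# Toy responses: each layer votes with a single peak on a 31x31 map.
r_deep, r_mid, r_shallow = (np.zeros((31, 31)) for _ in range(3))
r_deep[15, 16] = 1.0     # coarse localization from a deep layer
r_mid[15, 15] = 1.0
r_shallow[16, 16] = 1.0  # fine refinement from a shallow layer

# Hypothetical empirical weights, decreasing from coarse to fine.
pos = fuse_coarse_to_fine([r_deep, r_mid, r_shallow], [1.0, 0.5, 0.25])
print(pos)  # (15, 16)
```

Weighting the deep-layer response most heavily matches the coarse-to-fine idea in the abstract: semantically robust deep features anchor the rough position, and shallower, spatially precise responses refine it.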

Description

technical field

[0001] The invention relates to target tracking in the field of artificial intelligence, in particular to a multi-scale filtering target tracking method based on adaptive feature fusion.

Background technique

[0002] With the rapid development of artificial intelligence technology, object tracking is increasingly applied in real life to video surveillance, human-computer interaction, motion analysis, activity recognition, and more. At the same time, it faces problems such as rapid target movement, target occlusion, complex background changes, shape changes, and noise interference.

[0003] Common target tracking algorithms fall mainly into two categories: those based on deep features and those based on correlation filtering. In the deep-feature-based algorithms, deep features extract good image features, but the traini...


Application Information

IPC(8): G06T7/246; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/246; G06N3/08; G06T2207/10016; G06T2207/20081; G06T2207/20084; G06T2207/20221; G06N3/045; G06F18/241; G06F18/253; G06F18/214
Inventor 张立国杨曼李枫金梅周思恩刘强李媛媛马子荐张淑清
Owner YANSHAN UNIV