Target tracking method based on multi-feature map fusion and multi-scale expansion convolution

A target tracking and feature-map technology applied in the field of image processing. It addresses problems such as illumination change, background-similarity interference, and target deformation that degrade prediction accuracy, and achieves strong perception and improved accuracy.

Publication Date: 2021-07-16 (Inactive)
CHONGQING UNIV OF POSTS & TELECOMM

Problems solved by technology

[0005] Although many tracking methods exist, real-world scenes still pose numerous challenges, such as illumination changes, interference from similar backgrounds, occlusion, and target deformation. Previous tracking methods based on convolutional neural networks have greatly improved real-time performance, but problems remain with target deformation and prediction accuracy.


Embodiment Construction

[0036] The technical solutions in the embodiments of the present invention will be described clearly and in detail below with reference to the accompanying drawings. The described embodiments are only some of the embodiments of the invention.

[0037] The technical solution by which the present invention solves the above-described problems is as follows:

[0038] Specific steps:

[0039] S1. Feed the target-area image, whose position is marked in the initial frame of the video, and the search-area image of the current frame into the same feature extraction network, obtaining three output feature maps for each;
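
As an illustration of step S1, the sketch below uses a small shared (Siamese-style) backbone that exposes three intermediate feature maps for both the target-area image and the search-area image. The stage depths, channel widths, and input sizes are assumptions chosen for illustration, not the network specified by the patent.

```python
import torch
import torch.nn as nn

class SharedBackbone(nn.Module):
    """Toy feature extractor that exposes three intermediate feature maps.
    Depths and channel widths are illustrative assumptions."""
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
        self.stage3 = nn.Sequential(nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU())

    def forward(self, x):
        f1 = self.stage1(x)
        f2 = self.stage2(f1)
        f3 = self.stage3(f2)
        return f1, f2, f3          # the three output feature maps of step S1

backbone = SharedBackbone()                 # the *same* network processes both inputs
template = torch.randn(1, 3, 127, 127)      # target-area image marked in the initial frame
search = torch.randn(1, 3, 255, 255)        # search-area image from the current frame
z_feats = backbone(template)
x_feats = backbone(search)
```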

[0040] S2. Convolve and fuse the three output feature maps of the target-area image and of the search-area image, respectively, to obtain a fused feature map for each;
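
One plausible reading of step S2, continuing the sketch above: project the three stage outputs to a common channel width with 1×1 convolutions, resize them to a common spatial size, and sum them. The projection width, the resizing, and the use of summation rather than concatenation are assumptions, not details taken from the patent.

```python
import torch.nn as nn
import torch.nn.functional as F

class FuseThree(nn.Module):
    """Fuses three feature maps of different depths into one map (assumed design)."""
    def __init__(self, in_chs=(64, 128, 256), out_ch=256):
        super().__init__()
        self.proj = nn.ModuleList(nn.Conv2d(c, out_ch, kernel_size=1) for c in in_chs)

    def forward(self, feats):
        target_size = feats[-1].shape[-2:]   # align everything to the deepest map
        aligned = [
            F.interpolate(p(f), size=target_size, mode="bilinear", align_corners=False)
            for p, f in zip(self.proj, feats)
        ]
        return sum(aligned)                  # one fused feature map per input image

fuse = FuseThree()
z_fused = fuse(z_feats)    # fused template (target-area) features
x_fused = fuse(x_feats)    # fused search-area features
```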

[0041] S3. Perform a cross-correlation operation on the two fused feature maps to obtain two feature response maps;
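
The cross-correlation of step S3 is commonly realized in Siamese trackers as a depth-wise cross-correlation, in which the template features slide over the search features as a per-channel kernel. The sketch below assumes that formulation; it is one standard building block, not necessarily the exact operation claimed in the patent, and how the second response map is formed is not visible in this excerpt.

```python
import torch.nn.functional as F

def depthwise_xcorr(search_feat, template_feat):
    """Depth-wise cross-correlation: each template channel is correlated with the
    corresponding search channel (implemented via grouped convolution)."""
    b, c, h, w = search_feat.shape
    kernel = template_feat.reshape(b * c, 1, *template_feat.shape[-2:])
    out = F.conv2d(search_feat.reshape(1, b * c, h, w), kernel, groups=b * c)
    return out.reshape(b, c, out.shape[-2], out.shape[-1])

response = depthwise_xcorr(x_fused, z_fused)   # one feature response map
# The patent obtains two response maps at this step; only one correlation is sketched here.
```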

[0042] S4. Pass the two feature response maps through three expansion (dilated) convolution layers of different scales to obtain feature response maps with different receptive fields...
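
Steps S4 and S5 (the point-by-point fusion described in the abstract) can be sketched as three parallel expansion (dilated) convolutions with different dilation rates whose outputs are summed element-wise. The dilation rates, kernel size, and channel width below are assumptions chosen for illustration.

```python
import torch.nn as nn

class MultiScaleDilated(nn.Module):
    """Three parallel dilated convolutions with different receptive fields,
    fused point by point (illustrative dilation rates 1, 2, 3)."""
    def __init__(self, channels=256):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=d, dilation=d)
            for d in (1, 2, 3)
        )

    def forward(self, x):
        outs = [branch(x) for branch in self.branches]   # different receptive fields
        return outs[0] + outs[1] + outs[2]               # point-by-point (element-wise) fusion

msd = MultiScaleDilated()
fused_response = msd(response)   # response map enriched with multi-scale context
```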


Abstract

The invention discloses a target tracking method based on multi-feature map fusion and multi-scale expansion convolution. The method comprises the following steps: S1, feeding the to-be-tracked target-region image, whose position is marked in the initial frame, and the current-frame search-region image into the same feature extraction network, and obtaining three output feature maps for each; S2, fusing the three output feature maps of each of the two images; S3, performing a cross-correlation operation on the two fused feature maps to obtain two feature response maps; S4, passing the two feature response maps through three expansion convolution layers of different scales to obtain feature response maps with different receptive fields; S5, fusing the correlated features point by point; S6, sending the fused feature map into a classification branch and a regression branch, respectively; and S7, predicting the position of the target in the current frame by combining the maximum-response region of the classification branch with the target movement amount from the regression branch. The method improves the robustness and accuracy of tracking under complex conditions such as large changes in target scale.
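
For steps S6 and S7, a minimal sketch of the two heads: a classification branch whose maximum-response location gives the coarse target position, and a regression branch read out at that location for the movement/box adjustment. The output dimensionalities (2 classification channels, 4 regression channels) and the peak-picking logic are assumptions in the spirit of common Siamese trackers, not the patent's exact formulation.

```python
import torch
import torch.nn as nn

class TrackingHeads(nn.Module):
    """Classification + regression branches over the fused response map (assumed design)."""
    def __init__(self, channels=256):
        super().__init__()
        self.cls = nn.Conv2d(channels, 2, kernel_size=1)   # background/foreground score map
        self.reg = nn.Conv2d(channels, 4, kernel_size=1)   # per-location box offsets

    def forward(self, fused):
        return self.cls(fused), self.reg(fused)

heads = TrackingHeads()
cls_map, reg_map = heads(fused_response)

# S7 (sketch): locate the maximum classification response and read the regression
# output there to predict the target's position in the current frame.
score = cls_map.softmax(dim=1)[:, 1]                             # foreground probability
idx = torch.argmax(score.flatten(1), dim=1)
row = torch.div(idx, score.shape[-1], rounding_mode="floor")
col = idx % score.shape[-1]
offsets = reg_map[0, :, row[0], col[0]]                          # predicted movement amount
```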

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a target tracking method based on convolutional-neural-network multi-feature map fusion and multi-scale expansion convolution.

Background Technique

[0002] Target tracking belongs to the field of computer vision and has a wide range of applications in military, security, entertainment, and many other areas. Target tracking methods fall into two main categories: tracking methods based on traditional techniques and tracking methods based on convolutional neural networks.

[0003] Among the traditional methods, correlation filtering is the most representative. Its core idea is to use the circulant matrix of the region around the target to construct positive and negative sample sets during training, train a target detector by ridge regression, and use the detector to identify the position of the target during t...
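
As background context for [0003] (not the patent's method): the circulant-sample ridge regression used in correlation-filter trackers has a closed-form solution in the Fourier domain, where training reduces to an element-wise division. A minimal single-channel, MOSSE/KCF-style sketch with illustrative sizes:

```python
import numpy as np

def train_correlation_filter(patch, desired_response, lam=1e-2):
    """Ridge regression over circular shifts of `patch`, solved in the Fourier domain:
    w_hat = conj(x_hat) * y_hat / (conj(x_hat) * x_hat + lambda)."""
    x_hat = np.fft.fft2(patch)
    y_hat = np.fft.fft2(desired_response)
    return (np.conj(x_hat) * y_hat) / (np.conj(x_hat) * x_hat + lam)

def detect(filter_hat, patch):
    """Correlation response for a new patch; the peak indicates the target location."""
    response = np.real(np.fft.ifft2(filter_hat * np.fft.fft2(patch)))
    return np.unravel_index(np.argmax(response), response.shape)

# Toy usage: a Gaussian label centred on the target defines the desired response.
size = 64
yy, xx = np.mgrid[0:size, 0:size]
label = np.exp(-((yy - size // 2) ** 2 + (xx - size // 2) ** 2) / (2 * 3.0 ** 2))
patch = np.random.rand(size, size)
f_hat = train_correlation_filter(patch, label)
peak_row, peak_col = detect(f_hat, patch)
```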


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/33, G06N3/04
CPC: G06T7/33, G06T2207/20221, G06T2207/10016, G06N3/045
Inventors: 李伟生, 朱俊烨
Owner: CHONGQING UNIV OF POSTS & TELECOMM