Twin network target tracking method and system based on convolutional self-attention module

A twin (Siamese) network target tracking technology, applied to biological neural network models, neural learning methods, and character and pattern recognition. It addresses problems such as difficulty in handling complex appearance changes, redundant background information, and loss of foreground information.

Active Publication Date: 2021-11-26
NANCHANG INST OF TECH

AI Technical Summary

Problems solved by technology

[0005] In view of the above, it is necessary to address the problem that some prior-art visual tracking algorithms ignore the context information generated in the time dimension across consecutive frames. This results in the loss of a large amount of foreground information and the generation of redundant background information, which makes it difficult to cope with complex appearance changes.




Embodiment Construction

[0056] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, where the same or similar reference numerals designate the same or similar elements, or elements having the same or similar functions, throughout. The embodiments described below with reference to the figures are exemplary; they serve only to explain the present invention and should not be construed as limiting it.

[0057] These and other aspects of embodiments of the invention will become apparent from the following description and drawings. The description and drawings specifically disclose some particular implementations of the embodiments of the present invention to illustrate ways of implementing their principles, but it should be understood that the scope of the embodiments is not limited thereby. On the contrary, the embodiments of the present...



Abstract

The invention provides a twin network target tracking method and system based on a convolutional self-attention module. The method comprises the following steps: constructing a feature fusion network model; learning, through a convolutional neural network, the target image features on the template branch and local regions within the search-region target image features to obtain the corresponding local semantic information, and aggregating the local semantic information to obtain global context information; pre-training the feature fusion network model; using the pre-trained feature fusion network model to extract the template-branch target image features and the search-region target image features, and feeding them into the classification and regression branches of an anchor-based region proposal network; performing depthwise cross-correlation convolution to obtain similarity scores; and tracking the target candidate block with the maximum similarity score. The invention improves the accuracy of global matching between the target image and the search-region target image, achieving more accurate tracking.
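The depthwise cross-correlation step named in the abstract can be sketched as follows. This is a minimal illustration of the generic operation as used in Siamese-RPN-style trackers, not the patent's exact implementation; the function name, channel counts, and feature-map sizes are assumptions for the example:

```python
import torch
import torch.nn.functional as F

def depthwise_xcorr(search_feat, template_feat):
    """Depthwise cross-correlation: each channel of the template features
    acts as a single-channel kernel slid over the corresponding channel of
    the search-region features, yielding a per-channel response map."""
    b, c, h, w = search_feat.shape
    _, _, kh, kw = template_feat.shape
    # Fold the batch dimension into channels so a single grouped conv
    # correlates each sample with its own template.
    x = search_feat.reshape(1, b * c, h, w)
    kernel = template_feat.reshape(b * c, 1, kh, kw)
    out = F.conv2d(x, kernel, groups=b * c)
    return out.reshape(b, c, out.size(2), out.size(3))

# Toy shapes: 6x6 template features, 22x22 search-region features.
z = torch.randn(2, 256, 6, 6)    # template-branch features
x = torch.randn(2, 256, 22, 22)  # search-region features
resp = depthwise_xcorr(x, z)
print(resp.shape)  # torch.Size([2, 256, 17, 17])
```

The response map (spatial size 22 - 6 + 1 = 17) would then feed the classification and regression heads of the region proposal network, where anchors at each spatial position are scored.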

Description

Technical field

[0001] The present invention relates to the technical field of computer vision and image processing, and in particular to a twin network target tracking method and system based on a convolutional self-attention module.

Background technique

[0002] Visual tracking has many fundamental applications in computer vision and image processing, for example autonomous driving, video surveillance, traffic vehicle monitoring, and human-computer interaction. As tracking becomes more practical and real-time, tracking-related applications are increasingly common in daily life, which makes research on video tracking technology increasingly valuable.

[0003] In general, visual tracking remains a challenging task for many reasons, such as appearance change, deformation, fast motion, and occlusion. In recent years, Siamese network trackers based on convolutional neural networks (CNNs) have been wide...
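The "convolutional self-attention" idea referenced in the title, i.e. aggregating local semantic information from convolutional features into global context, can be sketched with a standard non-local-style attention block. This is my own illustrative assumption of such a module, not the patent's disclosed design; the class name, reduction ratio, and shapes are hypothetical:

```python
import torch
import torch.nn as nn

class ConvSelfAttention(nn.Module):
    """Sketch of a convolutional self-attention block: 1x1 convolutions
    produce query/key/value maps, and attention over all spatial positions
    aggregates local semantics into global context information."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // reduction, 1)
        self.key = nn.Conv2d(channels, channels // reduction, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)  # (b, hw, c//r)
        k = self.key(x).flatten(2)                    # (b, c//r, hw)
        attn = torch.softmax(q @ k, dim=-1)           # (b, hw, hw)
        v = self.value(x).flatten(2)                  # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).reshape(b, c, h, w)
        return self.gamma * out + x                   # residual connection

feat = torch.randn(1, 64, 13, 13)   # convolutional features
ctx = ConvSelfAttention(64)(feat)   # same shape, context-enriched
print(ctx.shape)  # torch.Size([1, 64, 13, 13])
```

Because `gamma` starts at zero, the block initially passes features through unchanged and learns how much global context to mix in during training, a common design choice for attention blocks inserted into pre-trained backbones.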

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045; G06F18/22; G06F18/24; G06F18/253; G06F18/214
Inventors: 王军, 孟晨晨
Owner: NANCHANG INST OF TECH