
Object Tracking Method Based on Self-Attention Transformation Network

A target tracking technology based on self-attention, applied in the field of target tracking. It removes the method's sensitivity to prior anchor-box parameter settings and achieves a stable tracking effect.

Active Publication Date: 2022-02-18
中国人民解放军32802部队
Cites: 4 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0004] To solve the prior-anchor-box problem in existing video object tracking methods, or to avoid using prior anchor boxes altogether, so that the method's performance on test sets and in practical applications is not sensitive to prior anchor-box parameter settings and the tracking is stable and accurate, the present invention discloses a target tracking method based on a self-attention transformation network. The method is realized with a search image encoding module, a target encoding module, a decoding module, and a supervision module.




Detailed Description of the Embodiments

[0028] The present invention is described in further detail below in conjunction with the accompanying drawings and a specific implementation case:

[0029] Referring to Figure 1, the present invention discloses a target tracking method based on a self-attention transformation network, which is realized using a search image encoding module, a target encoding module, a decoding module, and a supervision module.
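As a minimal structural sketch of how these four modules could fit together, the following PyTorch code wires standard transformer components in the arrangement described; the class name, hyperparameters, pooling step, and the use of stock nn.TransformerEncoderLayer blocks are hypothetical choices, not the patent's exact design.

```python
# Hypothetical sketch of the four-module tracker described above; module
# names, dimensions, and the single-box pooling are assumptions.
import torch
import torch.nn as nn

class SelfAttentionTracker(nn.Module):
    def __init__(self, d_model=256, n_heads=8, dim_ff=1024):
        super().__init__()
        # Search image encoding module: multi-head self-attention in series
        # with a feed-forward network (one standard transformer encoder layer).
        self.search_encoder = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=dim_ff, batch_first=True)
        # Target encoding module: self-attention over target tokens (the
        # patent's masked variant is approximated by a plain encoder here).
        self.target_encoder = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=dim_ff, batch_first=True)
        # Decoding module: the target feature encoding queries into the
        # search image feature encoding via multi-head (cross-)attention.
        self.cross_attention = nn.MultiheadAttention(
            d_model, n_heads, batch_first=True)
        # Prediction head: position coordinates of the target prediction box.
        self.box_head = nn.Linear(d_model, 4)

    def forward(self, search_tokens, target_tokens):
        f_search = self.search_encoder(search_tokens)    # search image block encoding
        f_target = self.target_encoder(target_tokens)    # target feature encoding
        matched, _ = self.cross_attention(f_target, f_search, f_search)
        return self.box_head(matched.mean(dim=1))        # (batch, 4) box per clip

# Smoke test with random token embeddings (196 search tokens, 16 target tokens).
boxes = SelfAttentionTracker()(torch.randn(2, 196, 256), torch.randn(2, 16, 256))
```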

[0030] The search image encoding module is implemented as a multi-head self-attention network and a feed-forward network in series, where the multi-head self-attention network is formed by the parallel connection of n self-attention networks. When a search image block is input into the search image encoding module, the module computes the search image block encoding through the multi-head self-attention network and the feed-forward network. The calculation formula for the search image block encoding is:

[0031]

[0032] Wherein X is the search imag...
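For reference, a standard multi-head self-attention encoder of the kind described computes the following (a hedged reconstruction in conventional transformer notation, not necessarily the patent's own formula; layer normalization is omitted for brevity):

```latex
% Hedged reconstruction of a standard multi-head self-attention encoder.
% X is the search image block embedding; W_i^Q, W_i^K, W_i^V, W^O are
% learned projections; n is the number of parallel self-attention heads.
\[
  \mathrm{head}_i = \mathrm{softmax}\!\left(
    \frac{(X W_i^{Q})(X W_i^{K})^{\top}}{\sqrt{d_k}}
  \right) X W_i^{V},
  \qquad i = 1, \dots, n
\]
\[
  Z = X + \mathrm{Concat}(\mathrm{head}_1, \dots, \mathrm{head}_n)\, W^{O},
  \qquad
  F_{\text{search}} = Z + \mathrm{FFN}(Z)
\]
```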



Abstract

The invention discloses a target tracking method based on a self-attention transformation network, realized with a search image encoding module, a target encoding module, a decoding module, and a supervision module. The search image encoding module is implemented as a multi-head self-attention network and a feed-forward network in series, and computes the search image block encoding through these two networks. The target encoding module computes the target feature encoding through a masked multi-head self-attention network. The decoding module, based on the target feature encoding, performs query matching in the search image feature encoding through a multi-head attention network to compute the position coordinates of the target prediction box. The supervision module computes the error between the position information of the target prediction box and the real target position information, and the neural network parameters obtained when this error is minimized are used for target tracking. The invention has a stable tracking effect, more easily captures deformed targets in search images, and generates accurate tracking results.
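As a sketch of the supervision step just described, the code below minimizes an L1 error between the predicted and ground-truth box coordinates; the loss choice and the (x, y, w, h) box format are assumptions, since the abstract does not specify the error measure.

```python
# Hypothetical sketch of the supervision module: compute the error between
# the target prediction box and the real target position, and minimize it.
# The L1 loss and (x, y, w, h) box format are assumptions.
import torch
import torch.nn.functional as F

def supervision_error(pred_box: torch.Tensor, gt_box: torch.Tensor) -> torch.Tensor:
    """pred_box, gt_box: (batch, 4) position coordinates of the boxes."""
    return F.l1_loss(pred_box, gt_box)

# One training step (model and optimizer assumed to exist, e.g. the tracker
# sketch earlier in this document):
#   error = supervision_error(model(search_tokens, target_tokens), gt_boxes)
#   error.backward(); optimizer.step()
# The parameters obtained at minimum error are then used for tracking.
```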

Description

Technical Field

[0001] The invention relates to the field of target tracking methods, and in particular to a target tracking method based on a self-attention transformation network.

Background Technique

[0002] With the development and popularization of video equipment, many fields based on video analysis have emerged, such as security monitoring, traffic control, and autonomous driving, and their development will greatly improve quality of life. Object tracking technology is one of the important components of the video analysis field. Although it has achieved excellent tracking results, it often fails to track in complex real-world scenes. Therefore, current object tracking technology needs to further improve its ability to adapt to actual scenes.

[0003] In the target tracking task, the target to be tracked is arbitrarily specified, and the shape of the target is also arbitrary. In order to cover objects of arbitrary shapes, current tracki...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/246
CPC: G06T7/248; G06T2207/10016; G06T2207/20081; G06T2207/20084
Inventors: 赵健, 温志津, 刘阳, 鲍雁飞, 雍婷, 范娜娜, 李晋徽, 晋晓曦, 张清毅
Owner: 中国人民解放军32802部队