
Depth correlation target tracking algorithm based on mutual reinforcement and multi-attention mechanism learning

A target tracking and attention mechanism technology, applied in the field of image processing, which addresses problems such as information redundancy and the insufficient expressive power of a single low-level feature, achieving the effects of improved robustness, improved effective distribution, and alleviated boundary effects.

Active Publication Date: 2019-08-13
NANJING UNIV OF INFORMATION SCI & TECH

AI Technical Summary

Problems solved by technology

[0004] Aiming at the problems of the insufficient expressive power of a single low-level feature and redundant information in current tracking algorithms, the present invention proposes a depth correlation target tracking algorithm based on mutual reinforcement and multi-attention mechanism learning.



Examples


Embodiment

[0034] The process of the depth correlation tracking algorithm based on mutual reinforcement and multi-attention mechanism learning provided in this embodiment is shown in Figures 1 to 5 and specifically includes the following steps:

[0035] S1: Input the previous frame to get the target area;

[0036] S2: Build a feature extractor suited to tracking and use it to extract features of the target area from S1, as shown in Figure 3. The feature extractor consists of a convolutional network, an encoder and a decoder, and is an improvement on the original DCFNet tracking algorithm: the original DCFNet contains only the shallow features produced by two convolutions, so an encoder-decoder (EDNet) structure is added to extract high-level semantic information. The high-level information is combined with the shallow features, and the fused features are then fed into channel attention and spatial attention mechanisms, as shown in Figure ...
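The paragraph above only names the components of the feature extractor. Below is a minimal PyTorch sketch of how such an extractor could be wired together: a two-convolution shallow branch (as in DCFNet), an encoder-decoder branch for higher-level semantics, and channel and spatial attention applied to the fused features. All layer sizes, strides and module names are illustrative assumptions, not values taken from the patent.

# Hypothetical sketch of the S2 feature extractor described above.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(x)              # re-weight each channel


class SpatialAttention(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)          # average over channels
        mx, _ = x.max(dim=1, keepdim=True)         # max over channels
        mask = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask                            # re-weight each location


class FeatureExtractor(nn.Module):
    def __init__(self):
        super().__init__()
        # shallow branch: two convolutions, as in the original DCFNet
        self.shallow = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1),
        )
        # encoder-decoder branch ("EDNet") for high-level semantics
        self.encoder = nn.Sequential(
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.channel_att = ChannelAttention(64)
        self.spatial_att = SpatialAttention()

    def forward(self, x):
        shallow = self.shallow(x)
        semantic = self.decoder(self.encoder(shallow))
        fused = torch.cat([shallow, semantic], dim=1)   # fuse shallow + semantic
        fused = self.channel_att(fused)
        fused = self.spatial_att(fused)
        return fused


if __name__ == "__main__":
    feats = FeatureExtractor()(torch.randn(1, 3, 128, 128))
    print(feats.shape)     # torch.Size([1, 64, 128, 128])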



Abstract

The invention discloses a depth correlation target tracking algorithm based on mutual reinforcement and multi-attention mechanism learning. The algorithm comprises the following steps: inputting the target area of the previous frame and the search area of the next frame; initializing the parameters of the feature extractor, learning those parameters by gradient descent on a mean square error loss, and extracting features from the search area with the feature extractor; computing the autocorrelation of the features and learning a filter template through the closed-form solution of ridge regression; determining the search area of the next frame from the target position estimated in the previous frame, extracting its features with the designed feature extractor, and computing the cross-correlation between the features of the target area and the search area; correlating the features with the filter template, where the maximum of the output response gives the latest target position; and learning and updating the target filter template in every frame.
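As a reading aid for the correlation-filter steps in the abstract, below is a minimal NumPy sketch of learning a filter template through the ridge-regression closed-form solution in the Fourier domain, cross-correlating it with search-area features, and taking the peak of the response as the new target position. The Gaussian label, feature shapes and regularization value are assumptions for illustration, not values from the patent.

# Minimal correlation-filter sketch of the pipeline in the abstract.
import numpy as np


def gaussian_label(h, w, sigma=2.0):
    """Desired response: a 2-D Gaussian peaked at the map centre."""
    ys, xs = np.mgrid[0:h, 0:w]
    return np.exp(-((ys - h // 2) ** 2 + (xs - w // 2) ** 2) / (2 * sigma ** 2))


def learn_filter(feat, label, lam=1e-4):
    """Closed-form ridge regression in the Fourier domain.

    feat:  (C, H, W) target-area features from the feature extractor
    label: (H, W)    desired Gaussian response
    """
    F = np.fft.fft2(feat, axes=(-2, -1))
    Y = np.fft.fft2(label)
    # autocorrelation of the features, summed over channels
    kxx = np.sum(F * np.conj(F), axis=0)
    return np.conj(F) * Y / (kxx + lam)        # filter template, (C, H, W)


def detect(template, search_feat):
    """Correlate the template with search-area features; the response peak
    (its offset from the map centre) gives the new target position."""
    Z = np.fft.fft2(search_feat, axes=(-2, -1))
    response = np.real(np.fft.ifft2(np.sum(template * Z, axis=0)))
    return np.unravel_index(np.argmax(response), response.shape), response


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target_feat = rng.standard_normal((64, 32, 32))
    search_feat = np.roll(target_feat, shift=(3, -2), axis=(1, 2))  # shifted target
    w = learn_filter(target_feat, gaussian_label(32, 32))
    (py, px), _ = detect(w, search_feat)
    print("response peak at", py, px)          # offset from centre reflects the shift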

Description

Technical field

[0001] The invention belongs to the field of image processing, and in particular relates to a depth correlation target tracking algorithm based on mutual reinforcement and multi-attention mechanism learning.

Background technique

[0002] Object tracking is one of the core problems in the field of computer vision and has a wide range of applications, such as human motion analysis, video surveillance and automatic driving. Although a large number of tracking algorithms have been proposed for various scenarios, robust visual tracking systems remain elusive due to factors such as deformation, occlusion, illumination changes, background clutter and fast motion.

[0003] In recent years, many target tracking algorithms based on correlation and deep features have emerged, which perform well on single-target tracking in video. A representative one is the deep correlation tracking algorithm of mutual reinforcement and multi-attention mechanism learning. However, th...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/246; G06T7/238; G06T7/262
CPC: G06T7/246; G06T7/262; G06T7/238; G06T2207/10016; G06T2207/20056; G06T2207/20081; G06T2207/20084
Inventor: 宋慧慧, 周双双, 张晓露, 张开华, 汤润发
Owner: NANJING UNIV OF INFORMATION SCI & TECH