
Video small target tracking method and device

A target tracking technology for small targets, applied in the field of video small target tracking methods and devices, which solves problems such as empirical parameters that are hard to determine, weights that are hard to assign, and single fusion coefficients, and achieves accurate feature fusion, high anti-interference ability and robustness, and strong adaptability.

Active Publication Date: 2021-08-17
WUHAN UNIV

AI Technical Summary

Problems solved by technology

This type of fusion method works well for target tracking in simple, single-scene settings, but in more complex scenes the empirical parameters are difficult to determine, mathematical indicators adapt poorly, and when the appearance model or motion model of the target changes it is hard to assign appropriate weights accurately.
In addition, these two fusion methods use only a single fusion coefficient for each tracking run, so fusion happens only at the image level and weighted fusion at the pixel level is difficult to achieve.
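As a rough illustration of the difference between image-level and pixel-level fusion (a minimal numpy sketch; the two-feature setup, array names and random values are illustrative assumptions, not taken from the patent):

```python
import numpy as np

# Two single-channel response maps of size H x W from different features,
# e.g. appearance and motion; random values stand in for real responses.
H, W = 64, 64
resp_appearance = np.random.rand(H, W)
resp_motion = np.random.rand(H, W)

# Image-level fusion: one empirically chosen scalar coefficient per
# tracking run, applied uniformly over the whole response map.
alpha = 0.7
fused_scalar = alpha * resp_appearance + (1.0 - alpha) * resp_motion

# Pixel-level fusion: a 2D weight map of the same size as the response
# maps gives every pixel its own fusion coefficient.
weight_map = np.random.rand(H, W)  # stand-in for a learned attention map
fused_pixelwise = weight_map * resp_appearance + (1.0 - weight_map) * resp_motion

# The target position is read from the peak of the fused response.
peak = np.unravel_index(np.argmax(fused_pixelwise), fused_pixelwise.shape)
print(peak)
```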


Embodiment Construction

[0142] The present invention first trains the parameters of the tracking model using the first frame image and its corresponding response map, obtaining the value of each parameter in the tracking model. The trained tracking model is then used to extract the appearance features and motion features of the target frame by frame for target tracking; during tracking, the tracking model is updated when certain conditions are met.
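As a rough sketch of this three-phase flow (a minimal Python skeleton under assumed behaviour; the helper functions and the confidence-based update condition are placeholders, not the patent's actual model):

```python
import numpy as np

def train_tracker(first_frame, response_map):
    # Model training phase: fit the tracker on the first frame and its
    # response map (placeholder: just store both).
    return {"template": first_frame, "response": response_map}

def locate_target(model, frame):
    # Target tracking phase: placeholder localisation that returns the
    # brightest pixel of the frame and a confidence score.
    pos = np.unravel_index(np.argmax(frame), frame.shape)
    return pos, float(frame[pos])

def track_video(frames, first_frame_response, update_threshold=0.5):
    model = train_tracker(frames[0], first_frame_response)
    positions = []
    for frame in frames[1:]:
        position, confidence = locate_target(model, frame)
        positions.append(position)
        # Model updating phase: refresh the model only when the preset
        # condition (here, a sufficiently confident detection) is met.
        if confidence > update_threshold:
            model = train_tracker(frame, first_frame_response)
    return positions

# Usage with random single-channel frames standing in for video data.
frames = [np.random.rand(64, 64) for _ in range(5)]
print(track_video(frames, np.random.rand(64, 64)))
```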

[0143] The technical solution of the invention can be implemented with computer software so that the process runs automatically. The technical solution of the present invention is described in detail below with reference to the drawings and embodiments.

[0144] An embodiment includes a model training phase, a target tracking phase and a model updating phase:

[0145] 1. The model training phase builds a deep learning network model based on multiple features and self-attention modules, which is used for target trac...



Abstract

The invention provides a video small target tracking method and device. The method comprises a model training stage, a target tracking stage and a model updating stage. In the model training stage, the convolutional neural network parameters of the whole tracking model, including a self-attention module, are determined. In the tracking stage, the target position is detected continuously with the trained model. In the model updating stage, the parameters of different modules of the tracking model are updated when a preset condition is met, so as to ensure continuous, accurate and robust tracking. Because the method tracks the target using multiple features of the moving target, it has higher anti-interference capability and robustness. The self-attention module, constructed from a convolutional neural network, produces a weight map for each feature response map, expanding the traditional single fusion coefficient into a two-dimensional fusion coefficient matrix (referred to as an attention map) of the same size as the response map, so that feature fusion is more accurate and tracking adapts better to different scenes.
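As a hedged sketch of how such a self-attention module might turn stacked feature response maps into per-pixel fusion weights (a PyTorch illustration; the layer sizes and two-feature setup are assumptions, not the patent's actual network):

```python
import torch
import torch.nn as nn

class ResponseFusion(nn.Module):
    """A small convolutional block maps stacked feature response maps to
    one 2D attention map per feature, then fuses the responses pixel-wise.
    Layer widths and kernel sizes are illustrative guesses."""

    def __init__(self, num_features: int = 2, hidden: int = 16):
        super().__init__()
        self.attention = nn.Sequential(
            nn.Conv2d(num_features, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, num_features, kernel_size=3, padding=1),
        )

    def forward(self, response_maps: torch.Tensor) -> torch.Tensor:
        # response_maps: (batch, num_features, H, W), one channel per feature.
        logits = self.attention(response_maps)
        # Softmax over the feature dimension yields, at every pixel, fusion
        # coefficients that sum to one: a coefficient matrix per feature,
        # the same size as the response maps.
        weights = torch.softmax(logits, dim=1)
        # Pixel-wise weighted sum collapses the features into one response map.
        return (weights * response_maps).sum(dim=1, keepdim=True)

# Example: fuse appearance and motion response maps of size 64 x 64.
fusion = ResponseFusion(num_features=2)
fused = fusion(torch.rand(1, 2, 64, 64))
print(fused.shape)  # torch.Size([1, 1, 64, 64])
```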

Description

Technical Field

[0001] The invention relates to the field of target tracking, and in particular to a video small target tracking method and device.

Background Technique

[0002] Video target tracking is one of the important research problems in computer vision. It enables a computer, by imitating the motion perception function of the human visual system and exploiting the temporal and spatial correlation of a video sequence, to segment the moving target in the video, associate the target across frames, and then extract dynamic information, automatically obtaining the planar position of the moving target in each frame and computing its trajectory. Commonly used target tracking methods generally consist of three steps: (1) extracting certain imaging features of the current-frame target and its surrounding environment; (2) combining the extracted features with the position...
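The background passage is cut off after step (2); as a rough illustration of the generic feature-and-correlation pipeline it starts to describe (not the invention's method; the gradient-magnitude feature and FFT-based correlation are assumptions), a single tracking step might look like:

```python
import numpy as np

def extract_feature(patch):
    # Step (1): a simple imaging feature; gradient magnitude is only a
    # stand-in for whatever feature a real tracker would use.
    gy, gx = np.gradient(patch.astype(float))
    return np.hypot(gx, gy)

def correlation_response(feature, template):
    # Step (2): combine the current-frame feature with a template kept from
    # the previous frame via circular cross-correlation in the FFT domain.
    return np.real(np.fft.ifft2(np.fft.fft2(feature) * np.conj(np.fft.fft2(template))))

template = extract_feature(np.random.rand(64, 64))  # from the previous frame
feature = extract_feature(np.random.rand(64, 64))   # from the current frame
response = correlation_response(feature, template)
# The peak of the response map gives the estimated target position.
new_position = np.unravel_index(np.argmax(response), response.shape)
print(new_position)
```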

Claims


Application Information

IPC(8): G06T7/246; G06T7/269; G06N3/04; G06N3/08
CPC: G06T7/246; G06T7/269; G06N3/08; G06T2207/10016; G06T2207/20081; G06T2207/20084; G06N3/045
Inventor: 陈震中, 郭雨佳
Owner: WUHAN UNIV