Long-term target tracking method based on content retrieval

A long-term target tracking technology, applied in neural learning methods, image data processing, instruments, and related fields, that addresses problems such as target deformation, occlusion, and disappearance from the field of view, and improves tracking robustness and efficiency.

Pending Publication Date: 2022-02-08
ZHEJIANG DALI TECH


Problems solved by technology

[0005] However, during long-term tracking, the target will inevitably be deformed, occluded, or move out of view. Networks such as SiamFC only extract deep features from the init...



Examples


Embodiment 1

[0039] The single-target tracking method proposed by the present invention, which incorporates historical trajectory information, will be further described below in conjunction with the accompanying drawings and specific embodiments. The advantages and features of the present invention will become apparent from the following description and claims.

[0040] The present invention provides a long-term target tracking method based on content retrieval. For each frame's search image, the method performs the following steps:

[0041] S1. Use an offline target tracking network to perform target tracking and obtain a classification feature map, and record the target content of the initial frame during tracking as the target template;

[0042] The specific steps for obtaining the classification feature map with the offline target tracking network are as follows:

[0043] S1.1. Obtain the template image and the current frame search image; the template image is manual...
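As background for step S1, the following is a minimal sketch of how an offline tracking network could produce a classification (response) map from a template image and a search image, assuming a SiamFC-style Siamese cross-correlation head. The backbone layers, crop sizes, and class name are illustrative assumptions, not the patent's actual network.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseClassifier(nn.Module):
    """Illustrative SiamFC-style head: correlates template features with
    search-image features to produce a classification (response) map."""

    def __init__(self):
        super().__init__()
        # Small convolutional backbone; the patent's actual offline
        # tracking network architecture is not specified here.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(128, 256, kernel_size=3, stride=1),
        )

    def forward(self, template, search):
        # template: (1, 3, 127, 127) crop of the initial-frame target
        # search:   (1, 3, 255, 255) current-frame search image
        z = self.backbone(template)      # template features
        x = self.backbone(search)        # search-image features
        # Cross-correlation: the template features act as a convolution kernel.
        score_map = F.conv2d(x, z)       # (1, 1, H, W) classification map
        return score_map

if __name__ == "__main__":
    net = SiameseClassifier().eval()
    template = torch.randn(1, 3, 127, 127)   # target template (initial frame)
    search = torch.randn(1, 3, 255, 255)     # current search image
    with torch.no_grad():
        response = net(template, search)
    peak = response.max().item()             # maximum response value used in S2
    print(response.shape, peak)
```

The maximum value of this response map is what step S2 compares against the preset threshold to decide whether the local tracker is still reliable.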

Embodiment

[0067] A specific embodiment of the present invention describes the training process of the above-mentioned neural networks and the application process of the long-term target tracking method based on content retrieval.

[0068] (1) Data set acquisition and preprocessing

[0069] Select the training data set, and perform size normalization and data augmentation on the images input to the network.

[0070] In a specific implementation, the ILSVRC2015 data set, commonly used in single-target tracking, together with 800 independently captured and annotated videos, is used as training data. The size normalization and data augmentation methods are as follows:

[0071] According to the first frame of the template image, obtain the ground-truth target box (x_min, y_min, w, h), where x_min and y_min respectively denote the coordinates of the upper-left corner of the ground-truth box, and w and h represent the width and hei...
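A minimal sketch of the kind of size normalization this step describes, assuming a SiamFC-style square crop centered on the ground-truth box with an added context margin; the margin factor, output size, and function name are illustrative assumptions rather than the patent's exact parameters.

```python
import numpy as np
import cv2  # OpenCV for cropping and resizing

def crop_template(image, box, out_size=127, context=0.5):
    """Crop a square, context-padded patch around the ground-truth box
    (x_min, y_min, w, h) and resize it to out_size x out_size."""
    x_min, y_min, w, h = box
    cx, cy = x_min + w / 2.0, y_min + h / 2.0      # box center
    # Square side length with a context margin (SiamFC-style heuristic).
    pad = context * (w + h)
    side = np.sqrt((w + pad) * (h + pad))
    x0, y0 = int(round(cx - side / 2)), int(round(cy - side / 2))
    x1, y1 = int(round(cx + side / 2)), int(round(cy + side / 2))
    # Pad with the image mean so crops near the border stay square.
    pad_l, pad_t = max(0, -x0), max(0, -y0)
    pad_r = max(0, x1 - image.shape[1])
    pad_b = max(0, y1 - image.shape[0])
    padded = cv2.copyMakeBorder(image, pad_t, pad_b, pad_l, pad_r,
                                cv2.BORDER_CONSTANT,
                                value=image.mean(axis=(0, 1)).tolist())
    patch = padded[y0 + pad_t:y1 + pad_t, x0 + pad_l:x1 + pad_l]
    return cv2.resize(patch, (out_size, out_size))
```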



Abstract

The invention relates to a long-term target tracking method based on content retrieval. For each frame's search image, the method performs the following steps: S1, perform target tracking with an offline target tracking network to obtain a classification feature map, and record the initial-frame target content as a target template; S2, if the maximum response value of the classification feature map is greater than a preset threshold, repeat steps S1 to S2; if the maximum response value is less than or equal to the preset threshold, perform a global search on the whole search image using a long-term tracking method to obtain L candidate target regions; S3, input the L candidate targets into a content retrieval network to obtain the feature vectors of the L candidate regions, and input the target template recorded in step S1 into the content retrieval network to obtain a matching vector z; S4, compute the cosine similarity between each of the L candidate-region feature vectors and the matching vector z, and if the maximum cosine similarity exceeds a preset threshold, take the candidate target region corresponding to that maximum as the tracked target and repeat steps S1 to S4.
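A minimal sketch of the S1-S4 decision loop the abstract describes. The `tracking_network`, `global_search`, and `retrieval_network` callables and the threshold values are hypothetical placeholders, not names or parameters from the patent.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def track_frame(frame, template, tracking_network, global_search,
                retrieval_network, response_thresh=0.5, sim_thresh=0.7):
    """One iteration of the S1-S4 loop for a single search frame.
    Returns the tracked box, or None if the target is not recovered."""
    # S1: the offline tracking network produces a classification (response) map
    # along with its current box estimate.
    score_map, box = tracking_network(frame, template)

    # S2: if the peak response is high enough, trust the local tracker.
    if score_map.max() > response_thresh:
        return box

    # S2 (low confidence): global search over the whole image
    # yields L candidate target regions as (box, image patch) pairs.
    candidates = global_search(frame)

    # S3: embed the candidates and the recorded target template.
    cand_vecs = [retrieval_network(patch) for _, patch in candidates]
    z = retrieval_network(template)          # matching vector z

    # S4: pick the candidate whose feature vector is most similar to z.
    sims = [cosine_similarity(v, z) for v in cand_vecs]
    best = int(np.argmax(sims))
    if sims[best] > sim_thresh:
        return candidates[best][0]           # recovered target box
    return None                              # target still lost
```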

Description

Technical field

[0001] The invention relates to a long-term target tracking method based on content retrieval, aiming to solve the tracking problem when the tracked object disappears briefly and then reappears in the field of view.

Background technique

[0002] Object tracking is a longstanding, fundamental, and challenging problem in computer vision that has been studied for decades. Target tracking is divided into single-target tracking and multi-target tracking. The task of single-target tracking is defined as: given the size and position of the target in the initial frame of a video sequence, predict the size and position of the target in subsequent frames. Multi-target tracking is defined as: given an image sequence, find the moving objects in it, associate the moving objects across different frames one by one, and then give each object's trajectory.

[0003] According to the different modeling methods of target models, ...


Application Information

IPC(8): G06T7/246, G06K9/62, G06N3/04, G06N3/08, G06V10/764, G06V10/74, G06V10/774
CPC: G06T7/246, G06N3/08, G06N3/045, G06F18/241, G06F18/214
Inventor: 杨兆龙, 庞惠民, 车宏
Owner ZHEJIANG DALI TECH