High-speed twin network target tracking method based on positioning perception

A twin (Siamese) network target tracking technology applied in the field of computer vision. It addresses the problems of weakly discriminative features, slow tracking speed, and a large number of parameters, and achieves excellent tracking performance.

Pending Publication Date: 2022-06-24
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

However, these trackers also have an obvious drawback: on a device with an RTX 1080 Ti GPU, SiamRPN++ runs at 35 frames per second (FPS), Ocean at 56 FPS, and SiamAttn at 33 FPS.
[0005] From the research summarized above, current Siamese methods have the following problems: 1) Tracking methods based on shallow networks can maintain a high tracking speed, but because of the limitations of such networks, the extracted features are not strongly discriminative, so the target cannot be localized well.
2) Tracking methods based on deep networks can make full use of modern deep neural networks and estimate the target position more accurately in every frame across a variety of complex scenes (illumination change, scale change, occlusion, deformation, motion blur, out-of-plane rotation, etc.), but their huge number of parameters incurs a large computing overhead, so the tracking speed becomes very slow.

Method used



Embodiment Construction

[0068] The technical solutions in the embodiments of the present invention are described clearly and in detail below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the invention.

[0069] The technical solution by which the present invention solves the above technical problems is as follows:

[0070] As shown in Figure 1, a high-speed Siamese network target tracking method based on localization awareness includes the following steps:

[0071] 1. As shown in Figure 1, the template image and the search image are input into the feature extraction network to extract features (see the sketch after these sub-steps):

[0072] 1) Take each frame from the video and crop it into a 127×127 patch as the template image (denoted z) and a 255×255 patch as the search image (denoted x);

[0073] 2) Input the template image z into the feature extraction network, and extract the template feature φ of t...
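To make step 1 concrete, here is a minimal PyTorch sketch of the shared-weight feature extraction stage, assuming an AlexNet-style backbone (the abstract names AlexNet as the feature extraction subnet). The 127×127 and 255×255 crop sizes come from the steps above; the specific layer widths and strides are illustrative assumptions for a typical Siamese-tracking AlexNet, not the patent's exact network.

```python
# Minimal sketch of step 1: shared AlexNet-style feature extraction for the
# template image z (127x127) and the search image x (255x255).
# Layer widths/strides are illustrative assumptions, not the patented backbone.
import torch
import torch.nn as nn


class AlexNetBackbone(nn.Module):
    """AlexNet-style convolutional feature extractor (no padding, as is common
    in Siamese trackers so that later matching stays translation-consistent)."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 96, kernel_size=11, stride=2), nn.BatchNorm2d(96), nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(96, 256, kernel_size=5), nn.BatchNorm2d(256), nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(256, 384, kernel_size=3), nn.BatchNorm2d(384), nn.ReLU(inplace=True),
            nn.Conv2d(384, 384, kernel_size=3), nn.BatchNorm2d(384), nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3),
        )

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        return self.features(img)


backbone = AlexNetBackbone()        # one set of weights, shared by both branches
z = torch.randn(1, 3, 127, 127)     # template image crop
x = torch.randn(1, 3, 255, 255)     # search image crop
phi_z = backbone(z)                 # template feature φ(z)
phi_x = backbone(x)                 # search feature φ(x)
print(phi_z.shape, phi_x.shape)     # (1, 256, 6, 6) and (1, 256, 22, 22) with these layers
```

Because the two branches share one set of weights, the template feature φ(z) and the search feature φ(x) lie in the same embedding space and can later be compared (e.g., by correlation) to localize the target.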


Abstract

The invention provides a high-speed twin network target tracking method based on positioning perception, and belongs to the technical field of computer vision. The method mainly comprises the following steps: AlexNet is used as the feature extraction subnetwork to extract features from the template image and the search image; to enhance the representational capability of the features, the invention provides a context enhancement module that captures rich target information at both local and global levels, together with a new feature fusion strategy that fully combines the context information of features at different scales; finally, the regression loss of the network is computed with the Distance-IoU loss, which guides the tracker to select a more accurate bounding box. While adding only a small number of parameters, the method maintains a high tracking speed and effectively improves the tracking performance of the twin network tracker in complex scenes.
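The Distance-IoU (DIoU) regression loss referenced in the abstract follows the published formulation L_DIoU = 1 − IoU + ρ²(b, b^gt)/c², where ρ is the distance between the centers of the predicted and ground-truth boxes and c is the diagonal length of the smallest box enclosing both. The following PyTorch sketch implements that standard formula for axis-aligned (x1, y1, x2, y2) boxes; it is a generic illustration of the loss, not the patent's training code.

```python
import torch


def diou_loss(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    """Distance-IoU loss for boxes in (x1, y1, x2, y2) format, shape (N, 4).

    L_DIoU = 1 - IoU + rho^2(b_pred, b_gt) / c^2, where rho is the distance
    between box centers and c is the diagonal of the smallest enclosing box.
    """
    # Intersection and union -> IoU
    x1 = torch.max(pred[:, 0], target[:, 0])
    y1 = torch.max(pred[:, 1], target[:, 1])
    x2 = torch.min(pred[:, 2], target[:, 2])
    y2 = torch.min(pred[:, 3], target[:, 3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    iou = inter / (area_p + area_t - inter + eps)

    # Squared distance between box centers (rho^2)
    cx_p = (pred[:, 0] + pred[:, 2]) / 2
    cy_p = (pred[:, 1] + pred[:, 3]) / 2
    cx_t = (target[:, 0] + target[:, 2]) / 2
    cy_t = (target[:, 1] + target[:, 3]) / 2
    rho2 = (cx_p - cx_t) ** 2 + (cy_p - cy_t) ** 2

    # Squared diagonal of the smallest enclosing box (c^2)
    ex1 = torch.min(pred[:, 0], target[:, 0])
    ey1 = torch.min(pred[:, 1], target[:, 1])
    ex2 = torch.max(pred[:, 2], target[:, 2])
    ey2 = torch.max(pred[:, 3], target[:, 3])
    c2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2 + eps

    return (1 - iou + rho2 / c2).mean()


# Example: the loss decreases as the predicted box approaches the ground truth.
gt = torch.tensor([[50., 50., 150., 150.]])
far = torch.tensor([[200., 200., 300., 300.]])
near = torch.tensor([[55., 55., 155., 155.]])
print(diou_loss(far, gt).item(), diou_loss(near, gt).item())
```

Because the penalty term ρ²/c² remains informative even when the boxes do not overlap (IoU = 0), the loss can still pull the regressed box toward the target, which is what allows the tracker to select a tighter bounding box.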

Description

Technical field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a visual target tracking method.

Background technique

[0002] Object tracking is one of the most popular and challenging research topics in computer vision and has attracted the attention of researchers around the world. Many countries have invested considerable manpower, material, and financial resources in studying it, and many excellent algorithms have emerged. As algorithms have continued to improve, the theory of target tracking has matured, and it is now widely used in video surveillance, intelligent human-computer interaction, autonomous driving, robotics, and modern military applications. However, given the target size and position in the initial frame of a video sequence, how can each subsequent frame in a complex scene (illumination change, scale change, occlusion, deformation, motion blur, out-of-plane ro...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (IPC 8): G06K9/62, G06F16/583, G06N3/08, G06V10/75, G06V10/40, G06V10/26, G06V10/82
CPC: G06F16/583, G06N3/08, G06N3/045
Inventors: 周丽芳, 丁相, 冷佳旭, 王懿, 王佩雯, 罗俊, 李佳其
Owner: CHONGQING UNIV OF POSTS & TELECOMM