Deep learning-based target tracking method, device and storage medium

A target tracking method based on deep learning, together with a corresponding device and storage medium, in the field of deep-learning-based target tracking. The technology addresses problems such as poor real-time performance, which makes existing deep-learning trackers difficult to apply in practice, and achieves high tracking speed, improved target tracking accuracy, and a high target coincidence (overlap) degree.

Active Publication Date: 2017-10-20
JILIN UNIV
3 Cites · 41 Cited by

AI Technical Summary

Problems solved by technology

However, because of their huge computational load, target tracking algorithms based on deep learning are often slow; their real-time performance is poor and difficult to reconcile with the requirements of practical applications.

Method used



Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0092] This embodiment presents a comparison between the method of the present invention and other target tracking methods.

[0093] At present, most algorithms that apply deep learning to the target tracking problem are relatively slow; the fastest is GOTURN (Generic Object Tracking Using Regression Networks), a regression-network-based general object tracking algorithm proposed in 2016. To evaluate the performance of the algorithm of the present invention more accurately and objectively, multiple groups of comparative experiments against the GOTURN algorithm were designed, evaluating the tracker in three aspects: accuracy, real-time performance, and robustness. Tracking accuracy and coincidence degree are used to quantify accuracy, tracking speed is used to quantify real-time performance, and robustness is assessed by qualitative analysis....
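The two quantitative metrics named above can be sketched in a few lines. This is a hedged illustration, not the patent's own evaluation code: center-point tracking accuracy is taken here as the pixel distance between predicted and ground-truth box centers, and coincidence degree as the standard intersection-over-union (IoU) of the two boxes, since the exact formulas used in the experiments are not reproduced on this page.

```python
def center_error(box_a, box_b):
    """Euclidean distance between the centers of two (x, y, w, h) boxes."""
    ax, ay = box_a[0] + box_a[2] / 2.0, box_a[1] + box_a[3] / 2.0
    bx, by = box_b[0] + box_b[2] / 2.0, box_b[1] + box_b[3] / 2.0
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

def iou(box_a, box_b):
    """Intersection-over-union (coincidence degree) of two (x, y, w, h) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    y2 = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)   # overlap area (0 if disjoint)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0
```

A typical benchmark then reports the fraction of frames with center error below a pixel threshold, and the fraction with IoU above an overlap threshold.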



Abstract

The invention relates to a deep learning-based target tracking method, a deep learning-based target tracking device, and a storage medium. The method includes the following steps: two frames of pictures are read in succession; the target region of the previous frame and the search region of the current frame are set and cropped, and when the search region of the current frame is set, whether the object remains stable under fast motion is judged in order to set the center point position and obtain the search region; the target region and the search region are input to a convolutional neural network, whose computation yields the target region of the current frame; the inter-frame displacement of the current frame relative to the previous frame is then calculated; and whether the current frame is the final frame is judged, so as to decide whether to continue iterative target tracking. By judging the speed of the target object's movement in the image, the method predicts the center point position of the current frame's cropping region. Compared with existing algorithms, the method improves target tracking accuracy and target coincidence degree while maintaining the original high tracking speed, and has good tracking robustness.

Description

technical field [0001] The present invention relates to the field of image processing, and in particular to a method, device and storage medium for target tracking based on deep learning in image processing. Background technique [0002] Target tracking is a challenging research topic in the field of computer vision, and it has become a research hotspot because of its wide application in many fields such as security, transportation, military, virtual reality, and medical imaging. The purpose of target tracking is to determine the continuous position of the target object in an ordered image sequence for further analysis and processing, so as to analyze and understand the target object's motion behavior. Since the beginning of the 21st century, with the rapid development of information technology, the computing performance of computers and the quality of image acquisition equipment such as cameras have been gradually improving. In addition, people pay more and more attention...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/246
CPC: G06T7/248; G06T2207/10016; G06T2207/20021; G06T2207/20081; G06T2207/20084; G06T2207/30224; G06T2207/30236
Inventor: 王欣, 石祥文
Owner: JILIN UNIV