
Multi-target tracking method based on LSTM network and deep reinforcement learning

A multi-target tracking and reinforcement-learning technology, applied in the field of multi-target tracking based on an LSTM network and deep reinforcement learning. It addresses the problems of insufficiently comprehensive hand-designed models and inaccurate tracking results, thereby improving target-tracking accuracy.

Active Publication Date: 2018-09-25
HUAIYIN INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0004] Purpose of the invention: in order to overcome the technical shortcomings of the prior art, namely that artificially designed models are not comprehensive enough and tracking results are not accurate enough, the present invention provides a multi-target tracking method based on an LSTM network and deep reinforcement learning.

Method used


Examples


Embodiment Construction

[0025] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0026] As shown in Figure 1, the multi-target tracking method based on the LSTM network and deep reinforcement learning includes the following steps:

[0027] (1) Use the YOLO V2 target detector to detect each frame of the video to be tested and output the detection results. Let the detection results of the t-th frame image be the set qt = {qt^j, j = 1, ..., N}, where qt^j is the j-th detection result of the t-th frame image and N is the total number of detections;
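Step (1) can be sketched as follows. This is a hypothetical illustration of how the per-frame detection set qt might be represented; the `Detection` fields and box format are assumptions, not the patent's actual data structures.

```python
# Hypothetical sketch of the per-frame detection set q_t from step (1).
# Field names and the (x, y, w, h) box format are assumptions.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Detection:
    frame: int                       # t: frame index
    index: int                       # j: detection index within the frame
    box: Tuple[int, int, int, int]   # (x, y, w, h) bounding box in pixels
    score: float                     # detector confidence


def detections_for_frame(all_dets: List[Detection], t: int) -> List[Detection]:
    """Return q_t = {q_t^j : j = 1..N}, the detections of frame t."""
    return [d for d in all_dets if d.frame == t]


dets = [
    Detection(frame=0, index=1, box=(10, 20, 50, 80), score=0.9),
    Detection(frame=0, index=2, box=(120, 40, 60, 90), score=0.8),
    Detection(frame=1, index=1, box=(12, 22, 50, 80), score=0.85),
]
q0 = detections_for_frame(dets, 0)
print(len(q0))  # N = 2 detections in frame 0
```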

[0028] (2) As shown in Figure 2, multiple single-target trackers based on deep reinforcement learning are constructed. Each single-target tracker includes a convolutional neural network (CNN) and a fully connected (FC) layer; the convolutional neural network is built on the basis of the VGG-16 network, a state-of-the-art architecture widely used in deep learning methods. The CNN net...
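The tracker structure in step (2) (CNN feature extractor followed by an FC head) can be sketched minimally. The real method uses a VGG-16 backbone; here a tiny average-pooling stub stands in for the CNN, and all shapes and the FC output size are assumptions for illustration only.

```python
# Minimal numpy sketch of one single-target tracker: CNN features -> FC head.
# A global-average-pool stub stands in for the patent's VGG-16 backbone;
# shapes and the 4-dimensional output are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)


def cnn_features(patch: np.ndarray) -> np.ndarray:
    """Stand-in for the VGG-16 feature extractor: average pool per channel."""
    return patch.mean(axis=(0, 1))   # (H, W, C) -> (C,)


W = rng.normal(size=(4, 3))          # FC weights: 3 feature channels -> 4 outputs
b = np.zeros(4)


def tracker_head(patch: np.ndarray) -> np.ndarray:
    """FC layer on top of CNN features, e.g. predicting a box adjustment."""
    return W @ cnn_features(patch) + b


patch = rng.random((32, 32, 3))      # an RGB image patch around the tracked target
out = tracker_head(patch)
print(out.shape)                     # (4,)
```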



Abstract

The invention discloses a multi-target tracking method based on an LSTM network and deep reinforcement learning. A target detector is used to detect each frame in a video to be tested, and a detection result qt^j is output. A number of single-target trackers based on deep reinforcement learning are constructed, where each single-target tracker comprises a convolutional neural network and a fully connected layer, and the convolutional neural network is constructed on the basis of a VGG-16 network. The tracking result pt of each single-target tracker is output, and a similarity matrix for data association is calculated. A data association module is constructed based on the LSTM network; the similarity matrix is input to it to acquire a distribution probability vector At, where At^{ij} is the matching probability between the i-th target and detection result j. The detection result with the maximum matching probability is taken as the tracking result of the i-th target. The method is not affected by mutual occlusion, similar appearance, or the continuously changing number of targets in the multi-target tracking process, and improves multi-target tracking accuracy and precision.
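The association step in the abstract (similarity matrix between tracker outputs pt and detections qt, turned into matching probabilities At) can be sketched as follows. The patent feeds the similarity matrix to a learned LSTM module; in this sketch an IoU similarity plus a row-wise softmax stands in for that module, purely for illustration.

```python
# Hedged sketch of the data-association step: an IoU similarity matrix between
# tracker boxes p_t and detection boxes q_t, then a row-wise softmax producing
# matching probabilities A_t (the patent uses an LSTM here instead of softmax).
import numpy as np


def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)


def match(tracks, dets):
    S = np.array([[iou(p, q) for q in dets] for p in tracks])  # similarity matrix
    A = np.exp(S) / np.exp(S).sum(axis=1, keepdims=True)       # rows -> A_t^{ij}
    return A, A.argmax(axis=1)   # best-matching detection per target


tracks = [(0, 0, 10, 10), (20, 20, 30, 30)]   # p_t: one box per tracked target
dets = [(21, 21, 31, 31), (1, 1, 11, 11)]     # q_t: detections, shuffled order
A, assign = match(tracks, dets)
print(assign.tolist())  # [1, 0]: each target matched to its overlapping detection
```

Each row of A sums to 1, so A_t^{ij} can be read as the probability that target i corresponds to detection j.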

Description

technical field

[0001] The invention belongs to the field of computer vision and relates to a video multi-target tracking method, in particular to a multi-target tracking method based on an LSTM network and deep reinforcement learning.

background technique

[0002] Multi-target tracking is a hot issue in the field of computer vision and plays an important role in many application fields, such as artificial intelligence, virtual reality, and unmanned driving. Despite a large number of earlier related works, multi-target tracking remains a challenging problem due to frequent occlusions, the similar appearance of multiple targets, and the constantly changing number of targets during tracking.

[0003] In recent years, detection-based multi-target tracking methods have achieved some success. They divide multi-target tracking into two parts: multi-target detection and data association. The detection-based multi-target tracking method can solve the problem ...
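The two-stage, detection-based paradigm described in paragraph [0003] (per-frame detection followed by data association) can be summarized in a short loop. Both `detect` and `associate` below are hypothetical stand-ins, not the patent's components: `detect` returns canned boxes in place of a real detector, and `associate` uses a naive index-based rule in place of a learned association module.

```python
# High-level sketch of detection-based multi-target tracking:
# per frame, (i) detect targets, (ii) associate detections with existing tracks.
# detect() and associate() are hypothetical stand-ins for illustration only.
from typing import Dict, List, Tuple

Box = Tuple[int, int, int, int]


def detect(frame_id: int) -> List[Box]:
    """Stand-in for a per-frame detector (here: two canned, drifting boxes)."""
    return [(frame_id, 0, frame_id + 10, 10),
            (frame_id + 50, 0, frame_id + 60, 10)]


def associate(tracks: Dict[int, List[Box]], dets: List[Box]) -> None:
    """Naive stand-in for data association: i-th detection extends i-th track."""
    for i, d in enumerate(dets):
        tracks.setdefault(i, []).append(d)


tracks: Dict[int, List[Box]] = {}
for t in range(3):                  # three video frames
    associate(tracks, detect(t))

print(len(tracks), len(tracks[0]))  # 2 targets, each tracked across 3 frames
```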

Claims


Application Information

IPC(8): G06T7/20, G06N3/04
CPC: G06T7/20, G06T2207/20084, G06T2207/10016, G06N3/045
Inventors: 姜明新, 常波, 贾银洁
Owner: HUAIYIN INSTITUTE OF TECHNOLOGY