
Target tracking method based on space-time feature fusion learning

A space-time feature fusion and target tracking technology, applied in the field of computer vision and pattern recognition, which solves the problems of tracking difficulty or even target loss when the target object is deformed or occluded, and achieves the effect of improving tracking speed and accuracy.

Active Publication Date: 2019-05-24
SOUTHWEST JIAOTONG UNIV
Cites: 8, Cited by: 22

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to provide a target tracking method based on spatio-temporal feature fusion learning, which can effectively solve the problem of tracking difficulty, or even tracking loss, when the target object is deformed or occluded, and realize long-term, real-time and accurate target tracking.


Examples


Embodiment Construction

[0029] The method of the present invention can be used in a wide variety of visual target tracking scenarios in both military and civilian fields, such as intelligent transportation systems, human-computer interaction and virtual reality.

[0030] Take intelligent video surveillance of a traction substation as an example. Such surveillance involves many important automatic analysis tasks, such as intrusion detection, behavior analysis and abnormality alarms, and all of these tasks depend on stable target tracking, which can be realized with the tracking method proposed by the present invention. Specifically, a spatio-temporal feature fusion learning neural network model is first constructed, as shown in Figure 1, and the network is then trained on a training data set with the stochastic gradient descent method. Because the three sub-networks influence one another, joint optimization is difficult, so the network training of the spatio-temporal feature...
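To make the construction step concrete, below is a minimal PyTorch sketch of a three-branch spatio-temporal fusion network of the kind described above: AlexNet feeding a recurrent network for temporal features, a detector-style branch standing in for the YOLOv3 target spatial-transformation features, and an AlexNet branch for background spatial features. The module layout, feature dimensions and the class name SpatioTemporalFusionTracker are illustrative assumptions, not the exact network of Figure 1.

# Illustrative sketch only -- not the patented network of Figure 1.
import torch
import torch.nn as nn
import torchvision.models as models

class SpatioTemporalFusionTracker(nn.Module):
    """Fuses temporal features (AlexNet + LSTM), target spatial-transformation
    features (a stand-in for the YOLOv3 branch) and background spatial features
    (AlexNet), then regresses a bounding box and a confidence score."""

    def __init__(self, hidden=512):
        super().__init__()
        # Appearance backbone for the temporal branch (AlexNet, untrained here).
        self.alexnet = models.alexnet(weights=None).features
        self.temporal = nn.LSTM(input_size=256 * 6 * 6, hidden_size=hidden,
                                batch_first=True)
        # Placeholder for the YOLOv3 target spatial-transformation branch.
        self.target_branch = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=7, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, hidden))
        # Background spatial features, again extracted with AlexNet.
        self.background_branch = nn.Sequential(
            models.alexnet(weights=None).features, nn.Flatten(),
            nn.Linear(256 * 6 * 6, hidden))
        # Fusion head: 4 bounding-box coordinates + 1 confidence logit.
        self.head = nn.Linear(hidden * 3, 5)

    def forward(self, frames):
        # frames: (batch, time, 3, 224, 224); the last frame is the current one.
        b, t = frames.shape[:2]
        feats = self.alexnet(frames.reshape(b * t, *frames.shape[2:]))
        temporal, _ = self.temporal(feats.reshape(b, t, -1))
        current = frames[:, -1]
        fused = torch.cat([temporal[:, -1],
                           self.target_branch(current),
                           self.background_branch(current)], dim=1)
        out = self.head(fused)
        return out[:, :4], torch.sigmoid(out[:, 4])

Once such a network is assembled, an initial training pass with torch.optim.SGD over a labelled data set, as the embodiment describes, would give it the initial ability to locate the target object.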


Abstract

The invention discloses a target tracking method based on spatio-temporal feature fusion learning, and relates to the technical fields of computer vision and pattern recognition. First, a spatio-temporal feature fusion learning network is constructed. The spatio-temporal features comprise time-sequence features and spatial features: the time-sequence features are extracted by combining AlexNet with a time-recurrent neural network, while the spatial features are divided into target-object spatial-transformation features and background spatial features, extracted with YOLOv3 and AlexNet respectively. During initial training, the spatio-temporal feature fusion learning network is trained on a training data set with the stochastic gradient descent method; after training is completed, the network has an initial capability to locate the target object. The image sequence to be tracked is then fed forward through the network, which outputs the position and confidence of the target object's bounding box. The confidence determines whether the network performs online learning, and the bounding-box position locates the target object, thereby realizing tracking of the target object.
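As a rough illustration of the confidence-gated online learning described in the abstract, the following sketch runs a tracker frame by frame and performs an online update only when the predicted confidence is high. The threshold value, the replay-memory heuristic and the function name track_sequence are assumptions for illustration, not the patent's actual online-learning procedure; the model is assumed to expose the (box, confidence) interface of the sketch above.

# Illustrative sketch only -- the confidence threshold, replay memory and
# update rule are assumptions, not the patent's online-learning procedure.
import random
import torch
import torch.nn.functional as F

def track_sequence(model, optimizer, frames, conf_threshold=0.5, window=8):
    """Track frame by frame with confidence-gated online learning.
    `model` is any network returning (box, confidence) for a clip shaped
    (1, time, 3, H, W); confident predictions are kept as pseudo-labels and
    replayed so the tracker keeps adapting to the target's appearance."""
    memory, history, boxes = [], [], []
    for frame in frames:                                  # frame: (3, H, W)
        history.append(frame)
        clip = torch.stack(history[-window:]).unsqueeze(0)
        with torch.no_grad():
            box, conf = model(clip)
        boxes.append(box.squeeze(0))
        if conf.item() >= conf_threshold:
            # High confidence: trust the result, store it as a pseudo-label,
            # and take one online step on a randomly replayed confident pair.
            memory.append((clip, box))
            replay_clip, replay_box = random.choice(memory)
            pred_box, pred_conf = model(replay_clip)
            loss = (F.smooth_l1_loss(pred_box, replay_box) +
                    F.binary_cross_entropy(pred_conf,
                                           torch.ones_like(pred_conf)))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        # Low confidence: skip online learning for this frame to avoid drift.
    return boxes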

Description

Technical field

[0001] The invention relates to the technical fields of computer vision and pattern recognition.

Background technique

[0002] Visual object tracking is an important research topic in the field of computer vision. Its task is to automatically identify the target object to be tracked in the subsequent video sequence, given an initial video clip, and to obtain the target's continuous position, appearance and motion information. Target tracking is widely used in military and civilian fields such as intelligent monitoring, human-computer interaction and automatic control systems, and has strong practical value. In reality, however, the appearance of a target object is easily affected by factors such as deformation, occlusion and illumination changes, which makes visual object tracking a very challenging problem. At present, target tracking methods mainly comprise classical target tracking methods and deep-learning target tracking methods.

[0003] Classical...


Application Information

IPC(8): G06K9/00, G06K9/46, G06K9/62, G06T7/246, G06N3/04, G06N3/08
Inventor: 卢学民, 权伟, 刘跃平, 王晔, 张桂萍, 江永全, 何武, 陈锦雄
Owner: SOUTHWEST JIAOTONG UNIV