Video tracking system and method based on target feature space-time alignment

A target-feature video tracking technology, applied in the field of video tracking, which addresses the problem that prior methods do not account for mismatches between preceding and following frames, and achieves good discrimination between targets.

Pending Publication Date: 2022-05-13
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

This type of method does not take into account the differences between different...




Embodiment Construction

[0021] As shown in Figure 1 and Figure 2, the video tracking system based on spatio-temporal alignment of target features according to this embodiment includes: a global feature extraction module, a target position prediction module, a target feature extraction module, and a target tracking module.
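For orientation, the sketch below shows one way the four modules named in [0021] could be wired together in PyTorch. The class names, method signatures, and data flow are illustrative assumptions, not taken from the patent.

```python
# Minimal structural sketch of the four modules listed in [0021].
# All names and interfaces are placeholders, not the patent's implementation.
import torch.nn as nn

class VideoTracker(nn.Module):
    def __init__(self, backbone, predictor, target_head, tracker):
        super().__init__()
        self.global_feature_extractor = backbone      # frame-level feature maps + similarity
        self.target_position_predictor = predictor    # predicts target positions in the next frame
        self.target_feature_extractor = target_head   # per-target appearance features
        self.target_tracker = tracker                 # associates targets across frames

    def forward(self, original_frame, reference_frame, prev_targets):
        feat_orig, feat_ref, similarity = self.global_feature_extractor(original_frame, reference_frame)
        predicted_boxes = self.target_position_predictor(similarity, prev_targets)
        target_feats = self.target_feature_extractor(feat_orig, predicted_boxes)
        return self.target_tracker(target_feats, prev_targets)
```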

[0022] As shown in Figure 3, the global feature extraction module includes a feature extraction network and an adjacent-frame similarity calculation unit, wherein: the feature extraction network generates feature maps of the original frame and the reference frame from the original frame and the reference frame of the video under test, each of size (C, H, W) after downsampling; the adjacent-frame similarity calculation unit computes the adjacent-frame similarity between the two feature maps.
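A hedged sketch of this step, assuming a shared convolutional backbone as a stand-in for the unspecified feature extraction network; the layer choices and downsampling factor are assumptions. The similarity unit itself is sketched after [0023].

```python
# Sketch of the global feature extraction step in [0022]: a shared backbone maps
# the original and reference frames to downsampled feature maps of size (C, H, W).
# The backbone below is a toy placeholder, not the patent's network.
import torch.nn as nn

class GlobalFeatureExtractor(nn.Module):
    def __init__(self, channels=256):
        super().__init__()
        self.backbone = nn.Sequential(  # downsamples by 8 in this toy example
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(128, channels, 3, stride=2, padding=1), nn.ReLU(),
        )

    def forward(self, original_frame, reference_frame):
        # Both frames: (B, 3, H_img, W_img) -> (B, C, H, W)
        feat_orig = self.backbone(original_frame)
        feat_ref = self.backbone(reference_frame)
        return feat_orig, feat_ref
```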

[0023] The similarity between adjacent-frame feature maps is calculated by spatial correlation, that is, each pixel on the c...
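The paragraph is truncated, but a common way to realize spatial correlation between two feature maps is to compare each location of one map with a local window of locations in the other via channel-wise dot products. The sketch below assumes a fixed window radius and a simple normalization; both are assumptions, not details from the patent.

```python
# Hedged sketch of spatial-correlation similarity between adjacent-frame feature
# maps, in the spirit of [0023]: each location of feat_orig is compared (dot
# product over channels) with locations of feat_ref inside a local window.
import torch
import torch.nn.functional as F

def spatial_correlation(feat_orig, feat_ref, radius=4):
    """feat_orig, feat_ref: (B, C, H, W) -> similarity volume (B, (2r+1)^2, H, W)."""
    B, C, H, W = feat_orig.shape
    padded = F.pad(feat_ref, [radius] * 4)  # zero-pad so every displacement is valid
    sims = []
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            shifted = padded[:, :, dy:dy + H, dx:dx + W]
            sims.append((feat_orig * shifted).sum(dim=1) / C ** 0.5)  # assumed scaling
    return torch.stack(sims, dim=1)
```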



Abstract

The invention discloses a video tracking system and method based on space-time alignment of target features. The system comprises a global feature extraction module, a target position prediction module, a target feature extraction module and a target tracking module. It makes full use of the positional relationships between targets and of the motion features of targets across preceding and following frames, which enhances the discrimination between different targets and improves tracking accuracy. By screening out the most representative target features, the system predicts and distinguishes targets more accurately during video tracking, matches targets across frames more reliably, and improves the stability of object category prediction.

Description

Technical field

[0001] The present invention relates to a technology in the field of video tracking, and in particular to a video tracking system and method based on spatio-temporal alignment of target features.

Background technique

[0002] Video tracking locates multiple objects of interest in a video, numbers each object according to its characteristics, and records its continuous motion trajectory.

[0003] Existing video tracking methods include algorithms based on the tracking-by-detection framework, such as the DeepSORT algorithm, which detects objects in each frame and associates the detections of consecutive frames to obtain a series of trajectories. Such methods merely combine an ordinary association and assignment algorithm with an object detector, so the tracking quality depends on the performance of the detector. When targets are located in a dense scene and the target or the camera moves rapidly,...
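To make the baseline concrete, the sketch below illustrates the generic tracking-by-detection association described in [0003]: per-frame detections are matched across consecutive frames by solving an assignment problem over a pairwise cost, here IoU-based with SciPy's Hungarian solver. This depicts the prior-art paradigm being contrasted, not the invention's method; the threshold and cost are assumptions.

```python
# Generic tracking-by-detection association (prior-art style), for illustration only.
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def associate(prev_boxes, curr_boxes, iou_threshold=0.3):
    """Return index pairs (prev_i, curr_j) whose matched IoU exceeds the threshold."""
    cost = np.array([[1.0 - iou(p, c) for c in curr_boxes] for p in prev_boxes])
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < 1.0 - iou_threshold]
```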

Claims


Application Information

IPC(8): G06T7/246, G06T7/73, G06N3/04, G06N3/08
CPC: G06T7/248, G06T7/74, G06N3/08, G06T2207/10016, G06T2207/30241, G06N3/048
Inventor: 林巍峣, 彭嘉淇
Owner: SHANGHAI JIAO TONG UNIV