Target tracking method and system based on unmarked video training, terminal and medium

A target tracking and video technology, applied in the field of target tracking, which addresses problems such as the inability to perform bounding-box regression and the lack of online updating of the tracking module during training, achieving the effects of enriching available video data, reducing annotation cost, and high robustness.

Pending Publication Date: 2021-11-09
SHANGHAI JIAO TONG UNIV


Problems solved by technology

[0004] However, the UDT family of algorithms has several inherent defects. First, when the UDT methods learn via cycle consistency, the initial tracking position is chosen at random, so tracking often starts from the background rather than from a foreground object; the lack of good initial boxes also makes it difficult for the UDT methods to learn a bounding-box regression model, which limits their performance ceiling. Second, because they rely on the temporal continuity of the video, the UDT methods can only learn cyclic tracking over short frame intervals, so it is difficult for them to learn the large-scale motion and deformation that objects undergo over long intervals. Third, the UDT methods do not focus on designing a deep-network-based online tracking module.

[0005] As noted above, the existing technology still suffers from technical problems such as the inability to perform bounding-box regression, the inability to mine the large deformations of moving objects, and the lack of a tracking module that is updated online during training. To date, no description or report of technology similar to the present invention has been found, nor has similar information been collected domestically.
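The cycle-consistency signal criticized above can be illustrated with a minimal sketch. This is a generic illustration of the UDT-style idea, not code from the patent: track a box forward through a clip and then backward to the first frame, and use the drift between the returned box and the starting box as the unsupervised training signal. The one-step tracker `track_fn` is a hypothetical callable standing in for the learned model.

```python
import numpy as np

def cycle_consistency_error(track_fn, frames, init_box):
    """Track init_box forward through `frames`, then backward to the
    first frame, and return the Euclidean drift between the returned
    box and the starting box. `track_fn(frame_a, frame_b, box)` is a
    hypothetical one-step tracker that maps a box in frame_a to frame_b."""
    box = init_box
    for a, b in zip(frames[:-1], frames[1:]):                 # forward pass
        box = track_fn(a, b, box)
    rev = frames[::-1]
    for a, b in zip(rev[:-1], rev[1:]):                       # backward pass
        box = track_fn(a, b, box)
    return float(np.linalg.norm(np.asarray(box) - np.asarray(init_box)))
```

A perfect tracker closes the cycle exactly (zero error); any systematic drift accumulates over both passes, which is why short frame intervals dominate the learning signal in this formulation.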




Embodiment Construction

[0111] The embodiments of the present invention are described in detail below. This embodiment is implemented on the premise of the technical solution of the present invention and provides detailed implementation methods and specific operating procedures. It should be noted that those skilled in the art can make several modifications and improvements without departing from the concept of the present invention, and all of these fall within the protection scope of the present invention.

[0112] Figure 1 is a flowchart of a target tracking method based on unlabeled video training provided by an embodiment of the present invention.

[0113] As shown in Figure 1, the target tracking method based on unlabeled video training provided by this embodiment may include the following steps:

[0114] S100: performing unsupervised optical-flow prediction on the original video, extracting a candidate box for each frame in the original video, and obtaining a sequence of ca...
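Step S100 derives candidate boxes from motion. As a minimal sketch of the idea (not the patent's actual algorithm, which uses a learned unsupervised optical-flow predictor), one can threshold the magnitude of a dense flow field and take the tight bounding rectangle of the moving pixels as a candidate box:

```python
import numpy as np

def candidate_box_from_flow(flow, mag_thresh=1.0):
    """Extract one candidate box (x0, y0, x1, y1) from a dense optical-flow
    field of shape (H, W, 2). Pixels whose flow magnitude exceeds
    mag_thresh are treated as 'moving'; the box is the tight bounding
    rectangle of those pixels. Returns None if nothing moves."""
    mag = np.linalg.norm(flow, axis=-1)          # per-pixel flow magnitude
    ys, xs = np.nonzero(mag > mag_thresh)        # coordinates of moving pixels
    if ys.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# synthetic flow: a 20x20 patch moving right inside a static 100x100 frame
flow = np.zeros((100, 100, 2))
flow[30:50, 40:60] = (3.0, 0.0)
print(candidate_box_from_flow(flow))  # (40, 30, 59, 49)
```

Running this per frame yields the candidate-box sequence that the subsequent pseudo-bounding-box construction step consumes.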



Abstract

The invention provides a target tracking method and system based on unlabeled video training. The method comprises the steps of: performing unsupervised optical-flow prediction on an original video, extracting a candidate box for each frame of the original video, and obtaining a candidate box sequence; constructing a pseudo bounding-box sequence for a moving object in the original video based on the candidate box sequence; constructing training samples based on the pseudo bounding-box sequence, inputting the training samples into a naive Siamese network to train it, and generating a preliminary tracking model; performing memory-cycle training on the preliminary tracking model to obtain a target tracking model; and tracking a target in a video to be tracked by using the target tracking model. The invention also provides a corresponding terminal and medium. The cost of manually annotating video data is greatly reduced, and the video data available for training is enriched; without any annotation, a target tracking model based on bounding-box regression is trained from unannotated video.
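The Siamese matching at the heart of the pipeline can be sketched in a few lines. This is a generic illustration of naive Siamese tracking, not the patent's trained network: a template feature map is slid over a search-region feature map, and the correlation score at each offset forms a response map whose peak is the predicted target location.

```python
import numpy as np

def siamese_response(template_feat, search_feat):
    """Naive Siamese matching: slide the template feature map over the
    search feature map and record the correlation score at each offset.
    template_feat: (c, th, tw); search_feat: (c, sh, sw) with sh >= th,
    sw >= tw. Returns a response map of shape (sh-th+1, sw-tw+1)."""
    c, th, tw = template_feat.shape
    _, sh, sw = search_feat.shape
    resp = np.empty((sh - th + 1, sw - tw + 1))
    for y in range(resp.shape[0]):
        for x in range(resp.shape[1]):
            patch = search_feat[:, y:y + th, x:x + tw]
            resp[y, x] = np.sum(patch * template_feat)  # cross-correlation
    return resp
```

In a real tracker both feature maps come from a shared backbone; here, embedding a copy of the template into an otherwise empty search map makes the response peak at the embedding location, which is the behavior the training stages shape.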

Description

Technical Field

[0001] The present invention relates to target tracking technology, and in particular to a target tracking method, system, terminal and medium based on unlabeled video training.

Background

[0002] Object tracking is one of the most fundamental and important research directions in computer vision. Target tracking technology is now widely used in intelligent warehouse management, live sports broadcasting, unmanned aerial vehicles and other fields. Among these methods, deep-learning-based target tracking algorithms have attracted broad attention from industry and academia owing to their good generalization and high tracking accuracy. Such deep-learning-based algorithms are usually trained with supervised learning, which requires a large number of video datasets with accurate object-location annotations as training samples. However, manual labeling of video information is time-consuming and laborious, and ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T 7/246; G06T 7/187; G06T 3/40; G06N 3/04; G06N 3/08; G06K 9/62
CPC: G06T 7/246; G06T 7/187; G06T 3/4023; G06N 3/088; G06T 2207/10016; G06T 2207/20081; G06T 2207/30168; G06N 3/045; G06F 18/214
Inventors: Ma Chao (马超), Zheng Jilai (郑继来), Yang Xiaokang (杨小康)
Owner: SHANGHAI JIAO TONG UNIV