A Weakly Supervised Temporal Action Localization Method Based on Action Coherence

A temporal action localization method based on action coherence, applied in the field of computer vision. It addresses problems such as inaccurate temporal annotation and the neglect of the distinct characteristics of RGB and optical flow, thereby avoiding the limitations of existing methods.

Active Publication Date: 2021-08-13
XI AN JIAOTONG UNIV


Problems solved by technology

[0003] Most current temporal action localization methods require precise temporal annotation, which consumes substantial manpower and material resources; moreover, such annotation may be inaccurate because action boundaries are ambiguous. In addition, current methods do not process RGB and optical flow separately and thus ignore their distinct characteristics; the final segment score is obtained from the classification score alone, ignoring the differences between RGB and optical flow themselves, which makes the result heavily dependent on the classification network and difficult to optimize.



Examples


Embodiment

[0069] Referring to Figure 1, a weakly supervised temporal action localization method based on action coherence according to an embodiment of the present invention comprises the following steps:

[0070] Step 1: Process RGB frames and optical flow separately as follows: divide the video into non-overlapping segments of 15 frames each. For each segment, randomly select 3 frames as its representative frames, extract features from these 3 frames with a Temporal Segment Network, and take their average as the segment feature.
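
A minimal Python sketch of this step follows. Here tsn_extract is a hypothetical stand-in for a pretrained Temporal Segment Network feature extractor (the patent does not specify its interface); the same routine would be run once on RGB frames and once on optical flow.

import numpy as np

SEGMENT_LEN = 15        # non-overlapping segments of 15 frames (Step 1)
N_REPRESENTATIVE = 3    # representative frames sampled per segment

def segment_features(frames, tsn_extract, rng=None):
    """Split a video (a sequence of frames) into non-overlapping 15-frame
    segments, sample 3 representative frames per segment, extract a feature
    from each with `tsn_extract`, and average them into one segment feature.

    `tsn_extract(frame) -> np.ndarray` is a hypothetical wrapper around a
    pretrained Temporal Segment Network; any frame-level extractor works.
    """
    rng = rng or np.random.default_rng()
    feats = []
    for s in range(len(frames) // SEGMENT_LEN):
        start = s * SEGMENT_LEN
        picks = rng.choice(SEGMENT_LEN, size=N_REPRESENTATIVE, replace=False)
        # Average the per-frame features into a single segment feature.
        feats.append(np.mean([tsn_extract(frames[start + i]) for i in picks],
                             axis=0))
    return np.stack(feats)  # shape: (n_segments, feature_dim)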

[0071] Step 2: Taking RGB as an example (optical flow is processed in the same way), the RGB features R_s obtained in Step 1 are fed into multiple regression networks. Each regression network consists of a 3-layer 1D convolutional neural network and is assigned a candidate segment length P. To avoid overfitting, the first two layers of the regression network are dilated convolutions with 256 kernels of si...
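
The following PyTorch sketch illustrates one such regression branch. The kernel size and dilation rates are assumptions, since the source text is truncated after "kernels of si...", and the two-channel head regressing start/end offsets is illustrative rather than taken from the patent.

import torch
import torch.nn as nn

class RegressionNet(nn.Module):
    """One regression branch from Step 2: a 3-layer 1D CNN over the
    sequence of segment features, assigned a candidate segment length P.
    Kernel size and dilation rates below are assumptions."""

    def __init__(self, in_dim, P, kernel_size=3):
        super().__init__()
        self.P = P  # candidate action-segment length handled by this branch
        pad = lambda d: d * (kernel_size - 1) // 2  # preserve temporal length
        self.net = nn.Sequential(
            # First two layers: dilated convolutions with 256 kernels.
            nn.Conv1d(in_dim, 256, kernel_size, dilation=1, padding=pad(1)),
            nn.ReLU(),
            nn.Conv1d(256, 256, kernel_size, dilation=2, padding=pad(2)),
            nn.ReLU(),
            # Illustrative head: regress start/end offsets per time point.
            nn.Conv1d(256, 2, kernel_size, padding=pad(1)),
        )

    def forward(self, feats):
        # feats: (batch, in_dim, n_segments) -> (batch, 2, n_segments)
        return self.net(feats)

The features R_s would be fed to several such branches in parallel, one per candidate length P; concrete P values are not given in this excerpt, so any list such as [RegressionNet(in_dim=1024, P=p) for p in (2, 4, 8, 16)] is hypothetical.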



Abstract

The invention belongs to the field of machine vision and discloses a weakly supervised temporal action localization method based on action coherence, comprising: dividing the video into RGB frames and optical flow and processing each separately; proposing hypothetical action segments of different lengths at each time point; regressing the action segments with a convolutional neural network based on the action coherence and classification accuracy of the video; and, for the action segments obtained from the two modalities, combining them with a dedicated module to filter out the final localization results. Given the video category, the invention can locate the action segments belonging to that category in the video.
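
The excerpt does not detail the module that combines the RGB and optical-flow proposals. As a hedged illustration only, the sketch below pools per-modality (start, end, score) proposals and filters them with standard temporal non-maximum suppression, a common choice in temporal localization, not necessarily the patented design.

def temporal_iou(a, b):
    """Temporal IoU between segments a = (start, end, ...) and b."""
    inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = max(a[1], b[1]) - min(a[0], b[0])
    return inter / union if union > 0 else 0.0

def fuse_proposals(rgb_props, flow_props, iou_thr=0.5):
    """Pool (start, end, score) proposals from the RGB and optical-flow
    branches, then keep the highest-scoring non-overlapping ones via
    temporal NMS. A stand-in for the patent's (unspecified) fusion module."""
    pooled = sorted(rgb_props + flow_props, key=lambda p: p[2], reverse=True)
    kept = []
    for p in pooled:
        if all(temporal_iou(p, k) < iou_thr for k in kept):
            kept.append(p)
    return kept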

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision and relates to a weakly supervised temporal action localization method, in particular to a weakly supervised temporal action localization method based on action coherence.

Background Technique

[0002] Temporal action localization is an important computer vision problem with important applications in high-level video understanding tasks such as event detection, video summarization, and video question answering.

[0003] Most current temporal action localization methods require precise temporal annotation, which consumes substantial manpower and material resources; moreover, such annotation may be inaccurate because action boundaries are ambiguous. In addition, current methods do not process RGB and optical flow separately and thus ignore their distinct characteristics; the final segment score is only o...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06K9/00G06N3/04
CPCG06V40/20G06V20/40G06N3/045
Inventor 王乐翟元浩刘子熠
Owner XI AN JIAOTONG UNIV