A time sequence behavior fragment generation system and method based on global context information

A context- and behavior-analysis technology, applied in the field of video analysis, which addresses problems such as the inability to obtain global context information, encoding only past information, and failing to consider the importance of the behaviors of different video units.

Active Publication Date: 2019-05-03
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0009] The technical problem to be solved by the present invention is to propose a system and method for generating temporal behavior segments based on global context information, addressing the problems that traditional techniques cannot obtain global context information, can only encode past information, and directly apply average pooling without considering the importance of the behaviors of different video units.

Embodiment Construction

[0051] The present invention aims to propose a temporal behavior segment generation system and method based on global context information, solving the problems that traditional techniques cannot obtain global context information, can only encode past information, and do not consider the importance of the behaviors of different units, instead directly applying average pooling. In the present invention, the bidirectional parallel LSTM module overcomes the shortcomings of existing methods that cannot obtain global context information and can only encode past information; in addition, the temporal behavior segment reordering network based on behavior probability weighs the importance of the behaviors of different video units, overcoming the defect of existing methods that ignore this importance and directly apply average pooling.
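As an illustration of the two mechanisms described above (not taken from the patent itself), the following minimal sketch assumes per-unit features have already been extracted; the module names, dimensions, and framework (PyTorch) are hypothetical. A bidirectional LSTM gives every video unit access to both past and future context, and a behavior-probability-weighted fusion replaces plain average pooling over the units of a candidate segment.

    import torch
    import torch.nn as nn

    class GlobalContextEncoder(nn.Module):
        # Bidirectional LSTM over per-unit features: every time step sees
        # both past and future context (dimensions here are hypothetical).
        def __init__(self, feat_dim=400, hidden_dim=256):
            super().__init__()
            self.bilstm = nn.LSTM(feat_dim, hidden_dim,
                                  batch_first=True, bidirectional=True)
            self.prob_head = nn.Linear(2 * hidden_dim, 1)  # per-unit behavior probability

        def forward(self, unit_feats):                     # unit_feats: (B, T, feat_dim)
            ctx, _ = self.bilstm(unit_feats)               # (B, T, 2*hidden_dim)
            probs = torch.sigmoid(self.prob_head(ctx)).squeeze(-1)  # (B, T)
            return ctx, probs

    def weighted_segment_feature(ctx, probs, start, end):
        # Fuse the unit features inside a candidate segment, weighting each
        # unit by its behavior probability instead of plain average pooling.
        seg_ctx = ctx[:, start:end]                        # (B, L, D)
        w = probs[:, start:end]                            # (B, L)
        w = w / (w.sum(dim=1, keepdim=True) + 1e-6)        # normalize the weights
        return (seg_ctx * w.unsqueeze(-1)).sum(dim=1)      # (B, D)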

[0052] The temporal behavior segment generation system based on global context information...



Abstract

The invention relates to the technical field of video analysis and discloses a temporal behavior segment generation system and method based on global context information, solving the problems of traditional techniques that global context information cannot be obtained, only past information can be encoded, and the behavioral importance of different units is not considered, so that average pooling is applied directly. The system comprises a video unit encoding network, a temporal behavior segment generation network, and a temporal behavior segment reordering network based on behavior probability. Through the bidirectional parallel LSTM module in the temporal behavior segment generation network, the global context information of a video is effectively utilized, overcoming the defect that a temporal convolution layer can only capture limited temporal information and a unidirectional LSTM can only encode past information. The temporal behavior segment reordering network based on behavior probability weighs the importance of the behaviors contained in different video units, so that the features of the temporal behavior segments are fused efficiently. The method is suitable for behavior analysis and localization in videos.
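To make the composition of the three networks named in the abstract concrete, a hypothetical end-to-end data flow might look like the sketch below; unit_encoder, proposal_net, and rerank_net are placeholder callables standing in for the video unit encoding network, the temporal behavior segment generation network, and the behavior-probability-based reordering network respectively.

    def generate_behavior_segments(frames, unit_encoder, proposal_net,
                                   rerank_net, unit_len=16):
        # 1. Video unit encoding network: split the video into fixed-length
        #    units and encode each unit into a feature vector.
        units = [frames[i:i + unit_len] for i in range(0, len(frames), unit_len)]
        unit_feats = [unit_encoder(u) for u in units]
        # 2. Temporal behavior segment generation network (bidirectional
        #    parallel LSTM inside): yields candidate (start, end) segments
        #    plus one behavior probability per unit.
        proposals, unit_probs = proposal_net(unit_feats)
        # 3. Behavior-probability-based reordering network: re-scores each
        #    candidate with probability-weighted feature fusion and sorts
        #    the candidates by the new confidence.
        scored = [(rerank_net(unit_feats, unit_probs, s, e), (s, e))
                  for s, e in proposals]
        return [seg for _, seg in sorted(scored, key=lambda t: t[0], reverse=True)]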

Description

Technical Field

[0001] The invention relates to the technical field of video analysis, and in particular to a system and method for generating temporal behavior segments based on global context information.

Background Technique

[0002] Temporal behavior segment generation means that, given an untrimmed long video, the algorithm must detect the behavior segments in the video, including their start and end times, so as to accurately locate the time periods in which behaviors occur and to filter out irrelevant information. [0003] Existing temporal behavior segment generation methods can be divided into two categories. [0004] The first category uses temporal sliding windows to generate behavior segments. [0005] The second category first divides the video into a collection of video units (a video unit is composed of several frames), then obtains the probability that each video unit contains behavior through an encoder, and finally gathers the...
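The description is truncated at this point. A common way to complete the second category is to group contiguous video units whose behavior probability exceeds a threshold into candidate segments; the sketch below illustrates that assumption only and is not taken from the patent text.

    def group_units_by_probability(unit_probs, threshold=0.5):
        # Group contiguous video units whose behavior probability exceeds a
        # threshold into candidate (start, end) segments, in unit indices.
        # (Illustrative threshold grouping only; the source text is cut off.)
        segments, start = [], None
        for i, p in enumerate(unit_probs):
            if p >= threshold and start is None:
                start = i                      # a run of behavior units begins
            elif p < threshold and start is not None:
                segments.append((start, i))    # the run ends at unit i-1
                start = None
        if start is not None:
            segments.append((start, len(unit_probs)))
        return segments

    # Example: units 2-4 and 7 are likely to contain behavior.
    print(group_units_by_probability([0.1, 0.2, 0.8, 0.9, 0.7, 0.3, 0.1, 0.6]))
    # [(2, 5), (7, 8)]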


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, H04N21/234, H04N21/44
Inventors: 宋井宽, 李涛, 高联丽
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA