Action Classification Method Based on Joint Spatiotemporal Simple Recurrent Network and Attention Mechanism

A classification method based on a simple recurrent network, applied in the field of pattern recognition. It addresses the problems of limited sequence-processing speed, long training time, and high parameter-tuning cost, overcoming slow computation and improving classification accuracy.

Active Publication Date: 2022-01-18
HANGZHOU DIANZI UNIV


Problems solved by technology

[0006] The above methods have made good use of various recurrent network models for action recognition based on human joint points, but LSTM- and GRU-based methods involve a large number of computations that depend on the time sequence when processing sequences of human joints, and these computations cannot be carried out independently. For example, when an LSTM or GRU computes the hidden state of the current time step, it must first compute the hidden state of the previous time step because of this dependence, which limits the speed of sequence processing. Moreover, as the scale of an LSTM model and its number of hyperparameters grow, training takes longer and longer, and tuning the parameters also consumes more time.
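The speed argument above is the motivation for the SRU family of models: the heavy matrix multiplications do not depend on time and can be computed for all timesteps at once, leaving only cheap elementwise operations in the sequential loop. A minimal sketch of a plain (non-spatiotemporal) SRU forward pass, following the standard SRU formulation rather than the patent's ST-SRU extension, with illustrative random weights:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sru_forward(X, W, Wf, bf, Wr, br):
    """Minimal SRU forward pass over a (T, d) input sequence X.

    The three matrix products below cover every timestep at once --
    they have no temporal dependency -- so, unlike LSTM/GRU, only the
    cheap elementwise loop is sequential.
    """
    T, d = X.shape
    # Heavy computation: batched over all T timesteps in parallel.
    Xt = X @ W                   # candidate values, (T, d)
    F = sigmoid(X @ Wf + bf)     # forget gates,     (T, d)
    R = sigmoid(X @ Wr + br)     # reset gates,      (T, d)

    # Light computation: the only part that must run step by step.
    c = np.zeros(d)
    H = np.empty((T, d))
    for t in range(T):
        c = F[t] * c + (1.0 - F[t]) * Xt[t]             # internal state
        H[t] = R[t] * np.tanh(c) + (1.0 - R[t]) * X[t]  # highway output
    return H

# Usage with random weights and a short sequence.
rng = np.random.default_rng(0)
d, T = 8, 5
W, Wf, Wr = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
bf = br = np.zeros(d)
H = sru_forward(rng.standard_normal((T, d)), W, Wf, bf, Wr, br)
print(H.shape)  # (5, 8)
```

Because `Xt`, `F`, and `R` are precomputed in one shot, the per-step work is O(d) elementwise arithmetic, which is what gives SRU-style models their speed advantage over LSTM and GRU.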




Embodiment Construction

[0018] The action classification method based on the joint spatiotemporal simple recurrent network and attention mechanism of the present invention is described in detail below in conjunction with the accompanying drawings. Figure 1 is the implementation flow chart.

[0019] As shown in Figure 1, the implementation of the method of the present invention mainly comprises three steps: (1) extract features from the joint point data representing the action with a deep learning method; (2) input the features extracted in step (1) to a two-layer ST-SRU model; (3) use the output of the ST-SRU in step (2) to update the state of the global context memory unit, which in turn gates the flow of information into the second-layer ST-SRU of step (2). When the iterative update process of the attention model ends, the final classification result is obtained.
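The three steps above can be sketched schematically. Every function below is a hypothetical stand-in (the feature extractor, the ST-SRU layers, and the gating rule are simplified stubs, not the patent's exact formulation); the sketch only shows how the global context memory iteratively gates the second layer:

```python
import numpy as np

# Hypothetical stand-ins for the patent's components; shapes and
# update rules are illustrative only.
def extract_features(joints):
    """Step (1): deep feature extraction (stub: a fixed linear map)."""
    return joints @ np.full((3, 16), 0.1)

def st_sru(x, gate=None):
    """Step (2): one ST-SRU layer (stub), optionally gated on input."""
    h = np.tanh(x)
    return h if gate is None else gate * h

def attention_iteration(feats, context, n_iters=2):
    """Step (3): iterative global-context gating, as sketched in the text."""
    for _ in range(n_iters):
        h1 = st_sru(feats)                        # first ST-SRU layer
        gate = 1 / (1 + np.exp(-(h1 * context)))  # context gates layer-2 input
        h2 = st_sru(h1, gate)                     # second ST-SRU layer
        context = h2.mean(axis=0)                 # update global context memory
    return context  # fed to a classifier after the last iteration

joints = np.random.default_rng(1).standard_normal((20, 3))  # 20 joints, xyz
context = attention_iteration(extract_features(joints), np.zeros(16))
print(context.shape)  # (16,)
```

In the actual method, the final `context` state would be passed through a softmax classifier to produce the action label; here it is only returned to illustrate the data flow.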

[0020] Each step is described in detail below.

[0021] Step 1: Extract features from joint point dat...



Abstract

The invention relates to an action classification method based on a joint-point spatiotemporal simple recurrent network and an attention mechanism. First, the ordinary SRU model is extended to the spatial dimension, yielding an ST-SRU model that performs iterative calculations in both the temporal and spatial dimensions. Then, a global context attention mechanism is introduced on top of the proposed ST-SRU model, giving the GCA-ST-SRU method. Finally, the proposed method is applied to human action recognition: features of human joints are first extracted by a deep network, and the extracted features are then classified by the GCA-ST-SRU method. The method of the invention reduces training time and improves classification accuracy, with a clear efficiency advantage. Its fast inference speed benefits the design of real-time action recognition systems, makes it suitable for platforms with limited computing power, and gives it broad application prospects in computer vision, intelligent monitoring, human-computer interaction and other fields.

Description

technical field

[0001] The invention belongs to the field of pattern recognition, and is a method for modeling actions represented by joint points using a spatiotemporal simple recurrent network, combined with the advantages of an attention mechanism to classify actions.

Background technique

[0002] Action recognition is widely used in intelligent video surveillance, human-computer interaction, medical assistance, abnormal behavior detection and other fields. Action recognition refers to classifying the behavior of people in a video. A video can be decomposed into multiple consecutive frames, so action recognition can also be regarded as the problem of classifying a sequence of images. A popular research direction in recent years is to capture the three-dimensional spatial coordinates of body joint points while people perform actions, using sensors such as depth cameras, and then to classify the sequence of human joint coordinates. The coo...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00; G06N3/08
CPC: G06N3/08; G06N3/084; G06V40/20; G06V10/462
Inventors: She Qingshan, Mu Gaoyuan
Owner: HANGZHOU DIANZI UNIV