An action classification method based on a joint spatio-temporal simple recurrent network and an attention mechanism

A simple recurrent network classification method, applied in the field of pattern recognition, that addresses the problems of long training time, time-step-dependent computation, and high time consumption, and achieves the effect of improving efficiency and classification accuracy

Active Publication Date: 2019-02-22
HANGZHOU DIANZI UNIV
Cites: 11 · Cited by: 17

AI Technical Summary

Problems solved by technology

[0006] The above methods make good use of various recurrent network models for joint-based human action recognition, but LSTM- and GRU-based methods involve a large number of computations that depend on the time step when processing sequences of human joints, and these computations cannot be carried out independently.
For example, when LSTM and GRU compute the hidden state of the current time step, they must, because of this dependence, first compute the hidden state of the previous time step, which limits the speed of sequence processing. As the scale of an LSTM model and the number of its hyperparameters grow, training takes longer and longer, and tuning the parameters also consumes more time.
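The speed argument above is the core idea behind the SRU: its heavy matrix multiplications do not depend on the previous hidden state, so they can be batched over all time steps, leaving only a cheap element-wise recurrence to run sequentially. A minimal NumPy sketch (the weight names and the standard SRU gating equations are my illustration, not the patent's exact formulation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sru_forward(X, W, Wf, bf, Wr, br):
    """Simple Recurrent Unit (SRU) forward pass over a sequence X of
    shape (T, d). All matrix multiplications are independent of the
    time step and are computed in one batch; only the element-wise
    recurrence below runs sequentially."""
    T, d = X.shape
    # Time-independent part: computed for all steps at once.
    X_tilde = X @ W.T               # candidate states, (T, d)
    F = sigmoid(X @ Wf.T + bf)      # forget gates, (T, d)
    R = sigmoid(X @ Wr.T + br)      # reset gates, (T, d)
    # Element-wise recurrence: the only sequential computation.
    c = np.zeros(d)
    H = np.empty((T, d))
    for t in range(T):
        c = F[t] * c + (1.0 - F[t]) * X_tilde[t]   # internal state
        H[t] = R[t] * c + (1.0 - R[t]) * X[t]      # highway output
    return H

rng = np.random.default_rng(0)
d, T = 8, 5
H = sru_forward(rng.standard_normal((T, d)),
                rng.standard_normal((d, d)),
                rng.standard_normal((d, d)), rng.standard_normal(d),
                rng.standard_normal((d, d)), rng.standard_normal(d))
print(H.shape)  # (5, 8)
```

In contrast, an LSTM's gate computations take the previous hidden state as input, so none of its matrix multiplications can be hoisted out of the time loop.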

Method used



Examples


Embodiment Construction

[0018] The action classification method based on a joint spatio-temporal simple recurrent network and attention mechanism of the present invention is described in detail below in conjunction with the accompanying drawings. Figure 1 shows the implementation flow chart.

[0019] As shown in Figure 1, the implementation of the method of the present invention comprises three main steps: (1) extract features from the joint point data representing an action using a deep learning method; (2) input the features extracted in step (1) into a two-layer ST-SRU model; (3) use the output of the ST-SRU in step (2) to update the state of a global context memory unit, which in turn gates the flow of information into the second-layer ST-SRU of step (2). When the iterative update process of the attention model ends, the final classification result is obtained.
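The three steps above can be sketched end to end. The stand-in layer, the gating form, and all names below are assumptions for illustration; the patent's exact ST-SRU and GCA update equations are not reproduced in this summary:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Minimal stand-in for an ST-SRU layer: any sequence model
# mapping features of shape (T, d) to (T, d) would fit here.
def st_sru_layer(X, W):
    return np.tanh(X @ W)

def gca_st_sru_classify(X, W1, W2, Wc, Wcls, n_iters=2):
    """Sketch of the flow: (1) features X from a deep network,
    (2) two ST-SRU layers, (3) a global context memory that gates
    the second layer's input and is refined iteratively."""
    H1 = st_sru_layer(X, W1)                       # first layer, (T, d)
    F = H1.mean(axis=0)                            # init global context
    for _ in range(n_iters):
        scores = softmax(H1 @ F)                   # attention over steps
        gate = sigmoid(H1 @ Wc * scores[:, None])  # gating signal, (T, d)
        H2 = st_sru_layer(gate * H1, W2)           # gated second layer
        F = (scores[:, None] * H2).sum(axis=0)     # update global context
    return softmax(F @ Wcls)                       # class probabilities

rng = np.random.default_rng(1)
T, d, n_classes = 6, 8, 4
p = gca_st_sru_classify(rng.standard_normal((T, d)),
                        rng.standard_normal((d, d)),
                        rng.standard_normal((d, d)),
                        rng.standard_normal((d, d)),
                        rng.standard_normal((d, n_classes)))
print(p.shape)  # (4,)
```

The key design point carried over from the text is that the global context memory is not static: it is re-estimated from the attended second-layer outputs on each iteration before the final classification.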

[0020] Each step is described in detail below.

[0021] Step 1: Extract features from joint point dat...



Abstract

The invention relates to an action classification method based on a joint spatio-temporal simple recurrent network and an attention mechanism. First, the SRU model is extended in the spatial dimension, and an iterative computation method for the ST-SRU model is given. Then, the proposed ST-SRU model is combined with a global contextual attention mechanism to form the GCA-ST-SRU method. Finally, the proposed method is applied to human action recognition: the features of human joints are first extracted by a deep network, and the GCA-ST-SRU method then recognizes actions from the extracted features. The method of the invention reduces training time and improves classification accuracy, with a clear efficiency advantage. Its fast inference speed favors the design of real-time action recognition systems, makes it suitable for running on platforms with limited computing power, and gives it wide application prospects in computer vision, intelligent monitoring, human-computer interaction and other fields.

Description

Technical field

[0001] The invention belongs to the field of pattern recognition, and is a method that models actions represented by joint points with a spatio-temporal simple recurrent network, combining the advantages of an attention mechanism to classify actions.

Background technique

[0002] Action recognition is widely used in intelligent video surveillance, human-computer interaction, medical assistance, abnormal behavior detection and other fields. Action recognition refers to classifying the behavior of persons in a video. A video can be decomposed into multiple consecutive pictures, so action recognition can also be regarded as the problem of classifying a sequence of pictures. A popular research direction in recent years is to capture the three-dimensional space coordinates of body joint points while people perform actions, through sensors such as depth cameras, and then classify the sequence of human joint point coordinates. The coo...
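The joint-sequence representation described above can be made concrete with a small shape example. The frame and joint counts are illustrative only (25 joints matches, e.g., the NTU RGB+D skeleton format), and the flattening step is one common convention, not necessarily the patent's:

```python
import numpy as np

# A skeleton action sample: T frames, J joints, 3D coordinates per joint.
T, J = 30, 25
rng = np.random.default_rng(2)
sample = rng.standard_normal((T, J, 3))

# Flatten each frame into one feature vector before feeding a
# sequence model such as an SRU: shape (T, J*3).
sequence = sample.reshape(T, J * 3)
print(sequence.shape)  # (30, 75)
```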

Claims


Application Information

Patent Timeline: no application data
Patent Type & Authority: Applications (China)
IPC(8): G06K9/00; G06K9/46; G06N3/08
CPC: G06N3/08; G06N3/084; G06V40/20; G06V10/462
Inventor: 佘青山 (SHE Qingshan), 穆高原 (MU Gaoyuan)
Owner: HANGZHOU DIANZI UNIV