
Graph convolutional neural network action recognition method based on attention mechanism

A convolutional neural network and action recognition technology, applied in the field of graph convolutional neural network action recognition based on an attention mechanism, which can solve problems such as ignoring skeleton temporal information, the high computational complexity of graph learning, and the inability to capture the spatial relationships of nodes, and achieves the effects of reduced information-processing redundancy and a high recognition rate.

Pending Publication Date: 2021-07-16
ZHEJIANG SCI-TECH UNIV

AI Technical Summary

Problems solved by technology

Tang et al. proposed a deep progressive reinforcement learning (DPRL) method to select the most informative frames in the input sequence and used a GCN to learn inter-joint dependencies, but it ignores skeletal temporal information.
Bin et al. proposed a spatio-temporal graph routing (STGR) method for skeleton-based action recognition. The method learns spatiotemporal connectivity, but its graph learning has high computational complexity, and because the spatial graph is built on clusters, each of which has only a single weight, it cannot capture the subtle spatial relationships between nodes.


Embodiment Construction

[0057] The present invention will be further described below in conjunction with the accompanying drawings.

[0058] In the present invention, the flow chart of the graph convolutional neural network action recognition method based on the attention mechanism is shown in Figure 1. The implementation steps are as follows:

[0059] Step 1. Use the residual attention network to label the N attention joints with the highest degree of participation in the action. N can be 16, or another value can be set according to the actual situation:

[0060] A residual attention network is used to extract attention joints from the 3D skeleton information. The core of the residual attention network is a stack of attention modules; each attention module consists of a mask branch and a trunk branch. The trunk branch performs feature processing and can use any network model. The residual attention network takes as input the raw RGB image corresponding to the skeleton information to generate attentio...
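
Paragraph [0060] describes the attention module and the joint selection of Step 1 only at a high level. Below is a minimal illustrative sketch in PyTorch of one such module, using the standard residual-attention combination (1 + M(x)) * T(x), together with a hypothetical top-N joint selection. The trunk/mask architectures, tensor shapes, and the way attention is sampled at joint locations are assumptions, since the patent text does not specify them.

    import torch
    import torch.nn as nn

    class ResidualAttentionModule(nn.Module):
        # Trunk branch T(x): feature processing (any backbone would do; a
        # small conv stack stands in here). Mask branch M(x): soft attention
        # in [0, 1]. The output combines them residually as (1 + M(x)) * T(x),
        # so the mask re-weights trunk features without zeroing them out.
        def __init__(self, channels):
            super().__init__()
            self.trunk = nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
                nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            )
            self.mask = nn.Sequential(  # bottom-up/top-down context gathering
                nn.MaxPool2d(2),
                nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
                nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
                nn.Conv2d(channels, channels, 1), nn.Sigmoid(),
            )

        def forward(self, x):  # x: (batch, channels, H, W), H and W even
            return (1.0 + self.mask(x)) * self.trunk(x)

    def top_n_attention_joints(attn_map, joint_xy, n=16):
        # attn_map: (H, W) attention map; joint_xy: (J, 2) integer pixel
        # coordinates (x, y) of the J skeleton joints projected into the image.
        # Returns the indices of the N joints with the highest attention value.
        scores = attn_map[joint_xy[:, 1], joint_xy[:, 0]]
        return torch.topk(scores, k=n).indices

Only the N selected joints (N = 16 here, matching the value suggested in Step 1) would be carried into the subsequent spatiotemporal feature coding, which is how the method discards joint information that contributes little to recognition.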

Abstract

The invention discloses a graph convolutional neural network action recognition method based on an attention mechanism, relating to the field of human-computer interaction action recognition. The method comprises the steps of labelling, with a residual attention network, the N attention joints with the highest degree of participation in completing the motion, where N can be 16 or another value set according to the actual situation; constructing a three-dimensional skeleton spatiotemporal graph and performing spatiotemporal feature coding on the attention joints; and learning the three-dimensional skeleton spatiotemporal graph with a graph convolutional neural network (GCN) to perform action recognition. Selecting, via the residual attention network, the joints that participate strongly in completing a specific action reduces information-processing redundancy and discards joint information that does not contribute to action recognition; constructing spatiotemporal feature codes for the attention joints based on the spatiotemporal constraints between joints represents their spatiotemporal features more effectively; and, based on the natural graph representation of the human body's spatial structure, a graph convolutional neural network obtains a deep representation of the three-dimensional skeleton spatiotemporal graph so as to recognize actions effectively.
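
As a rough illustration of the last step, here is a minimal graph-convolution sketch over a skeleton spatiotemporal graph, again in PyTorch. The toy edge list, the Kipf-and-Welling-style symmetric normalization, and the layer sizes are assumptions for illustration, not the construction specified by the patent.

    import torch
    import torch.nn as nn

    def normalized_adjacency(edges, num_nodes):
        # Symmetrically normalized adjacency D^-1/2 (A + I) D^-1/2, as used
        # by a standard graph convolutional layer.
        A = torch.eye(num_nodes)                      # self-loops
        for i, j in edges:
            A[i, j] = A[j, i] = 1.0
        d_inv_sqrt = torch.diag(A.sum(dim=1).pow(-0.5))
        return d_inv_sqrt @ A @ d_inv_sqrt

    class SkeletonGCNLayer(nn.Module):
        # One layer of X' = ReLU(A_hat X W) over per-joint feature vectors X.
        def __init__(self, in_dim, out_dim, A_hat):
            super().__init__()
            self.register_buffer("A_hat", A_hat)
            self.linear = nn.Linear(in_dim, out_dim)

        def forward(self, x):                         # x: (num_nodes, in_dim)
            return torch.relu(self.linear(self.A_hat @ x))

    # Toy spatiotemporal graph: N = 16 attention joints over T = 2 frames.
    # Spatial edges link joints within a frame (a chain stands in for the
    # real skeleton); temporal edges link the same joint across frames.
    N, T = 16, 2
    spatial = [(i, i + 1) for i in range(N - 1)]
    edges = [(f * N + i, f * N + j) for f in range(T) for (i, j) in spatial]
    edges += [(i, N + i) for i in range(N)]
    A_hat = normalized_adjacency(edges, N * T)

    layer = SkeletonGCNLayer(in_dim=3, out_dim=64, A_hat=A_hat)  # 3D coords in
    features = layer(torch.randn(N * T, 3))                      # (32, 64) out

In a full pipeline these per-node features would be pooled over the graph and fed to a classifier over the action classes.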

Description

Technical Field

[0001] The invention relates to the field of human-computer interaction action recognition, and in particular to a graph convolutional neural network action recognition method based on an attention mechanism.

Background Technique

[0002] As an important non-contact alternative to traditional contact-based human-computer interaction methods such as keyboards, buttons, and touch screens, vision-based action recognition has become a research hotspot in computer vision. Although vision-based action recognition has been studied for many years, it still cannot be applied at large scale, mainly owing to the following technical difficulties: factors such as illumination changes and cluttered backgrounds make action recognition methods less robust; the high redundancy of depth image information increases the computational complexity of algorithms, which in turn limits the application of action recognition methods; the origina...

Application Information

IPC(8): G06K9/00, G06F17/16, G06N3/04, G06N3/08
CPC: G06F17/16, G06N3/08, G06V40/23, G06N3/045
Inventor: 王洪雁, 张鼎卓, 袁海, 周贺
Owner: ZHEJIANG SCI-TECH UNIV