
Human skeleton action recognition method based on difference graph convolutional neural network

A convolutional neural network and action recognition technology, applied in the fields of robot learning and computer vision, that addresses the problems of low recognition accuracy, weak robustness, and heavy computation, achieving high recognition accuracy and improved learning ability.

Active Publication Date: 2021-10-15
TONGJI UNIV
Cites: 13 | Cited by: 0

AI Technical Summary

Problems solved by technology

Existing action recognition algorithms fall into two categories: traditional hand-crafted feature extraction and deep learning. Hand-crafted feature methods include the space-time interest point method, the motion histogram method, and the dense trajectory method; their robustness is weak, and they are strongly disturbed by lighting, occlusion, and background changes. Deep learning algorithms divide into CNN-based and RNN-based approaches; CNN-based models are smaller than RNN models and occupy less computation memory, the main representatives being the two-stream convolution method and the 3D convolution method. Graph convolution has been a popular algorithm in recent years; it has the advantage of strong learning ability, but its computation load is relatively large and it consumes more memory.

Method used



Examples


Embodiment Construction

[0032] The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments. This embodiment is implemented on the premise of the technical solution of the present invention, and a detailed implementation and specific operation process are given, but the protection scope of the present invention is not limited to the following embodiments.

[0033] This embodiment provides a human skeleton action recognition method based on a difference graph convolutional neural network. The training flow chart and the network architecture diagram of the method are shown in figure 1 and figure 2, respectively. The method specifically includes the following steps:

[0034] S1. Preprocess the skeleton data according to the general method: eliminate irrelevant skeleton data, repair incomplete data, and normalize each dimension of the skeleton data to the [0,1] interval.
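The excerpt does not spell out the normalization formula, so the sketch below assumes per-dimension min-max scaling over a whole sequence; the function name, the NumPy dependency, and the NTU RGB+D-style (frames, joints, 3) layout are illustrative assumptions, not specifics from the patent:

```python
import numpy as np

def normalize_skeleton(seq: np.ndarray) -> np.ndarray:
    """Min-max normalize each coordinate dimension of a skeleton
    sequence to the [0, 1] interval.  `seq` has shape
    (frames, joints, dims), e.g. (T, 25, 3) for NTU RGB+D skeletons."""
    mins = seq.min(axis=(0, 1), keepdims=True)      # per-dimension minimum
    maxs = seq.max(axis=(0, 1), keepdims=True)      # per-dimension maximum
    span = np.where(maxs > mins, maxs - mins, 1.0)  # guard constant dimensions
    return (seq - mins) / span
```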

[0035] S2. Preliminarily design the difference graph convolutional neural network architecture: combine graph convolution with convolution, adopt a difference learning mode, and take the errors of different graph convolution layers as the input feedforward of continuous time frames.
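Only the high-level idea of the difference architecture appears in the excerpt, so the block below is a hypothetical sketch rather than the patented network: a plain spatial graph convolution over the skeleton graph whose residual between consecutive time frames is fed forward as an extra signal. The class name DiffGraphConv, the (N, C, T, V) tensor layout, and the zero residual at the first frame are all assumptions:

```python
import torch
import torch.nn as nn

class DiffGraphConv(nn.Module):
    """Hypothetical difference graph-convolution block (not the patented
    design): a spatial graph convolution over the skeleton graph whose
    frame-to-frame residual is fed forward alongside the features.
    Tensors use the common (N, C, T, V) layout: batch, channels,
    time frames, joints."""

    def __init__(self, in_channels: int, out_channels: int, adjacency: torch.Tensor):
        super().__init__()
        self.register_buffer("A", adjacency)          # normalized V x V skeleton graph
        self.theta = nn.Conv2d(in_channels, out_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (N, C_in, T, V)
        h = self.theta(x)                                 # per-joint 1x1 feature transform
        h = torch.einsum("nctv,vw->nctw", h, self.A)      # aggregate over graph neighbors
        # Difference between consecutive frames; zero at the first frame.
        prev = torch.cat([h[:, :, :1], h[:, :, :-1]], dim=2)
        return h + (h - prev)                             # feed the difference forward
```

One plausible motivation for such a difference term is that it adds an explicit motion cue without adding parameters, consistent with the abstract's sub-1 M parameter budget.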



Abstract

The invention relates to a human skeleton action recognition method based on a difference graph convolutional neural network. The method comprises the following steps: S1, preprocessing the skeleton data; S2, preliminarily designing the difference graph convolutional neural network architecture, combining graph convolution and convolution, adopting a difference learning mode, and taking the errors of different graph convolution layers as the input feedforward of continuous time frames; S3, preliminarily selecting the training parameters and back-propagating the errors; S4, carrying out cross-view and cross-subject training and testing on a skeleton data set; S5, fine-tuning the training parameters according to the test accuracy and repeating steps S3 and S4 to obtain high-accuracy training parameters; and S6, fixing the training parameters and fine-tuning the network architecture to obtain high-accuracy network architecture parameters, then carrying out human skeleton action recognition. Compared with the prior art, on the NTU data set both the cross-view (CV) and cross-subject (CS) accuracies are improved while the model stays small, with fewer than 1 M parameters in total, achieving fast and accurate recognition.
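Steps S3 to S5 describe a conventional train / test / tune loop. As a hedged illustration of that loop only (the SGD optimizer, momentum, learning-rate grid, and epoch count below are placeholders, not values from the patent):

```python
import torch

def train_and_evaluate(model, train_loader, test_loader, lr, epochs):
    """Sketch of S3-S4: back-propagate the classification error on the
    training split, then measure accuracy on a cross-view (CV) or
    cross-subject (CS) test split of the skeleton data set."""
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in train_loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()                    # S3: error back-propagation
            opt.step()
    correct = total = 0
    with torch.no_grad():
        for x, y in test_loader:
            correct += (model(x).argmax(dim=1) == y).sum().item()
            total += y.numel()
    return correct / total

# S5: repeat S3-S4 over candidate training parameters and keep the best.
# accuracies = {lr: train_and_evaluate(make_model(), tr, te, lr, epochs=60)
#               for lr in (0.1, 0.01, 0.001)}  # make_model is hypothetical
```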

Description

Technical field

[0001] The invention relates to the fields of robot learning and computer vision, and in particular to a human skeleton action recognition method based on a difference graph convolutional neural network.

Background technique

[0002] As a branch of computer vision, action recognition has a wide range of applications in video surveillance, human-computer interaction, intelligent driving, and other fields. Through the recognition and classification of actions, the purposes of video monitoring, understanding, and action prediction can be achieved. In the field of video security monitoring, it can judge whether a production operation conforms to safety production standards according to human body movement; in the field of human-computer interaction, it can predict the action at the next moment from the current human body movement and provide input for a robot's action decision-making; in the field of intelligent driving, smart cars can adopt differe...


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06N3/045, G06F18/214
Inventors: 刘成菊, 曾秦阳, 陈启军
Owner: TONGJI UNIV