Human body behavior recognition system based on graph convolutional neural network

A graph convolutional neural network and recognition system technology, applied in the field of human behavior recognition, which addresses the difficulty of extracting deep features from skeleton data with a conventional CNN and the resulting loss of key information, thereby improving recognition accuracy.

Pending Publication Date: 2022-01-21
HARBIN UNIV OF SCI & TECH


Problems solved by technology

[0003] Since skeleton information is represented by the three-dimensional coordinates of a set of joints, a traditional CNN cannot be applied directly for deep feature extraction. Existing CNN-based methods generally convert the skeleton data into an image, mapping the spatial and temporal information onto image attributes such as texture and color, and then extract features; however, this conversion causes the loss of some key information.
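To illustrate the alternative described here, the sketch below builds a small skeleton graph and applies one graph-convolution step directly to raw 3D joint coordinates, with no rasterization to an image. The 5-joint layout, edges, and random weights are hypothetical toy values (not the NTU RGB+D 25-joint layout or any trained model).

```python
import numpy as np

# Hypothetical toy skeleton: 5 joints connected by 4 bones.
num_joints = 5
edges = [(0, 1), (1, 2), (1, 3), (1, 4)]

# Adjacency with self-loops, symmetrically normalized:
# A_hat = D^{-1/2} (A + I) D^{-1/2}
A = np.eye(num_joints)
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
deg = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(deg, deg))

# One graph-convolution layer on the raw 3D coordinates: H = ReLU(A_hat @ X @ W).
# Each joint aggregates features from its graph neighbors, so spatial structure
# is used directly instead of being encoded into image texture or color.
rng = np.random.default_rng(0)
X = rng.normal(size=(num_joints, 3))   # (joints, xyz) input coordinates
W = rng.normal(size=(3, 8))            # learnable weights (random placeholders here)
H = np.maximum(A_hat @ X @ W, 0)
print(H.shape)  # (5, 8): one 8-dimensional feature vector per joint
```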

Method used




Embodiment Construction

[0017] To make the purpose, technical solutions, and advantages of the present invention clearer, the invention is described below through the specific embodiments shown in the accompanying drawings. It should be understood, however, that these descriptions are illustrative only and are not intended to limit the scope of the invention. The structures, proportions, sizes, and the like shown in the drawings of this specification serve only to accompany the content disclosed in the specification, so that those familiar with the art may understand and read it; they are not intended to limit the conditions under which the invention may be implemented and therefore have no substantive technical significance. Any modification of structure, change of proportional relationship, or adjustment of size that does not affect the functions and objectives of the present invention shall still fall within the scope o...



Abstract

The invention discloses a human body behavior recognition system based on a graph convolutional neural network, and relates to the technical field of human behavior recognition. The method comprises the following steps: 1, constructing an undirected spatiotemporal skeleton graph from human joint data collected by a depth sensor, and taking this graph as the input signal of the spatiotemporal graph convolution; 2, feeding the constructed skeleton graph into a spatiotemporal graph convolutional network for action feature extraction, to realize human action recognition; 3, embedding a gated recurrent unit (GRU) network into the spatiotemporal graph convolution to optimize the network, better realizing synchronous extraction of spatial-domain and temporal-domain features; 4, realizing human-computer interaction in a virtual environment using 3D modeling software. The method uses the human skeleton action information collected in the NTU RGB+D dataset, constructs a spatiotemporal skeleton graph network structure for the time-series representation of human skeleton joint positions and their spatial relationships, and achieves end-to-end human skeleton action recognition based on a spatiotemporal graph convolutional neural network.
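The pipeline in steps 1-3 above can be sketched as follows: a spatial graph convolution applied per frame, a temporal aggregation across frames, and a pooled classifier head. All sizes, edges, and weights are hypothetical toy values; in particular, the moving average below stands in for the GRU the patent embeds at the temporal step, purely to keep the sketch self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)
T, V, C = 4, 5, 3   # frames, joints, channels (toy sizes, not the NTU setup)
edges = [(0, 1), (1, 2), (1, 3), (1, 4)]

# Step 1: undirected skeleton graph -> normalized adjacency with self-loops.
A = np.eye(V)
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
deg = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(deg, deg))

X = rng.normal(size=(T, V, C))   # spatiotemporal skeleton: xyz per joint per frame
W_s = rng.normal(size=(C, 8))    # spatial graph-conv weights (random placeholders)

# Step 2: spatial graph convolution applied independently to every frame:
# H[t] = A_hat @ X[t] @ W_s, vectorized over t with einsum.
H = np.einsum("uv,tvc,cf->tuf", A_hat, X, W_s)   # shape (T, V, 8)

# Step 3 (stand-in): temporal aggregation across frames for each joint.
# A moving average over the time axis is used here as a placeholder for the GRU.
K = 3
kernel = np.ones(K) / K
H_t = np.apply_along_axis(
    lambda s: np.convolve(s, kernel, mode="same"), 0, H
)

# Global pooling over frames and joints, then a linear head -> action scores.
W_cls = rng.normal(size=(8, 10))   # 10 hypothetical action classes
scores = H_t.mean(axis=(0, 1)) @ W_cls
print(scores.shape)  # (10,): one score per action class
```

The key design point the abstract describes is this factorization: spatial structure is handled by graph convolution over the skeleton adjacency, while temporal dynamics are handled by a recurrent unit along the frame axis, so both feature types are extracted in one end-to-end network.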

Description

Technical field

[0001] The invention belongs to the technical field of human behavior recognition, and in particular relates to a human behavior recognition system based on a graph convolutional neural network.

Background technique

[0002] Human action recognition covers many research topics in computer vision, including human detection in videos, pose estimation and tracking, and the analysis and understanding of action sequences. It is an important research area due to its wide range of applications, such as patient monitoring, motion analysis, intelligent video surveillance, and human-computer interaction. Traditional human action recognition is mainly based on RGB video, but RGB video has disadvantages such as high computational cost, susceptibility to lighting conditions, and sensitivity to background noise. Thanks to the maturity of human body detection algorithms, more and more research focuses on skeleton-based human action recognition. The human body can...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06V40/20, G06V10/44, G06V10/82, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/047, G06N3/044, G06N3/045
Inventor: 张锐, 张梦珂
Owner HARBIN UNIV OF SCI & TECH