Behavior recognition method based on graph convolution and capsule neural network

A convolutional neural network and neural network technology, applied in neural learning methods, biological neural network models, character and pattern recognition, etc.; it addresses problems such as the inability of conventional CNNs to draw inferences from a single example and their neglect of the orientation of components and their relative spatial relationships.

Active Publication Date: 2020-07-28
WUHAN UNIV

AI Technical Summary

Problems solved by technology

[0008] Capsule neural network: Traditional convolutional neural networks have an important problem: the orientation of components and their relative spatial relationships do not matter to them. A CNN only cares about whether a feature is present, and its pooling layers discard a great deal of information, such as important location information. Moreover, a CNN can only recognize an object after being fed a large amount of data, and it does not achieve inference in the true sense.
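To make the pooling argument concrete, the following is a minimal NumPy sketch (not from the patent, using made-up feature maps) illustrating how global max pooling returns the same value regardless of where a feature appears, so the spatial arrangement of parts is lost:

```python
import numpy as np

# Hypothetical 4x4 feature maps containing the same "part" activations
# in different spatial arrangements. Global max pooling keeps only the
# strongest response per map, so both arrangements pool to the same value
# and the positional information is lost.
feature_map_a = np.zeros((4, 4))
feature_map_a[0, 0] = 1.0          # one feature in the top-left corner
feature_map_a[3, 3] = 0.8          # another feature in the bottom-right corner

feature_map_b = np.zeros((4, 4))
feature_map_b[3, 3] = 1.0          # same features, positions swapped
feature_map_b[0, 0] = 0.8

print(feature_map_a.max(), feature_map_b.max())   # 1.0 1.0 -> identical pooled outputs
```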



Examples


Embodiment Construction

[0111] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments described below can be combined with each other as long as they do not conflict.

[0112] The present invention uses the NTU RGB+D data set as the source of multi-frame continuous human action images;

[0113] The specific embodiment of the present invention, a behavior recognition method based on graph convolution and a capsule neural network, is described below with reference to Figure 1 to Figure 3 and specifically includes the following steps:

[0114] Step 1: Collect multiple frames...
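The detailed steps are truncated in this excerpt. Based on the abstract's description of building velocity and acceleration space vectors from the joint coordinates, a minimal sketch is given below; the (frames, joints, 3) layout, the NumPy implementation, and the use of simple frame-to-frame differences are assumptions, not necessarily the patent's exact formulation:

```python
import numpy as np

# Hypothetical joint data: T frames, V joints, 3D coordinates (x, y, z).
# NTU RGB+D skeletons have 25 joints per body; the array below is random
# placeholder data for illustration only.
T, V = 30, 25
joints = np.random.rand(T, V, 3).astype(np.float32)

# Assumed formulation: velocity as the frame-to-frame coordinate difference,
# acceleration as the frame-to-frame velocity difference (the patent's exact
# definitions are not shown in this excerpt).
velocity = np.diff(joints, axis=0)        # shape (T-1, V, 3)
acceleration = np.diff(velocity, axis=0)  # shape (T-2, V, 3)

print(joints.shape, velocity.shape, acceleration.shape)
```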



Abstract

The invention provides a behavior recognition method based on graph convolution and a capsule neural network. The method comprises the following steps: obtaining the spatial coordinates of the human body joints (articulation points) in each frame of a continuous human action image sequence through manual annotation, and constructing spatial coordinate vectors of the joints from them; mapping the spatial coordinate vectors into high-dimensional feature vectors through a multi-layer perceptron, and constructing a joint adjacency matrix in combination with an action association principle; constructing velocity space vectors of the joints from the spatial coordinates, and further constructing acceleration space vectors of the joints; using a convolutional neural network to extract features and a capsule neural network for action classification, and constructing a capsule convolutional neural network by connecting the two in series; and training the capsule convolutional neural network over multiple epochs on a training set to obtain the trained network. The method conforms to the characteristics of actual motion, the propagation of features on the graph better matches the real situation, the features are effectively retained for classification, and the recognition capability of the model is improved.
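As a rough illustration of how the components named in the abstract could be wired together, below is a minimal PyTorch sketch: an MLP lifts per-joint coordinate/velocity/acceleration vectors to high-dimensional features, a graph convolution propagates them along a joint adjacency matrix, and a simplified squash-based capsule head scores the action classes. All layer sizes, the 9-dimensional per-joint input, the identity adjacency matrix, and the simplified capsule head are illustrative assumptions, not the patent's actual architecture:

```python
import torch
import torch.nn as nn

def squash(s, dim=-1, eps=1e-8):
    # Capsule "squash" non-linearity: keeps the vector's orientation, bounds its length in [0, 1).
    norm2 = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm2 / (1.0 + norm2)) * s / torch.sqrt(norm2 + eps)

class GraphConv(nn.Module):
    # One graph-convolution layer: propagate joint features along a (normalized) adjacency matrix.
    def __init__(self, in_dim, out_dim, adjacency):
        super().__init__()
        self.register_buffer("A", adjacency)        # (V, V) joint adjacency, assumed pre-normalized
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x):                           # x: (batch, V, in_dim)
        return torch.relu(self.linear(self.A @ x))  # neighbourhood aggregation, then feature transform

class SkeletonCapsNet(nn.Module):
    def __init__(self, adjacency, num_classes=60, num_joints=25,
                 in_dim=9, hid_dim=64, caps_dim=16):
        super().__init__()
        # MLP: lift per-joint coordinate/velocity/acceleration vectors
        # (3 + 3 + 3 = 9 dims here, an assumption) to a higher-dimensional feature space.
        self.mlp = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                 nn.Linear(hid_dim, hid_dim), nn.ReLU())
        self.gcn = GraphConv(hid_dim, hid_dim, adjacency)
        # Simplified capsule head: one capsule vector per class; the class whose
        # capsule has the greatest length is the predicted action.
        self.caps = nn.Linear(num_joints * hid_dim, num_classes * caps_dim)
        self.num_classes, self.caps_dim = num_classes, caps_dim

    def forward(self, x):                           # x: (batch, V, in_dim)
        h = self.gcn(self.mlp(x))
        caps = self.caps(h.flatten(1)).view(-1, self.num_classes, self.caps_dim)
        return squash(caps).norm(dim=-1)            # (batch, num_classes) capsule lengths

# Illustrative usage with placeholder data and a self-loop-only adjacency matrix.
V = 25
model = SkeletonCapsNet(torch.eye(V))
scores = model(torch.randn(8, V, 9))
print(scores.shape)                                 # torch.Size([8, 60])
```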

Description

Technical field

[0001] The invention belongs to the research field of behavior recognition, and in particular relates to a behavior recognition method based on graph convolution and a capsule neural network.

Background technique

[0002] Behavior recognition is an extremely challenging direction in the field of computer vision and has important applications in many areas. In intelligent visual surveillance, in places with high security requirements such as supermarkets and banks, it can detect suspicious human behavior in real time. In human-computer interaction and advanced user interfaces, it is hoped that future robots will be able to understand human behavior and respond accordingly in order to communicate with humans, which can greatly improve human living standards. In content-based retrieval, in the era of information explosion, labeling videos makes it possible to accurately retrieve the desired videos from massive data. In s...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06N3/04; G06N3/08
CPC: G06N3/08; G06V40/23; G06N3/045
Inventor: 蔡贤涛, 王森, 倪波
Owner: WUHAN UNIV