
Space-time diagram convolutional neural network and feature fusion-based human body action classification method

A spatio-temporal graph convolutional neural network and feature fusion technique applied in the field of human action classification. It addresses the problem that models designed for a specific application are difficult to generalize to other applications, and it ensures the accuracy and stability of detection results while improving classification accuracy.

Inactive Publication Date: 2020-07-31
NANJING UNIV OF POSTS & TELECOMM
Cites: 0 | Cited by: 9

AI Technical Summary

Problems solved by technology

However, most existing methods rely on hand-crafted parts or rules to analyze spatial patterns, so a model designed for one specific application is difficult to generalize to other applications.

Method used


Image

  • Space-time diagram convolutional neural network and feature fusion-based human body action classification method

Examples

  • Experimental program
  • Comparison scheme
  • Effect test

Embodiment Construction

[0032] The present invention is further described below in conjunction with the accompanying drawings and specific embodiments. It should be understood that these examples are provided only to illustrate the present invention and are not intended to limit its scope. After reading the present disclosure, those skilled in the art will understand that all modifications of equivalent forms fall within the scope defined by the appended claims of the present application.

[0033] In a specific implementation, a human action classification method based on a spatio-temporal graph convolutional neural network and feature fusion first takes as input a dataset of human skeleton key-point information preprocessed by pose estimation software and obtains the sequence of skeleton key points; features with the same pattern are then selected for feature fusion; at the same time, the coordinates of each human skeleton joint in each frame are used to represent the se...
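
To make the feature-fusion step above concrete, the following is a minimal sketch, assuming the skeleton sequence is stored as a NumPy array and that joint coordinates are fused with derived bone vectors; the PARENTS ordering and the helper fuse_features are illustrative assumptions, not the patent's exact procedure.

    import numpy as np

    # Hypothetical parent index for each of V=18 joints (an OpenPose COCO-style
    # ordering is assumed here purely for illustration; the patent does not fix one).
    PARENTS = [1, 1, 1, 2, 3, 1, 5, 6, 1, 8, 9, 1, 11, 12, 0, 0, 14, 15]

    def fuse_features(joints: np.ndarray) -> np.ndarray:
        """Concatenate joint coordinates with bone vectors (joint minus its parent).

        joints: array of shape (T, V, C) -- T frames, V joints, C coordinates.
        Returns an array of shape (T, V, 2C) with fused features per joint.
        """
        bones = joints - joints[:, PARENTS, :]            # (T, V, C) bone vectors
        return np.concatenate([joints, bones], axis=-1)   # (T, V, 2C)

    # Usage: 300 frames, 18 joints, (x, y) coordinates per joint.
    sequence = np.random.rand(300, 18, 2).astype(np.float32)
    print(fuse_features(sequence).shape)                  # (300, 18, 4)

Concatenating complementary descriptors in this way is one common form of feature fusion before the sequence is fed to a graph convolutional network.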


PUM

No PUM

Abstract

The invention discloses a human body action classification method based on a space-time graph convolutional neural network and feature fusion. The method comprises the steps of constructing a space-time graph of human body motion in a video from the skeleton key-point sequence of the human body, dividing sub-networks in time and space, and training a graph convolutional neural network on the basis of these sub-networks. In addition, to address partial feature redundancy, a feature fusion method is introduced, which improves the accuracy of the model's detection results relative to the original model. The method effectively avoids the problem of feature redundancy and improves the accuracy and robustness of the model for human body action classification.
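
As one way to picture the spatio-temporal graph convolution described in the abstract, the sketch below implements a single block that applies a graph convolution over the skeleton's joints followed by a temporal convolution over frames. The PyTorch layer sizes, adjacency normalization, and the class name STGraphConvBlock are assumptions for illustration, not the patented network.

    import torch
    import torch.nn as nn

    class STGraphConvBlock(nn.Module):
        def __init__(self, in_channels, out_channels, adjacency, t_kernel=9):
            super().__init__()
            # Row-normalized adjacency D^-1 (A + I), stored as a fixed buffer.
            A = adjacency + torch.eye(adjacency.size(0))
            self.register_buffer("A_hat", A / A.sum(dim=1, keepdim=True))
            self.spatial = nn.Conv2d(in_channels, out_channels, kernel_size=1)
            self.temporal = nn.Conv2d(out_channels, out_channels,
                                      kernel_size=(t_kernel, 1),
                                      padding=(t_kernel // 2, 0))
            self.relu = nn.ReLU()

        def forward(self, x):
            # x: (N, C, T, V) -- batch, channels, frames, joints.
            x = self.spatial(x)                               # mix channels per joint
            x = torch.einsum("nctv,vw->nctw", x, self.A_hat)  # aggregate over the skeleton graph
            return self.relu(self.temporal(x))                # convolve along the time axis

    # Usage with a toy three-joint chain graph and a batch of two sequences.
    A = torch.tensor([[0., 1., 0.],
                      [1., 0., 1.],
                      [0., 1., 0.]])
    block = STGraphConvBlock(in_channels=2, out_channels=16, adjacency=A)
    out = block(torch.randn(2, 2, 30, 3))    # (batch, channels, frames, joints)
    print(out.shape)                         # torch.Size([2, 16, 30, 3])

Stacking several such blocks, with separate spatial and temporal sub-networks as the abstract indicates, yields a classifier over skeleton sequences once a pooling layer and a fully connected output layer are appended.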

Description

Technical Field

[0001] The invention relates to a human action classification method based on a spatio-temporal graph convolutional neural network and feature fusion, and belongs to the technical field of human posture detection and recognition in computer vision.

Background Technique

[0002] Human posture detection and classification refers to the process of pattern recognition and classification of people's movements in film and television videos. With the maturity of human posture detection systems such as Microsoft Kinect and OpenPose, the movement trajectories of human key points provide a very good representation for describing actions. Models based on skeleton key points can usually convey important feature information, so this has become an important task in computer vision, especially in research on human action recognition and classification.

[0003] This task requires as input a sequence of human skeleton key points detected ...
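
For illustration of how such a skeleton key-point sequence can be assembled from pose-estimation output, the sketch below assumes an OpenPose-style JSON layout (a "people" list with a flat "pose_keypoints_2d" array of x, y, confidence triples); the function name, joint count, and file pattern are hypothetical and should be adapted to the actual pose estimator.

    import json
    import numpy as np

    def load_skeleton_sequence(frame_files, num_joints=18):
        """Stack per-frame key points into an array of shape (C=3, T, V)."""
        frames = []
        for path in frame_files:
            with open(path) as f:
                data = json.load(f)
            if data.get("people"):
                # Flat [x1, y1, c1, x2, y2, c2, ...] list for the first detected person.
                kp = np.array(data["people"][0]["pose_keypoints_2d"],
                              dtype=np.float32).reshape(num_joints, 3)
            else:
                kp = np.zeros((num_joints, 3), dtype=np.float32)  # frame with no detection
            frames.append(kp)
        seq = np.stack(frames)         # (T, V, 3): x, y, confidence per joint
        return seq.transpose(2, 0, 1)  # (3, T, V), a convenient layout for graph networks

    # Usage (hypothetical file pattern):
    # import glob
    # sequence = load_skeleton_sequence(sorted(glob.glob("video_01/*_keypoints.json")))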

Claims


Application Information

IPC(8): G06K9/00, G06K9/46, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/23, G06V10/462, G06N3/045, G06F18/241, G06F18/253
Inventor: 张懿扬, 陈志, 李玲娟, 张怡静, 赵彤彤, 岳文静
Owner: NANJING UNIV OF POSTS & TELECOMM