Human body action classification method based on fusion features

A technology for classifying human actions, applied in neural learning methods, instruments, biological neural network models, etc.

Active Publication Date: 2019-11-08
HUAIYIN INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

This type of method needs to model the spatiotemporal relationship, but the spatiotemporal difference between similar actions is small, making it difficult to mine discriminative features.



Examples


Embodiment Construction

[0086] The present invention is further described below in conjunction with the accompanying drawings and specific embodiments, based on the classification of 500 collected action videos covering 5 action classes. It should be understood that these embodiments are only intended to illustrate the present invention and not to limit its scope. After reading the present invention, modifications of various equivalent forms by those skilled in the art all fall within the scope defined by the appended claims of this application.

[0087] As shown in Figure 1, the fusion-feature-based human action classification method of the present invention comprises the following steps:

[0088] (1) Input multiple labeled human action videos and convert each human action video into a frame sequence. As shown in Figure 2, this specifically includes the following steps:

[0089] (101) Input the human action video training set AC and the test set Tte...
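As a rough illustration of step (1), the conversion of each video into a frame sequence could be sketched with OpenCV as below; the function name, the fixed frame size, and the BGR-to-RGB conversion are illustrative assumptions, not details specified in the patent.

```python
import cv2  # OpenCV for video decoding (illustrative choice, not mandated by the patent)

def video_to_frames(video_path, size=(256, 256)):
    """Decode a video file into an ordered list of resized RGB frames (assumed frame size)."""
    frames = []
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:  # no more frames in the video
            break
        frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV reads frames as BGR by default
        frames.append(cv2.resize(frame, size))
    cap.release()
    return frames
```

Each labeled training or test video would then be mapped to such a frame sequence before pose estimation.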



Abstract

The invention discloses a human body action classification method based on fusion features. The method comprises the steps of: inputting a plurality of labeled human motion videos and converting each motion video into a frame sequence; predicting the 3D coordinates of the human body articulation points in each frame with a pre-trained hourglass human body posture estimation model to obtain an articulation point 3D coordinate data set; projecting the human body articulation point coordinates to a three-dimensional plane; carrying out feature extraction on the projected data with two models, LSTM and GRU, and fusing the two extracted groups of feature vectors; and training a human body action video classification model based on the fused features, then inputting video data into the trained model to obtain a human body action video classification result. Through feature fusion, the method can mine the overall features of human body actions, enhance the discriminative power of the extracted features, and classify actions with small inter-class differences more accurately.
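The two-branch feature extraction and fusion described in the abstract could look roughly like the following PyTorch sketch. The hidden size, the per-frame input dimension (assumed here to be flattened joint coordinates), and the concatenation-plus-linear classification head are illustrative assumptions, not the patent's exact architecture.

```python
import torch
import torch.nn as nn

class FusionActionClassifier(nn.Module):
    """Illustrative two-branch model: LSTM and GRU features are concatenated (fused)
    and passed to a linear classifier. All sizes are assumptions, not from the patent."""
    def __init__(self, in_dim=48, hidden=128, num_classes=5):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, batch_first=True)
        self.gru = nn.GRU(in_dim, hidden, batch_first=True)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):                      # x: (batch, frames, in_dim) projected joint data
        _, (h_lstm, _) = self.lstm(x)          # final hidden state of the LSTM branch
        _, h_gru = self.gru(x)                 # final hidden state of the GRU branch
        fused = torch.cat([h_lstm[-1], h_gru[-1]], dim=1)  # feature fusion by concatenation
        return self.fc(fused)                  # class scores for the action video

# Example: 16 joints x 3 coordinates per frame -> in_dim = 48 (hypothetical numbers)
logits = FusionActionClassifier()(torch.randn(4, 30, 48))
```

Concatenating the final hidden states of the two branches is one simple way to realize the fusion of the two feature-vector groups; the patent may combine them differently.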

Description

Technical Field

[0001] The invention belongs to the technical field of data fusion and action classification, and in particular relates to a human body action classification method based on fusion features.

Background Technique

[0002] In the field of data fusion and action classification, regarding data collection for human action classification, the existing technology collects a person's joint point coordinates through somatosensory devices or 3D cameras, which requires manual marking of joint point positions, so the cost of data collection is relatively high. Regarding feature extraction of human actions, the existing technology mainly uses encoders, decoders, and deep neural networks to mine the spatiotemporal relationships between skeleton sequences. This process needs to model the spatiotemporal relationship, but the spatiotemporal difference between similar actions is small, so it is difficult to mine discriminative features; in view of the pr...


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06V20/41, G06N3/045, G06F18/253
Inventor: 胡荣林, 董甜甜, 朱全银, 陈青云, 姚玉婷, 邵鹤帅, 施嘉婷, 谢静, 顾晨洁
Owner: HUAIYIN INSTITUTE OF TECHNOLOGY