A Human Action Classification Method Based on Fusion Features

A human action classification technology, applied to neural learning methods, instruments, biological neural network models, etc. It addresses problems such as small inter-class spatiotemporal differences and the difficulty of extracting discriminative features, achieving the effects of improved classification accuracy and effectiveness and a simplified data collection process.

Active Publication Date: 2022-02-11
HUAIYIN INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

Such methods need to model the spatiotemporal relationship, but the inter-class spatiotemporal differences between similar actions are small, making it difficult to extract discriminative features.

Examples

Embodiment Construction

[0086] The present invention is further clarified below in conjunction with the accompanying drawings and specific embodiments, based on the classification of 500 collected action videos spanning 5 action classes. It should be understood that these embodiments are intended only to illustrate the present invention and not to limit its scope. After reading the present disclosure, modifications of various equivalent forms made by those skilled in the art all fall within the scope defined by the appended claims of the present application.

[0087] As shown in Figure 1, the human action classification method based on fusion features of the present invention comprises the following steps:

[0088] (1) Input multiple labeled human action videos and convert each human action video into a frame sequence. As shown in Figure 2, this specifically includes the following steps (a minimal frame-extraction sketch follows the steps):

[0089] (101) Input the human action video training set AC and the test set Tte...
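As an illustration of this frame-conversion step, the OpenCV sketch below reads a single video file and returns its frames in order. The helper name video_to_frames, the file name, and the sampling step are hypothetical assumptions for illustration, not code from the patent.

# Minimal sketch of step (1): convert one labeled action video into a
# frame sequence. File name and sampling step are illustrative only.
import cv2

def video_to_frames(video_path, step=1):
    """Return every `step`-th frame of the video as a list of BGR arrays."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:                 # end of video stream
            break
        if index % step == 0:
            frames.append(frame)
        index += 1
    cap.release()
    return frames

# Hypothetical usage: one clip from the training set AC.
frames = video_to_frames("ac_clip_001.mp4")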

Abstract

The invention discloses a human action classification method based on fusion features. The method inputs a plurality of labeled human action videos and converts each action video into a frame sequence; uses a pre-trained hourglass human pose estimation model to predict the three-dimensional coordinates of the human joint points in each frame, obtaining a 3D joint-point coordinate data set; projects the three-dimensional joint-point coordinates onto planes; extracts features from the projected data with LSTM and GRU models and fuses the two extracted sets of feature vectors; and trains the human action video classification model on the fused features. Inputting video data into the trained model yields the human action video classification result. Through feature fusion, the method can mine the overall features of human actions, strengthen the model's ability to discriminate features, and classify actions with small inter-class differences more accurately.
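To make the pipeline in the abstract concrete, the PyTorch sketch below shows one plausible form of the two-branch extractor: an LSTM and a GRU each encode the sequence of projected joint-point features, their final hidden states are concatenated to form the fused feature, and a linear head produces class scores. The layer sizes, the 48-dimensional per-frame input, and the 5-class output are illustrative assumptions, not values taken from the patent.

# Sketch of the two-branch feature extractor with fusion, assuming
# concatenation as the fusion operator; all dimensions are illustrative.
import torch
import torch.nn as nn

class FusionActionClassifier(nn.Module):
    def __init__(self, input_dim=48, hidden_dim=128, num_classes=5):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.gru = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                  # x: (batch, frames, input_dim)
        _, (h_lstm, _) = self.lstm(x)      # final LSTM hidden state
        _, h_gru = self.gru(x)             # final GRU hidden state
        fused = torch.cat([h_lstm[-1], h_gru[-1]], dim=1)  # fuse both feature vectors
        return self.head(fused)            # class logits

# Hypothetical usage: 16 clips, 30 frames each, 48-dim projected
# joint-coordinate feature per frame.
logits = FusionActionClassifier()(torch.randn(16, 30, 48))

Concatenation is the simplest realization of the fusion the abstract describes; the patent may use a different fusion operator, in which case only the fused line above would change.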

Description

Technical Field

[0001] The invention belongs to the technical field of data fusion and action classification, and in particular relates to a human action classification method based on fusion features.

Background Technique

[0002] In the field of data fusion and action classification technology, for the data collection problem of human action classification, the existing technology collects people's joint-point coordinates through somatosensory devices or 3D cameras; this requires manual marking of joint-point positions, so the cost of data collection is relatively high. For the feature extraction of human actions, the existing technology mainly uses encoders, decoders, and deep neural networks to mine the spatiotemporal relationships among skeleton sequences. This process requires modeling the spatiotemporal relationship, but the spatiotemporal differences between similar actions are small, so discriminative features are difficult to mine. In view of the pr...

Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V40/20; G06V10/82; G06V20/40; G06V10/80; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V40/20; G06V20/41; G06N3/045; G06F18/253
Inventor: 胡荣林, 董甜甜, 朱全银, 陈青云, 姚玉婷, 邵鹤帅, 施嘉婷, 谢静, 顾晨洁
Owner: HUAIYIN INSTITUTE OF TECHNOLOGY