Human body action recognition method

A human action recognition technology based on action sequences, applied in the field of action recognition. It addresses problems such as slow practical application, rapidly increasing computational complexity of the learning model, and the difficulty of traversing joint-point orderings, and achieves the effect of reducing the impact of data noise.

Pending Publication Date: 2020-10-30
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

[0004] The inventors found that depth maps and human skeleton data are currently used for human behavior recognition, but both still have drawbacks. Traditional depth-map-based human behavior recognition builds depth map datasets from multiple viewpoints and extracts a large number of features, so the computational complexity of the learning model increases rapidly, resulting in slower practical application.



Examples


Embodiment 1

[0026] In a typical implementation of the present disclosure, as shown in Figure 1, a human action recognition method is proposed.

[0027] The method includes the following steps:

[0028] Obtain the coordinate data of the joint points, and establish the distance feature, geometric feature and motion feature of the joint point set;

[0029] Model the spatio-temporal features of the action sequence from multiple perspectives, and use a one-dimensional temporal convolutional network to model the timing information of the action sequence;

[0030] Classify and recognize human actions through spatiotemporal features and time series information.
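The temporal modeling step above can be sketched in a few lines. The function name and the moving-average filter below are illustrative assumptions, not the patent's actual network; this is a minimal numpy sketch of a one-dimensional convolution applied along the time axis of a per-frame feature sequence.

```python
import numpy as np

def temporal_conv1d(seq, kernel):
    """Valid-mode 1D convolution along the time axis.

    seq:    (T, D) array -- one feature vector per frame
    kernel: (K, D) array -- one filter weight per feature dimension
    Returns a (T-K+1, D) array of temporally filtered features.
    """
    T, D = seq.shape
    K = kernel.shape[0]
    out = np.empty((T - K + 1, D))
    for t in range(T - K + 1):
        # weighted sum over a sliding window of K frames
        out[t] = (seq[t:t + K] * kernel).sum(axis=0)
    return out

# Hypothetical toy sequence: 5 frames, 3 features per frame.
rng = np.random.default_rng(0)
seq = rng.standard_normal((5, 3))
kernel = np.full((3, 3), 1.0 / 3.0)   # simple moving-average filter
feats = temporal_conv1d(seq, kernel)
print(feats.shape)  # (3, 3)
```

In a learned model the kernel weights would be trained and the filtered features fed to a classifier; here a fixed averaging kernel just illustrates the sliding-window computation.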

[0031] Specifically, the joint point-based human behavior feature representation includes joint point set distance feature representation, geometric feature representation and motion feature representation.

[0032] For the joint point set distance features:

[0033] First calculate the distance between two joint points to get a sy...
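The patent text breaks off mid-sentence here, but the construction it starts to describe, pairwise distances between joint points within a frame, can be sketched directly. The function name and toy coordinates below are my own assumptions:

```python
import numpy as np

def joint_distance_feature(joints):
    """Pairwise Euclidean distances between joints in one frame.

    joints: (N, 3) array of 3-D joint coordinates.
    Returns the (N, N) symmetric distance matrix and its upper-triangle
    entries flattened into a feature vector of length N*(N-1)//2.
    """
    diff = joints[:, None, :] - joints[None, :, :]   # (N, N, 3) offsets
    dist = np.linalg.norm(diff, axis=-1)             # symmetric, zero diagonal
    iu = np.triu_indices(len(joints), k=1)           # skip the redundant half
    return dist, dist[iu]

# Hypothetical 4-joint frame.
joints = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
dist, vec = joint_distance_feature(joints)
print(vec.shape)  # (6,)
```

Because the matrix is symmetric with a zero diagonal, only the upper triangle carries information, which is why the flattened feature vector has N*(N-1)/2 entries.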



Abstract

The invention provides a human body action recognition method and relates to the field of action recognition. The method comprises the steps of: obtaining coordinate data of the articulation points and establishing articulation point set distance features, geometric features and motion features; modeling the spatial and temporal features of the action sequence from multiple angles, and modeling the time sequence information of the action sequence with a one-dimensional temporal convolutional network; and classifying and identifying human body actions from the spatio-temporal features and time sequence information. By designing multiple feature representations, the method describes the spatial and geometric information of different joint points within the same frame and the temporal motion information of joint points between adjacent frames, so that the spatio-temporal characteristics of human motion are better modeled. A one-dimensional temporal convolutional network models the time sequence information, human action classification is performed in combination with the constructed features, and a good classification and recognition effect is obtained.

Description

technical field

[0001] The present disclosure relates to the field of action recognition, in particular to a human action recognition method.

Background technique

[0002] The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.

[0003] Human motion behavior recognition has important application value in elderly care, physical therapy and rehabilitation, animation and game production, security monitoring, factory human-machine collaboration, and other areas.

[0004] The inventors found that depth maps and human skeleton data are currently used for human behavior recognition, but both still have drawbacks. Traditional depth-map-based human behavior recognition builds depth map datasets from multiple viewpoints and extracts a large number of features, so the computational complexity of the learning model increases rapidly, resulting in slower practical application. In addition, due to the differ...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06N3/045, G06F18/24
Inventors: 刘国良, 李军伟, 张庆徽, 田国会, 刘甜甜
Owner SHANDONG UNIV