A Human Action Classification Method Based on Pose Recognition

A gesture recognition and human action technology applied in the field of image processing. It addresses the problems that contour-based features ignore the details of each body part and cannot accurately represent rich and varied human poses, achieving high classification accuracy across diverse arm movements.

Active Publication Date: 2019-04-19
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

In terms of feature expression based on visual capture technology, human body contours were initially used as pose feature expressions. However, contour features describe poses from an overall perspective, ignoring the details of each part of the body, and cannot accurately represent rich and colorful human poses.

Method used



Examples


Embodiment Construction

[0029] The technical scheme of the present invention is described in further detail below in conjunction with the accompanying drawings:

[0030] Human action classification based on gesture recognition proceeds as follows: first, upper-body pose recognition is performed on the collected human motion pictures in the database to obtain a "stickman model" (that is, skeleton features); then a multi-class SVM is trained on the obtained skeleton features to produce a classifier that can distinguish different actions; finally, the trained classifier is used to classify different human actions. Specifically:
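The training stage above pairs skeleton features with a multi-class SVM. The patent does not disclose its SVM implementation, so the sketch below is only an illustration of that pipeline: a one-vs-rest linear SVM trained by sub-gradient descent on the hinge loss, applied to toy 2-D feature vectors that stand in for skeleton features.

```python
import numpy as np

def train_ovr_svm(X, y, n_classes, lam=0.01, lr=0.1, epochs=200):
    """One-vs-rest linear SVMs trained with sub-gradient descent on the
    hinge loss -- a simple stand-in for the patent's multi-class SVM."""
    W = np.zeros((n_classes, X.shape[1]))
    b = np.zeros(n_classes)
    for c in range(n_classes):
        t = np.where(y == c, 1.0, -1.0)        # +1 for class c, -1 otherwise
        for _ in range(epochs):
            margins = t * (X @ W[c] + b[c])
            mask = margins < 1                  # samples violating the margin
            if mask.any():
                grad_w = lam * W[c] - (t[mask, None] * X[mask]).mean(axis=0)
                grad_b = -t[mask].mean()
            else:
                grad_w = lam * W[c]
                grad_b = 0.0
            W[c] -= lr * grad_w
            b[c] -= lr * grad_b
    return W, b

def predict(W, b, X):
    return np.argmax(X @ W.T + b, axis=1)       # highest-scoring class wins

# usage: toy 2-D "skeleton features" for three well-separated actions
X = np.array([[0.0, 5.0], [0.5, 4.5], [-0.5, 5.5],
              [5.0, 0.0], [4.5, 0.5], [5.5, -0.5],
              [-5.0, -5.0], [-4.5, -5.5], [-5.5, -4.5]])
y = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
W, b = train_ovr_svm(X, y, n_classes=3)
print(predict(W, b, X))
```

In practice one would train on skeleton features extracted from the database images and hold out a test split; the classifier with the highest decision score wins, as in the `predict` helper above.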

[0031] 1. Human motion posture recognition

[0032] 1.1 Graphic structure model

[0033] The invention uses pictorial structures to estimate the human body appearance model, and then performs pose recognition on the resulting body structure model. The specific implementation steps include detecting the position of the human body, highlighting the foreground, and a...
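Pictorial structures model the body as parts connected by spring-like constraints: pose estimation minimizes each part's appearance cost plus pairwise deformation costs over a tree of parts. A minimal sketch of that inference on a three-part chain follows; the grid of candidate locations, the appearance costs, and the quadratic deformation term are illustrative assumptions, not the patent's actual terms.

```python
import numpy as np

# Candidate (x, y) locations for every part on a small toy grid.
locs = [(x, y) for x in range(4) for y in range(4)]

def deformation(li, lj, offset):
    """Quadratic penalty when part j deviates from its ideal offset
    relative to part i -- the classic pictorial-structures spring term."""
    dx = lj[0] - li[0] - offset[0]
    dy = lj[1] - li[1] - offset[1]
    return dx * dx + dy * dy

def infer_chain(app_costs, offsets):
    """Exact minimum-cost inference on a chain of parts (e.g. torso ->
    upper arm -> lower arm) by dynamic programming from leaf to root."""
    n = len(app_costs)
    msg = np.zeros((n, len(locs)))              # messages passed up the chain
    for j in range(n - 1, 0, -1):
        cost_j = app_costs[j] + msg[j]
        for i_idx, li in enumerate(locs):
            msg[j - 1][i_idx] = min(
                cost_j[k] + deformation(li, locs[k], offsets[j - 1])
                for k in range(len(locs)))
    total = app_costs[0] + msg[0]               # root = first part
    best = int(np.argmin(total))
    return locs[best], float(total[best])

# usage: appearance strongly favours torso (1,1), upper arm (1,2),
# lower arm (1,3); the ideal offset between neighbours is (0, 1)
app = np.ones((3, len(locs)))
for p, true_loc in enumerate([(1, 1), (1, 2), (1, 3)]):
    app[p][locs.index(true_loc)] = 0.0
root, cost = infer_chain(app, offsets=[(0, 1), (0, 1)])
print(root, cost)   # the torso lands at (1, 1) with total cost 0.0
```

Because the part graph is a tree, this message-passing search is exact rather than approximate, which is what makes pictorial-structures inference tractable.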



Abstract

The invention discloses a human action classification method based on gesture recognition, comprising the following steps: step 1, perform pose recognition on the upper body of the human figure to obtain skeleton features that represent the position, direction, and size of each upper-body part; step 2, normalize the data in the skeleton features obtained in step 1; step 3, train a multi-class SVM on the normalized skeleton features to obtain a classifier that can distinguish different actions; step 4, use the classifier trained in step 3 to classify the input action. Collected human motion pictures are used as test data, and the experimental results show that the classification accuracy of the invention reaches 97.78%, classifying human actions well.
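Step 2's normalization is not spelled out in this abstract. One common scheme, sketched below under the assumption that skeleton features are 2-D joint coordinates, is to translate the skeleton so the torso midpoint sits at the origin and scale it to unit torso length; the joint indexing here is an assumption for illustration, not the patent's.

```python
import numpy as np

def normalize_skeleton(joints, torso_top=0, torso_bottom=1):
    """Translate the joints so the torso midpoint is the origin and
    scale them so the torso has unit length (illustrative scheme)."""
    joints = np.asarray(joints, dtype=float)    # shape (n_joints, 2)
    top, bottom = joints[torso_top], joints[torso_bottom]
    center = (top + bottom) / 2.0
    scale = np.linalg.norm(top - bottom)
    if scale == 0.0:
        scale = 1.0                             # degenerate skeleton guard
    return (joints - center) / scale

# usage: a toy three-joint skeleton (neck, pelvis, right hand)
sk = [[0.0, 2.0], [0.0, 0.0], [1.0, 1.0]]
print(normalize_skeleton(sk))
```

Normalizing this way makes the features invariant to where the person stands in the image and how large they appear, which is what lets one classifier generalize across pictures.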

Description

technical field

[0001] The invention relates to the technical field of image processing, in particular to a human action classification method based on gesture recognition.

Background technique

[0002] The rapid development of computer network and multimedia technology has created convenient conditions for the storage and transmission of massive visual information such as images, and people can obtain a large amount of picture information from the Internet. However, the growing volume of data also makes it difficult to find the pictures one wants. Websites need to manage this large amount of picture information, classify the pictures, and build indexes so that users can easily obtain the content they require. Users likewise hope to find the picture information they need quickly and effectively, reducing unnecessary waste of time. Therefore, classifying pictures has important practical signifi...

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Patent (China)
IPC (8): G06K9/00, G06K9/34, G06K9/46, G06K9/62
CPC: G06V40/23, G06V10/267, G06V10/40, G06F18/2411, G06F18/214
Inventors: 葛军, 庾晶, 郭林
Owner: NANJING UNIV OF POSTS & TELECOMM