
Human body action recognition method

A human body action recognition technology, applied in character and pattern recognition, instruments, computer components, etc. It addresses the problems that hard-quantized action descriptions are inflexible, that training cannot be supervised by sample labels, and that the effectiveness and discrimination of action descriptions are reduced.

Active Publication Date: 2019-08-13
SUZHOU UNIV

AI Technical Summary

Problems solved by technology

However, with the traditional hard quantization method it is difficult to measure and optimize the clustering result, and each feature can only belong to a single cluster center, which makes the action description insufficiently flexible.
In addition, the clustering and histogram quantization processes are separated into two stages, so this type of method is not end-to-end and its training cannot be supervised by sample labels, which further reduces the effectiveness and discrimination of the action description.
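
For concreteness, the two-stage pipeline criticized above can be sketched as follows: an unsupervised k-means pass builds the codebook, and a separate hard-assignment pass builds the histogram, so every feature votes for exactly one cluster center and sample labels never influence either stage. This is only a minimal numpy illustration under assumed dimensions and cluster count; none of the names or values come from the patent.

```python
import numpy as np

def kmeans(features, k, iters=50, seed=0):
    """Stage 1: plain k-means learns k cluster centers with no label supervision."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), size=k, replace=False)].copy()
    for _ in range(iters):
        # hard assignment: each feature belongs to exactly one nearest center
        dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return centers

def hard_histogram(sample_features, centers):
    """Stage 2: each feature votes for a single codeword; counts are then normalized."""
    dists = np.linalg.norm(sample_features[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    hist = np.bincount(labels, minlength=len(centers)).astype(float)
    return hist / max(hist.sum(), 1.0)

# Toy usage: 1000 pooled frame features of dimension 60, codebook of 32 words.
pooled = np.random.randn(1000, 60)
codebook = kmeans(pooled, k=32)
descriptor = hard_histogram(np.random.randn(120, 60), codebook)
```

Because the codebook is frozen before any histogram is computed, no gradient or label signal can flow back into the clustering, which is exactly the end-to-end limitation the invention targets.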



Examples


Embodiment Construction

[0123] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0124] As shown in Figure 1, the human body action recognition method includes the following steps:

[0125] 1. The action sample set contains 200 samples in total, covering 10 action categories with 20 samples each. Three quarters of the samples in each category are randomly selected as the training set and the remaining quarter forms the test set, giving 150 training samples and 50 tes...
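
A small sketch of the per-class split in this step, in plain Python; the sample ids and the split_action_set helper are hypothetical, and only the 3/4 : 1/4 per-category ratio comes from the text.

```python
import random

def split_action_set(samples_by_class, train_ratio=0.75, seed=0):
    """Per-class random split: 3/4 of each category to training, the rest to testing."""
    rng = random.Random(seed)
    train, test = [], []
    for label, samples in samples_by_class.items():
        shuffled = list(samples)
        rng.shuffle(shuffled)
        cut = int(len(shuffled) * train_ratio)      # 15 of the 20 samples per class
        train += [(s, label) for s in shuffled[:cut]]
        test += [(s, label) for s in shuffled[cut:]]
    return train, test

# 10 action categories with 20 sample ids each -> 150 training / 50 test samples
dataset = {c: list(range(c * 20, (c + 1) * 20)) for c in range(10)}
train_set, test_set = split_action_set(dataset)
print(len(train_set), len(test_set))                # 150 50
```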



Abstract

The invention provides a human body action recognition method. The method comprises the following steps: extracting the movement of each skeleton joint point of an action sample between adjacent frames to serve as the dynamic features of the action sample; performing spatial multi-scale division on the dynamic features to obtain sub-feature sets; for each sub-feature set, forming the motion features of all skeleton joint points in the same frame into a vector; extracting the frame feature vectors of the sub-feature sets of all training samples, and performing clustering to obtain the cluster centers; inputting the feature vectors of all frames of the action sample into the probability distribution neurons constructed for each sub-feature set, and accumulating all outputs on each probability distribution neuron to obtain a histogram expression; performing temporal multi-scale division on the sub-feature sets to obtain time multi-scale histograms; forming a space-time multi-scale soft quantization histogram; forming a space-time multi-scale soft quantization network; and training the space-time multi-scale soft quantization network, and inputting the test samples into the trained network model to realize action recognition.
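
To make the soft-quantization step in the abstract concrete, the following is a hedged numpy sketch for a single sub-feature set: each probability distribution neuron is modeled here as a Gaussian bump around a cluster center, and the per-frame responses are accumulated over time into a histogram. The Gaussian form, the shared sigma, and the per-frame normalization are assumptions for illustration rather than the patent's exact neuron definition; since the centers and sigma are differentiable parameters, a layer of this kind can be trained end-to-end inside the space-time multi-scale soft quantization network described above.

```python
import numpy as np

def soft_quantization_histogram(frame_vectors, centers, sigma=1.0):
    """frame_vectors: (T, D) frame feature vectors of one sub-feature set.
    centers: (K, D) cluster centers that initialize the K neurons.
    Returns a length-K soft histogram."""
    # squared distance from every frame to every center: shape (T, K)
    d2 = ((frame_vectors[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    resp = np.exp(-d2 / (2.0 * sigma ** 2))           # Gaussian neuron responses
    resp /= resp.sum(axis=1, keepdims=True) + 1e-12   # each frame contributes a total of 1
    hist = resp.sum(axis=0)                           # accumulate responses over all frames
    return hist / hist.sum()                          # normalized soft histogram

# Toy usage: 40 frames, 60-dimensional frame vectors, 32 probability distribution neurons.
T, D, K = 40, 60, 32
hist = soft_quantization_histogram(np.random.randn(T, D), np.random.randn(K, D))
```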

Description

Technical Field

[0001] The invention relates to a human body action recognition method, which belongs to the technical field of human body action recognition.

Background Technique

[0002] Human action recognition is an important research direction in the field of machine vision, and has a wide range of applications in the fields of human-computer interaction, virtual reality, video retrieval, and security monitoring. With the development of depth cameras and human skeleton extraction algorithms, people can easily obtain information about human skeleton joints. Since the human body can be regarded as a system constructed by the interconnection of rigid skeletal joints, action recognition based on human skeletal joints has significant advantages over image-based action recognition.

[0003] In recent years, many methods based on clustering and statistical models have been proposed for action recognition tasks. The codebook is obtained by clustering the features of all sampl...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62
CPC: G06V40/20, G06F18/2321, G06F18/23213, G06F18/214
Inventors: 杨剑宇, 黄瑶, 朱晨
Owner: SUZHOU UNIV