Action recognition method of self-adaptive mode

A self-adaptive action recognition technology, applied in the field of action recognition, which solves the problem of low recognition accuracy and achieves the effect of improving accuracy and avoiding the influence of irrelevant joint points.

Publication date: 2019-11-22 (application pending)
Applicant: XI'AN POLYTECHNIC UNIVERSITY


Examples


Embodiment 1

[0065] This embodiment is carried out on the public UTKinect-Action dataset, and the results on the UTKinect-Action dataset are compared with those of the HO3DJ method and the CRF method.

[0066] The UTKinect-Action dataset contains 10 actions captured by a Kinect sensor. Each action is performed by 10 subjects, and each subject repeats each action twice (10 × 10 × 2 = 200 recordings), of which 199 valid action sequences are available. Actions in the UTKinect-Action dataset are characterized by high clustering and viewpoint variation.

[0067] The comparison between this method and the other methods is shown in Table 1. The recognition accuracy of this method is 4.89% higher than that of the HO3DJ method and 4.09% higher than that of the CRF method.

[0068] Table 1 Comparison of the recognition accuracy of each action in the UTKinect-Action dataset

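The body of Table 1 is not reproduced in this text. Purely as an illustration (not from the patent), a per-action accuracy comparison of this kind could be tabulated from predicted and ground-truth labels as in the following Python sketch; the action names and label arrays below are hypothetical.

```python
# Hypothetical sketch: computing per-action recognition accuracy, as a
# Table-1-style comparison would require. All labels below are made up.
import numpy as np

def per_action_accuracy(y_true, y_pred, actions):
    """Return {action: fraction of its test samples recognized correctly}."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    acc = {}
    for a in actions:
        mask = (y_true == a)  # test samples whose ground truth is action a
        acc[a] = float((y_pred[mask] == a).mean()) if mask.any() else float("nan")
    return acc

actions = ["walk", "sit down", "stand up", "pick up"]   # hypothetical subset
y_true = ["walk", "walk", "sit down", "stand up", "pick up", "pick up"]
y_pred = ["walk", "walk", "sit down", "stand up", "walk", "pick up"]

for action, acc in per_action_accuracy(y_true, y_pred, actions).items():
    print(f"{action:10s} {acc:.2%}")
```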

Embodiment 2

[0071] Table 2 Action modes and recognition rates of each action in the MSR Action3D dataset


[0073] This embodiment is carried out on the MSR Action3D dataset, and the results on the MSR Action3D dataset are compared with those of the HO3DJ, Profile HMM and Eigenjoints methods.

[0074] The MSR Action3D dataset contains 20 actions performed by 10 subjects, each action repeated 3 times, for a total of 557 action sequences. The 20 actions are divided into three subsets, AS1, AS2 and AS3, as shown in Table 2; each subset contains 8 actions. Among them, AS1 and AS2 group similar actions, while AS3 is relatively complex. Table 3 shows the comparison results between the method proposed in the present invention and the other methods on the MSR Action3D dataset. It can be seen from the experimental results that, on the MSR Action3D dataset, the action recognition rate is improved compared with the traditional Eigenjoints action recognition method by …9%, the act…
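The subset memberships of Table 2 are likewise not reproduced here. As an assumption-labeled sketch only, per-subset accuracy under the AS1/AS2/AS3 protocol could be averaged as follows; the subset contents below are placeholders, not the actual Table 2 assignments.

```python
# Hypothetical sketch: averaging recognition accuracy over the AS1/AS2/AS3
# protocol of MSR Action3D. Subset memberships are placeholders (Table 2 is
# not reproduced here); each real subset contains 8 of the 20 actions.
subsets = {
    "AS1": {0, 1, 2, 3, 4, 5, 6, 7},        # placeholder action IDs
    "AS2": {8, 9, 10, 11, 12, 13, 14, 15},
    "AS3": {4, 7, 12, 15, 16, 17, 18, 19},
}

def subset_accuracy(per_action, subsets):
    """per_action: {action_id: accuracy}; returns the mean accuracy over
    the actions belonging to each subset."""
    return {
        name: sum(per_action[a] for a in ids) / len(ids)
        for name, ids in subsets.items()
    }

per_action = {a: 0.9 for a in range(20)}    # dummy per-action accuracies
print(subset_accuracy(per_action, subsets))
```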

Abstract

The invention provides an action recognition method of a self-adaptive mode. The method comprises the following steps: step 1, collecting a human body motion video, extracting the position information of the human body skeleton points in each frame, and forming a whole-body skeleton point action sequence; step 2, reading a half-body skeleton point action sequence from the whole-body skeleton point action sequence, and performing data processing to obtain a half-body action posture matrix sequence Sp with a half-body action conformity s1, and a whole-body action posture matrix sequence Sh with a whole-body action conformity s2; step 3, if s1 is not greater than s2, taking Sh as the input data, otherwise taking Sp as the input data, and performing action recognition on the input data with a support vector machine (SVM) to obtain the recognition result. By adopting different action modes for recognition, the method selects the optimal skeleton point set to represent the action posture features, effectively avoids the influence of irrelevant joint points on action quality evaluation, and improves the accuracy of action recognition.
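As a rough illustration of steps 1-3 (not the patent's actual implementation), the following Python sketch wires the adaptive selection between the half-body sequence Sp and the whole-body sequence Sh to an SVM classifier. The conformity measure, the half-body joint indices, and the feature encoding are all assumptions, since this page does not reproduce the patent's formulas; skeleton data is simulated with NumPy.

```python
# Hedged sketch of the adaptive-mode pipeline (steps 1-3 of the abstract).
# Conformity scores s1/s2, joint indices and the feature encoding are
# placeholder assumptions; skeleton data is simulated with NumPy.
import numpy as np
from sklearn.svm import SVC

N_FRAMES, N_JOINTS = 30, 20          # assumed sequence length / Kinect joints
HALF_BODY = list(range(10))          # hypothetical half-body joint indices
FEAT_LEN = N_FRAMES * N_JOINTS * 3   # fixed feature length for the SVM

def conformity(posture_seq):
    """Placeholder conformity score (the patent's formula is not given here):
    mean per-joint motion energy of the sequence."""
    return float(np.mean(np.sum(np.diff(posture_seq, axis=0) ** 2, axis=-1)))

def select_input(skeleton_seq):
    """Steps 2-3: build Sp (half-body) and Sh (whole-body) posture matrix
    sequences, then keep Sh if s1 <= s2, otherwise keep Sp."""
    Sp = skeleton_seq[:, HALF_BODY, :]
    Sh = skeleton_seq
    s1, s2 = conformity(Sp), conformity(Sh)
    return Sh if s1 <= s2 else Sp

def to_feature(posture_seq):
    """Flatten and zero-pad so half- and whole-body inputs share one feature
    space (a sketch simplification, not part of the patent)."""
    v = posture_seq.reshape(-1)
    out = np.zeros(FEAT_LEN)
    out[: v.size] = v
    return out

# Simulated training data: (sequences, frames, joints, xyz) plus action labels.
rng = np.random.default_rng(0)
X_seqs = rng.normal(size=(40, N_FRAMES, N_JOINTS, 3))
y = rng.integers(0, 4, size=40)

clf = SVC(kernel="rbf")              # step 3: SVM action recognition
clf.fit([to_feature(select_input(s)) for s in X_seqs], y)
print(clf.predict([to_feature(select_input(X_seqs[0]))]))
```

The zero-padding is one simple way to let the two action modes share a single classifier; the patent itself only specifies that the selected posture matrix sequence is fed to the SVM.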

Description

Technical Field

[0001] The invention belongs to the technical field of action recognition, and relates to an action recognition method of a self-adaptive mode.

Background Art

[0002] Action recognition technology has been widely used in many fields, such as rehabilitation training, smart homes and somatosensory games. With the rapid development of computer vision, more and more scholars have devoted themselves to research on human action recognition. For action recognition, the extraction and representation of human action features are the premise and the key, as well as the difficulty and the focus. Using a single action pattern to recognize all actions cannot reduce the errors caused by the intra-class variability of the same action and the inter-class similarity between different actions, which results in low action recognition accuracy.

Summary of the Invention

[0003] The object of the present invention is to provide an action recognition method of a self-adaptive mode, so as to solve the problem of low recognition accuracy in the prior art. …

Application Information

IPC(8): G06K 9/00; G06K 9/62
CPC: G06V 40/20; G06V 20/46; G06F 18/2411
Inventors: 谷林; 王婧
Owner: XI'AN POLYTECHNIC UNIVERSITY