Action feature representation method fusing rotation quantity

A rotation-based action-representation technology in the field of action recognition. It addresses the mutual interference between similar action classes and the limited accuracy and speed of recognition, achieving the effect of avoiding such interference and improving recognition accuracy.

Pending Publication Date: 2019-11-19
XI'AN POLYTECHNIC UNIVERSITY

AI Technical Summary

Problems solved by technology

[0007] The purpose of the present invention is to provide an action feature representation method that fuses rotation quantity, which solves the problem of mutual interference between similar action classes.



Embodiment Construction

[0037] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0038] An action feature representation method that fuses rotation quantity, carried out according to the following steps:

[0039] Step 1: action feature extraction.

[0040] A Microsoft Kinect 2.0 infrared depth sensor is used to collect human skeleton information, comprising the three-dimensional spatial coordinates of multiple skeleton joint points of the human body, together with the topological structure information of the human body with SpineBase as the root node.
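The skeleton data collected in Step 1 can be pictured as a joint tree rooted at SpineBase. The sketch below shows one hypothetical in-memory layout: the joint names and parent/child topology follow the Kinect v2 convention, but the coordinate values are made-up placeholders, not actual sensor output.

```python
# Hypothetical layout of the Step 1 skeleton data.
# Joint names follow the Kinect v2 convention (SpineBase as root);
# coordinates are placeholder values, not real sensor readings.

# Parent of each joint in the skeleton tree (a subset of the 25 Kinect v2 joints).
PARENT = {
    "SpineBase": None,             # root node of the topology
    "SpineMid": "SpineBase",
    "SpineShoulder": "SpineMid",
    "Neck": "SpineShoulder",
    "Head": "Neck",
    "ShoulderLeft": "SpineShoulder",
    "ElbowLeft": "ShoulderLeft",
    "WristLeft": "ElbowLeft",
}

# One frame of 3-D joint coordinates (metres, camera space) -- placeholders.
frame = {
    "SpineBase": (0.00, 0.00, 2.00),
    "SpineMid": (0.00, 0.30, 2.00),
    "SpineShoulder": (0.00, 0.60, 2.00),
    "Neck": (0.00, 0.70, 2.00),
    "Head": (0.00, 0.85, 2.00),
    "ShoulderLeft": (-0.20, 0.60, 2.00),
    "ElbowLeft": (-0.30, 0.35, 2.00),
    "WristLeft": (-0.35, 0.10, 2.00),
}

def children_of(joint):
    """Return the child joints of `joint` in the skeleton topology."""
    return [j for j, p in PARENT.items() if p == joint]
```

Keeping the topology as an explicit parent map makes the later parent/child rotation step (Step 3) a simple dictionary lookup per joint.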

[0041] Step 2: overall feature representation of the action.

[0042] According to the coordinates of the human skeleton joint points obtained in Step 1, the human body posture matrix group is calculated. The posture matrix is given by:

[0043] R_F = (R_{i,j})_{M×M}    (1)

[0044] where R_F is the human body posture matrix...
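The excerpt is truncated before it defines the entries R_{i,j} of Eq. (1). One common reading of such an M×M pose matrix is the pairwise Euclidean distance between joints i and j; the sketch below uses that reading purely as an illustrative assumption, not as the patent's actual definition.

```python
import numpy as np

def pose_matrix(joints):
    """Build an M x M pose matrix R_F = (R_ij) from M joint coordinates.

    `joints` is an (M, 3) array of 3-D joint positions. R_ij is taken
    here to be the Euclidean distance between joints i and j -- an
    assumption, since the excerpt truncates before defining R_ij.
    """
    joints = np.asarray(joints, dtype=float)
    diff = joints[:, None, :] - joints[None, :, :]   # (M, M, 3) pairwise offsets
    return np.linalg.norm(diff, axis=-1)             # (M, M) distance matrix

# Three toy joints placed on the x-axis.
R_F = pose_matrix([[0, 0, 0], [1, 0, 0], [3, 0, 0]])
```

Whatever the exact definition of R_{i,j}, a matrix built from pairwise joint relations is invariant to where the body stands in the camera frame, which is why it serves as an "overall" posture feature.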



Abstract

The invention discloses an action feature representation method fused with rotation quantity, specifically comprising the following steps. Step 1: a Microsoft Kinect 2.0 infrared depth sensor is used to collect human skeleton information and human body topological structure information with SpineBase as the root node. Step 2: a human body posture matrix group is calculated from the coordinates of the human skeleton joint points obtained in Step 1. Step 3: according to the human body topological structure information and the joint coordinates from Step 2, the rotation quantity of each child skeleton joint point relative to its parent skeleton joint point is obtained through quaternion calculation. Step 4: the rotation quantity from Step 3 is combined with the human body posture matrix group from Step 2 to establish the action feature representation fusing the rotation quantity. The method effectively avoids interference between similar action classes and thereby improves the accuracy of action recognition.
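Step 3 of the abstract obtains the rotation of a child joint relative to its parent via quaternion calculation. A minimal sketch of one standard construction, the half-angle quaternion that rotates one bone vector onto another, is shown below; this illustrates the general technique under stated assumptions, not the patent's exact formulation.

```python
import numpy as np

def rotation_quaternion(v_from, v_to):
    """Unit quaternion (w, x, y, z) rotating vector v_from onto v_to.

    Sketches Step 3: the rotation of a child bone relative to its parent
    bone, expressed as a quaternion. Assumes both vectors are non-zero
    and not exactly opposite (the antipodal case needs special handling).
    """
    a = np.asarray(v_from, dtype=float)
    a /= np.linalg.norm(a)
    b = np.asarray(v_to, dtype=float)
    b /= np.linalg.norm(b)
    w = 1.0 + np.dot(a, b)       # scalar part before normalisation
    xyz = np.cross(a, b)         # rotation axis scaled by sin(theta)
    q = np.array([w, *xyz])
    return q / np.linalg.norm(q)

# Rotating the x-axis onto the y-axis: a 90-degree turn about z.
q = rotation_quaternion([1, 0, 0], [0, 1, 0])
```

Unlike Euler angles, quaternions avoid gimbal lock and interpolate smoothly, which is presumably why the method expresses the child-relative-to-parent rotation this way before fusing it with the posture matrix group in Step 4.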

Description

Technical field

[0001] The invention belongs to the technical field of action recognition, and in particular relates to an action feature representation method that fuses rotation quantity.

Background technique

[0002] The development of computer image processing technology places higher demands on the accuracy of action recognition. Action recognition technology is widely used in rehabilitation training, smart homes, somatosensory games and many other areas. With the rapid development of computer vision, more and more scholars are devoted to research on human action recognition. For action recognition, the extraction and representation of human action features are both the premise and the key, as well as the difficulty and the focus.

[0003] Action recognition is of great significance in human-computer interaction. Action recognition technology enables computers to learn and understand human behavior and actions, improving the user experience during ...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62; G06F3/01
CPC: G06F3/011; G06V40/23; G06F18/253
Inventor: 谷林; 王婧
Owner: XI'AN POLYTECHNIC UNIVERSITY