Motion characteristics extraction method and device

An action feature extraction technology in the field of pattern recognition. It addresses the problems of missing action behavior patterns, low action recognition rates, and poor robustness; by avoiding the instability and inaccuracy of depth information, it improves recognition accuracy and robustness and yields a good feature representation.

Active Publication Date: 2016-01-13
蜂鸟创新(北京)科技有限公司

AI Technical Summary

Problems solved by technology

[0008] The technical problem to be solved by the present invention is to provide a method and device for extracting motion features, so as to solve the problems of low action recognition rate and poor robustness in existing approaches.



Examples


Embodiment 1

[0049] As shown in Figure 1, an action feature extraction method provided in an embodiment of the present invention includes:

[0050] S1, acquiring three-dimensional human skeleton data;

[0051] S2, according to the obtained three-dimensional human skeleton data, storing the organization of the skeleton model as a tree structure in a local coordinate system and constructing a limb tree model (a data-structure sketch follows this list of steps);

[0052] S3, according to the constructed limb tree model, combining motion history images and motion energy images to obtain Hu invariant moments that describe the characteristics of human motion.
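As an illustration of step S2, the following is a minimal sketch of one way to organize a skeleton as a tree of joints whose positions are stored in the local (parent-relative) coordinate system. The `Joint` class, the joint names, and the dictionary input format are assumptions made for illustration, not the data structures of the patent itself.

```python
import numpy as np

class Joint:
    """One node of a limb tree: a joint whose position is stored relative
    to its parent joint (an assumed, illustrative layout)."""
    def __init__(self, name, local_pos, parent=None):
        self.name = name
        self.local_pos = np.asarray(local_pos, dtype=float)  # offset from the parent joint
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def world_pos(self):
        """Accumulate parent offsets to recover the global 3D position."""
        if self.parent is None:
            return self.local_pos
        return self.parent.world_pos() + self.local_pos

def build_limb_tree(frame):
    """Build a small limb tree from one frame of 3D skeleton data.
    `frame` is assumed to map joint names to (x, y, z) world coordinates."""
    root = Joint("hip_center", frame["hip_center"])
    # each child stores only its offset from the parent, i.e. local coordinates
    Joint("spine", np.subtract(frame["spine"], frame["hip_center"]), parent=root)
    # further joints (shoulders, elbows, knees, ...) would be attached in the same way
    return root
```

Storing only parent-relative offsets in each node is one common way to make the representation independent of where the subject stands in the camera frame.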

[0053] The action feature extraction method described in this embodiment of the present invention obtains three-dimensional human skeleton data; according to the obtained data, the organization of the skeleton model is stored as a tree structure in a local coordinate system and a limb tree model is constructed; the constructed limb tree model is then combined with motion history images and motion energy images to obtain Hu invariant moments that describe human motion characteristics, which improves the accuracy and robustness of action recognition.
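To make step S3 concrete, here is a minimal sketch of how a motion history image (MHI), a motion energy image (MEI), and Hu invariant moments might be computed from a sequence of binary motion silhouettes derived from the limb tree. The decay parameter `tau`, the assumption that per-frame limb silhouettes have already been rendered, the standard Bobick-Davis MHI/MEI update, and the standard Hu moment formulas are all illustrative choices; the patent's exact procedure is not reproduced here.

```python
import numpy as np

def motion_history_and_energy(silhouettes, tau=20.0):
    """silhouettes: list of HxW binary arrays (1 where a limb moved in that frame).
    Returns (MHI, MEI): the MHI keeps a decaying record of recent motion and the
    MEI is the union of all motion, following the standard definitions."""
    h, w = silhouettes[0].shape
    mhi = np.zeros((h, w), dtype=float)
    mei = np.zeros((h, w), dtype=bool)
    for sil in silhouettes:
        moving = sil > 0
        mhi = np.where(moving, tau, np.maximum(mhi - 1.0, 0.0))
        mei |= moving
    return mhi, mei.astype(float)

def hu_moments(image):
    """First three Hu invariant moments, computed from normalized central moments."""
    h, w = image.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    m00 = image.sum()
    xc = (x * image).sum() / m00
    yc = (y * image).sum() / m00

    def eta(p, q):
        # normalized central moment: mu_pq / m00^(1 + (p+q)/2)
        mu = (((x - xc) ** p) * ((y - yc) ** q) * image).sum()
        return mu / m00 ** (1.0 + (p + q) / 2.0)

    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4.0 * eta(1, 1) ** 2
    h3 = (eta(3, 0) - 3.0 * eta(1, 2)) ** 2 + (3.0 * eta(2, 1) - eta(0, 3)) ** 2
    return np.array([h1, h2, h3])
```

A feature vector could then be formed, for example, by concatenating the Hu moments of the MHI and the MEI of each limb subtree; whether the patent uses this particular combination is not stated in the visible text.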

Embodiment 2

[0097] The present invention also provides a specific implementation of an action feature extraction device. Because this device corresponds to the specific implementation of the action feature extraction method described above, it achieves the purpose of the present invention by carrying out the process steps of that method. The explanations given for the specific implementation of the method therefore also apply to the specific implementation of the device, and they are not repeated in the following embodiments.

[0098] As shown in Figure 6, an embodiment of the present invention also provides an action feature extraction device, including:

[0099] Obtaining module 101: for obtaining three-dimensional human skeleton data;

[0100] Con...
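For orientation only, the sketch below wires together the three modules named in the abstract (an obtaining module, a building module, and a motion characteristics extraction module). The class name and method names are illustrative assumptions that simply mirror steps S1-S3; they are not taken from the patent text.

```python
class ActionFeatureExtractionDevice:
    """Illustrative wiring of the three modules named in the abstract."""
    def __init__(self, obtaining_module, building_module, extraction_module):
        self.obtaining_module = obtaining_module    # S1: yields 3D human skeleton data
        self.building_module = building_module      # S2: builds the limb tree model
        self.extraction_module = extraction_module  # S3: MHI/MEI -> Hu invariant moments

    def extract_features(self):
        skeleton_frames = self.obtaining_module.acquire()
        limb_tree = self.building_module.build(skeleton_frames)
        return self.extraction_module.hu_features(limb_tree)
```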



Abstract

The invention provides a motion characteristics extraction method and device that can improve motion recognition accuracy and robustness. The method comprises the following steps: obtaining three-dimensional human skeleton data; storing the organization of a skeleton model as a tree structure in a local coordinate system according to the obtained three-dimensional human skeleton data, and building a limb tree model; and combining a motion history image with a motion energy image according to the built limb tree model to obtain Hu invariant moments describing human motion characteristics. The device comprises an obtaining module, a building module and a motion characteristics extraction module. The obtaining module is used for obtaining the three-dimensional human skeleton data; the building module is used for storing the organization of the skeleton model as a tree structure in the local coordinate system according to the obtained three-dimensional human skeleton data and building the limb tree model; and the motion characteristics extraction module is used for combining the motion history image with the motion energy image according to the built limb tree model to obtain the Hu invariant moments describing the human motion characteristics. The motion characteristics extraction method and device are suitable for the technical field of pattern recognition.

Description

Technical field

[0001] The invention relates to the technical field of pattern recognition, and in particular to an action feature extraction method and device.

Background technique

[0002] In 2008, Bill Gates, the founder of Microsoft, proposed the concept of the "natural user interface". He predicted that the mode and interface of human-computer interaction would undergo major changes in the following years: traditional interaction devices such as keyboards and mice would be replaced by more natural touch, voice-controlled and visual interfaces. Devices such as Sony's Morpheus, Google Glass, and Microsoft's Kinect have since given people a sensory experience of virtual worlds. These somatosensory devices represent a huge advance in the field of human-computer interaction.

[0003] Human action feature extraction is an important topic in the field of human-computer interaction. Its purpose is to let the computer describe human actions reasonably, so that it can automa...


Application Information

IPC(8): G06K9/00
CPC: G06V20/64; G06V40/23; G06V40/103
Inventor: 班晓娟, 杨光
Owner: 蜂鸟创新(北京)科技有限公司