Human body movement identification method based on Kinect

A Kinect-based human action recognition technology, applied to the input/output of user/computer interaction, computer components, and graphics reading. Claimed effects include an improved human-computer interaction effect, improved recognition speed and accuracy, and reduced system complexity.

Active Publication Date: 2014-01-22
合肥金诺数码科技股份有限公司


Problems solved by technology

[0004] At present, some motion recognition and control methods are based on traditional two-dimensional images. Such methods involve not only general image processing but possibly also image understanding and description, which is relatively complicated. Moreover, they are sensitive to the surrounding lighting environment: poor lighting conditions may reduce recognition accuracy. Clothing, accessories, and other occluders worn on the body obscure some local features of human movements, which degrades recognition or even makes actions unrecognizable; this can be compensated for with artificial intelligence techniques, but at the cost of greater complexity and reduced real-time performance. Finally, for interactive applications, such methods use only two-dimensional image information and therefore cannot robustly handle interactions that involve movement of body parts in the depth direction.




Embodiment Construction

[0023] As shown in figure 1, a Kinect-based human action recognition method comprises the following steps, in order:

[0024] (1) Use Kinect to collect the spatial position information of the target human body's skeleton joint points at different moments;

[0025] (2) For the spatial position information of the target human body's skeleton joint points acquired at each moment, judge whether it matches the preset initial position information of any of the various preset human movements; if so, record this moment as the initial moment and execute step (3); if not, return to step (1);

[0026] (3) Starting from the initial moment, judge whether the spatial position information of the target human body's skeleton joint points acquired within a period of time meets the preset judgment standard of any of the various human actions; if so, execute step (4); if not, return to step (1);

[0027] (4) Identify the action type of the target human body, and then return to step (1).
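The four steps above amount to a small state machine over a stream of per-frame skeleton data: wait for a start pose, then watch a bounded window for the completing pose. Below is a minimal Python sketch of that loop, using plain per-frame joint dictionaries in place of actual Kinect SDK output; the joint names, the `raise_right_hand` action, its pose predicates, and the 30-frame window are all illustrative assumptions, not details from the patent.

```python
# Hypothetical action definitions; the patent only describes the general
# scheme, so the pose predicates and window below are illustrative.
ACTIONS = {
    "raise_right_hand": {
        # step (2): preset initial position -- hand roughly at hip height
        "start": lambda j: j["hand_right"][1] < j["hip_center"][1] + 0.1,
        # step (3): preset judgment standard -- hand ends up above the head
        "finish": lambda j: j["hand_right"][1] > j["head"][1],
        "window": 30,  # frames allowed between start pose and completion
    },
}

def recognize(frames):
    """Run steps (1)-(4) over a sequence of per-frame joint dictionaries.

    Each frame maps joint names to (x, y, z) positions, standing in for
    the skeleton joint data that Kinect would deliver in step (1).
    Returns a list of (action_name, frame_index) recognitions.
    """
    recognized = []
    pending = {}  # action name -> frame index at which its start pose matched
    for t, joints in enumerate(frames):
        for name, spec in ACTIONS.items():
            if name not in pending:
                if spec["start"](joints):        # step (2): start pose matched
                    pending[name] = t            # record the initial moment
            elif spec["finish"](joints):         # step (3): standard met
                recognized.append((name, t))     # step (4): identify the action
                del pending[name]                # then return to step (1)
            elif t - pending[name] > spec["window"]:
                del pending[name]                # timed out; back to step (1)
    return recognized
```

Feeding in synthetic frames in which the right hand rises from hip height to above the head yields one `raise_right_hand` recognition at the frame where the hand first passes the head.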

[0028] Next, the present invention will be further described...



Abstract

The invention provides a Kinect-based human body movement identification method. The method obtains the spatial position information of the skeleton joint points of a target human body through the Kinect, and then identifies the movement type of the target human body by judging whether that spatial position information meets preset judgment standards for various human body movements. The method makes effective use of the Kinect's advantages: the spatial position information of the human skeleton joint points is obtained in real time, no other sensing devices are needed for assistance, and no image processing is carried out. This reduces system complexity, improves the speed and accuracy of human movement identification, and improves the human-computer interaction effect.
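A key advantage the abstract implies over 2-D image methods is that depth-direction motion is directly available from the joint coordinates. A minimal sketch of one possible depth-based judgment criterion, a forward "push" toward the sensor, follows; the joint name and the 0.25 m threshold are assumptions for illustration, not values from the patent.

```python
def depth_push(frames, joint="hand_right", min_delta=0.25):
    """Detect a push toward the sensor using only the depth (z) coordinate.

    `frames` is a sequence of dicts mapping joint names to (x, y, z)
    positions, as a Kinect-style skeleton stream would provide. A push is
    reported when the tracked joint moves at least `min_delta` metres
    closer to the camera between the first and last frame. Joint name and
    threshold are illustrative defaults, not taken from the patent.
    """
    z_start = frames[0][joint][2]
    z_end = frames[-1][joint][2]
    return (z_start - z_end) >= min_delta
```

A purely 2-D method would have to infer this motion indirectly (e.g. from apparent hand size), which is exactly the robustness gap the background section describes.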

Description

Technical Field

[0001] The invention relates to the technical field of computer virtual reality, and in particular to a Kinect-based human action recognition method.

[0002] Background Technique

[0003] With the development of computer technology, in some virtual reality fields such as digital entertainment, the computer usually needs, on the one hand, to transmit stimulation signals to the human senses and, on the other hand, to receive the operator's responses, such as changes in body posture, and then adjust the stimulation signals according to these changes in the position and state of the human body. It is therefore necessary to recognize human actions accurately.

[0004] At present, some motion recognition and control methods are based on traditional two-dimensional images. Such methods involve not only general image processing but possibly also image understanding and description, which is relatively complicated. Moreover, ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
Inventor 田地
Owner 合肥金诺数码科技股份有限公司