Human body motion recognition method based on three-dimensional bone information

A human action recognition and skeleton technology, applied in the field of human action recognition based on three-dimensional skeleton information, addressing problems such as dependence on effective distance, high equipment cost, and susceptibility to lighting conditions, occlusion, and shadows.

Active Publication Date: 2016-10-12
NORTH CHINA UNIVERSITY OF TECHNOLOGY
View PDF · 10 Cites · 87 Cited by

AI Technical Summary

Problems solved by technology

[0006] Most traditional action recognition methods are based on two-dimensional image sequences, which are easily affected by lighting conditions, occlusion, and shadows; good recognition results can be obtained only when the color of the subject's clothing differs greatly from the background or when there is no occlusion. Although some depth cameras can obtain three-dimensional information, the effective distance of a stereo camera depends on the baseline setting and the quality of ambient light in the scene, while a TOF camera relies on the reflection of light, so its effective distance depends on the range over which light can be emitted and received, and its equipment cost is high.


Image

  • Human body motion recognition method based on three-dimensional bone information (3 figures)


Embodiment Construction

[0044] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0045] The human action recognition method based on three-dimensional skeleton information provided by the present invention comprises the following steps:

[0046] S1: Mount the Kinect depth sensor on a camera bracket so that it is level with the ground and at a sufficient vertical height to capture the complete human target in the scene;

[0047] S2: Use the Kinect depth sensor to collect multiple...



Abstract

The invention discloses a human body motion recognition method based on three-dimensional bone information. The method processes the color, depth, and skeleton data streams of a number of individual samples of different sexes and heights performing various motions, and constructs an SVM model corresponding to each motion. The skeleton data stream of a target performing any motion within the acquisition range of a Kinect depth sensor is then acquired; from this stream, the normalized distances between the human skeleton joints and a reference point, together with 14 vector angles, are computed. These data are input to the multiple SVM models, and the motion corresponding to the SVM model with the highest output probability is taken as the motion of the recognition target. Because the Kinect depth sensor is used to acquire images, the method is little affected by lighting conditions, shadows, and other factors; the depth map and skeleton information of a human motion are acquired in real time, and a human target in the scene can be accurately located.
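The per-motion SVM scheme described in the abstract (one model per action, winner chosen by highest output probability) can be sketched as a one-vs-rest classifier bank. This is a minimal illustration, not the patented implementation: the action names, feature dimensionality, and synthetic training data are all assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N_FEATURES = 34                    # assumed: normalized joint distances + 14 vector angles
ACTIONS = ["wave", "kick", "sit"]  # hypothetical action labels

# Synthetic, well-separated training data: one cluster per action,
# standing in for the real skeleton-derived feature vectors.
X = np.vstack([rng.normal(loc=3.0 * i, scale=0.5, size=(40, N_FEATURES))
               for i in range(len(ACTIONS))])
labels = np.repeat(np.arange(len(ACTIONS)), 40)

# One binary SVM per action (one-vs-rest), each producing a probability
# for "this sample is my action".
models = {}
for i, action in enumerate(ACTIONS):
    clf = SVC(kernel="rbf", probability=True, random_state=0)
    clf.fit(X, (labels == i).astype(int))
    models[action] = clf

def recognize(feature_vec):
    """Return the action whose SVM reports the highest positive-class probability."""
    probs = {a: m.predict_proba(feature_vec.reshape(1, -1))[0, 1]
             for a, m in models.items()}
    return max(probs, key=probs.get)

# A sample drawn near the second cluster should be recognized as that action.
sample = rng.normal(loc=3.0, scale=0.5, size=N_FEATURES)
print(recognize(sample))
```

In this sketch, `predict_proba` uses scikit-learn's Platt scaling; the patent does not specify how the SVM output probabilities are calibrated, so that detail is an assumption.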

Description

Technical Field

[0001] The invention relates to the field of human action recognition, and in particular to a human action recognition method based on three-dimensional skeleton information.

Background Technique

[0002] The main task of action recognition is to extract feature data representing different actions from the depth map. When different human bodies perform the same action, there are differences in body shape, clothing, and movement habits; therefore, how to select an effective action feature description is one of the key issues in action recognition. Since human body movements can be roughly divided into upper-limb movements, lower-limb movements, and torso movements, and self-occlusion occurs during the body's own movement, the extracted feature data become redundant, which also affects the subsequent recognition results. Therefore, it is particularly important to select appropriate action feature data to distinguish diffe...
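The feature description discussed above, and named in the abstract (normalized joint-to-reference-point distances and vector angles), can be sketched from 3D joint positions. The joint names, coordinates, and the choice of normalizing by the hip-to-shoulder length are assumptions for illustration; the patent's exact joint set and normalization are not reproduced here.

```python
import numpy as np

# Hypothetical subset of Kinect skeleton joints with assumed 3D positions (meters).
joints = {
    "hip_center": np.array([0.0, 0.0, 2.0]),
    "shoulder":   np.array([0.2, 0.5, 2.0]),
    "elbow":      np.array([0.4, 0.3, 2.0]),
    "hand":       np.array([0.6, 0.1, 2.0]),
}

def normalized_distance(joint, ref, scale):
    """Distance from a joint to the reference point, divided by a body-scale
    length so that subjects of different heights yield comparable features."""
    return np.linalg.norm(joint - ref) / scale

def vector_angle(a, b, c):
    """Angle (radians) at joint b between the bone vectors b->a and b->c."""
    u, v = a - b, c - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards against rounding

# Assumed body-scale length: hip center to shoulder.
scale = np.linalg.norm(joints["shoulder"] - joints["hip_center"])
d = normalized_distance(joints["hand"], joints["hip_center"], scale)
theta = vector_angle(joints["shoulder"], joints["elbow"], joints["hand"])
print(round(d, 3), round(np.degrees(theta), 1))
```

Repeating `vector_angle` over 14 joint triples would produce the 14 vector angles mentioned in the abstract; the specific triples are not given in this excerpt.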


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62
CPC: G06V20/64; G06V40/23; G06F18/2411
Inventors: 叶青, 张丽, 张永梅
Owner: NORTH CHINA UNIVERSITY OF TECHNOLOGY