Human motion recognition feature expression method based on depth mapping

A technology for human activity recognition based on depth mapping, applied in the field of pattern recognition. It addresses the problems of complex LDP generation, limited accuracy, and heavy computation in prior methods, and achieves the effects of low computational cost and high accuracy, overcoming the inaccuracy of existing approaches.

Inactive Publication Date: 2016-05-18
YUNCHENG UNIVERSITY

AI Technical Summary

Problems solved by technology

However, algorithms based on 2D color information are vulnerable to challenges such as illumination changes and complex backgrounds, and algorithms based on 3D depth information are limited by depth map noise and the accuracy of bone point extraction.
The invention patent with publication number CN104091167A, titled "Feature extraction method for human activity recognition based on a somatosensory camera", discloses a feature extraction method using a somatosensory camera, but its LDP generation is relatively complicated and computationally expensive; it therefore needs improvement.




Embodiment Construction

[0018] The following embodiments will further describe the present invention in conjunction with the accompanying drawings.

[0019] As shown in Figure 1, a feature expression method for human activity recognition based on a somatosensory camera includes the following steps:

[0020] S1: From the color image sequence of the somatosensory camera, extract the spatiotemporal interest points p of human activity, with coordinates (x, y, t);
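The patent does not specify the detector behind S1 beyond its output (x, y, t). As an illustration only, the sketch below picks out points of strong temporal change in a grayscale video volume with NumPy; the response function, the threshold, and the name `detect_stips` are assumptions, not the patent's detector (which in practice would be, e.g., a Harris3D- or Gabor-filter-based STIP detector).

```python
import numpy as np

def detect_stips(frames, response_thresh=0.5):
    """Minimal spatiotemporal interest-point sketch.

    frames: (T, H, W) grayscale video volume with values in [0, 1].
    Returns rows of (x, y, t) where a simple motion-energy response
    (squared temporal difference) exceeds the threshold.
    """
    frames = np.asarray(frames, dtype=np.float64)
    # Temporal derivative as a cheap stand-in for the temporal
    # filters used by real STIP detectors.
    dt = np.abs(np.diff(frames, axis=0))   # (T-1, H, W)
    resp = dt ** 2
    # Spatial locations whose peak response over time is strong.
    ys, xs = np.nonzero(resp.max(axis=0) > response_thresh)
    # For each location, the frame index of its strongest response.
    ts = resp[:, ys, xs].argmax(axis=0)
    return np.stack([xs, ys, ts], axis=1)  # rows of (x, y, t)
```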

[0021] S2: Centered on the spatiotemporal interest point p of human activity, compute the optical flow histogram (HOF) feature and the gradient histogram (HOG) feature in the color image sequence;
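HOG and HOF share one building block: a histogram of vector orientations weighted by magnitude. The NumPy sketch below shows that block applied to image gradients (HOG-style); applied instead to the two components of a precomputed optical-flow field, the same helper yields a HOF-style descriptor. The patch size, bin count, and function names are illustrative assumptions, not the patent's exact parameters.

```python
import numpy as np

def orientation_histogram(dx, dy, n_bins=8):
    """Magnitude-weighted histogram of vector orientations over [0, 2*pi).

    dx, dy: image gradients (HOG) or optical-flow components (HOF).
    """
    mag = np.hypot(dx, dy)
    ang = np.arctan2(dy, dx) % (2 * np.pi)
    bins = (ang / (2 * np.pi) * n_bins).astype(int) % n_bins
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    s = hist.sum()
    return hist / s if s > 0 else hist  # L1-normalised

def hog_at_point(image, x, y, half=8, n_bins=8):
    """HOG-style descriptor of the (2*half)^2 patch centred on (x, y)."""
    patch = image[y - half:y + half, x - half:x + half]
    gy, gx = np.gradient(patch.astype(np.float64))  # (d/drow, d/dcol)
    return orientation_histogram(gx, gy, n_bins)
```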

[0022] S3: From the depth image sequence of the somatosensory camera, find the corresponding human-activity spatiotemporal interest point p', and obtain the depth value of p';
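If the depth stream is registered to the color stream (as somatosensory cameras such as the Kinect can provide), p' shares the pixel coordinates of p and S3 reduces to a lookup. The sketch below adds a small median window to suppress the depth-map noise the background section mentions; the window size and the name `depth_at_point` are assumptions for illustration.

```python
import numpy as np

def depth_at_point(depth_image, x, y, half=2):
    """Depth value of the interest point p' (sketch).

    Assumes the depth image is already registered to the color image.
    A median over a small window suppresses sensor noise; invalid
    (zero) depth readings are ignored.
    """
    patch = depth_image[max(0, y - half):y + half + 1,
                        max(0, x - half):x + half + 1]
    valid = patch[patch > 0]
    return float(np.median(valid)) if valid.size else 0.0
```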

[0023] S4: Based on a Gaussian mixture model, divide the points p into N layers according to the depth value of p', and construct a multi-channel expression based on depth mapping: cluster the spatiotemporal interest points of each layer with a clustering algorithm, express each layer with a bag-of-words model to obtain a histogram vector, and concatenate the feature expression of each layer to form the feature S = (H1, ..., Hi, ..., HN).
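S4 combines a Gaussian mixture over depth values with per-layer bag-of-words histograms. The NumPy sketch below illustrates both pieces under stated assumptions: a tiny 1-D EM loop (quantile initialisation, fixed iteration count) stands in for the mixture fit, and `layered_bow` assigns each descriptor to its nearest codeword. The codebook itself would come from the clustering step (e.g. k-means over training descriptors), which is assumed precomputed here; all function names and parameters are illustrative.

```python
import numpy as np

def fit_gmm_1d(depths, n_layers=3, n_iter=50):
    """Tiny EM fit of a 1-D Gaussian mixture over depth values (sketch)."""
    d = np.asarray(depths, dtype=np.float64)
    mu = np.quantile(d, (np.arange(n_layers) + 0.5) / n_layers)
    var = np.full(n_layers, d.var() + 1e-6)
    pi = np.full(n_layers, 1.0 / n_layers)
    for _ in range(n_iter):
        # E-step: responsibility of each layer for each point.
        ll = -0.5 * ((d[:, None] - mu) ** 2 / var + np.log(2 * np.pi * var))
        r = pi * np.exp(ll)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixture parameters.
        nk = r.sum(axis=0)
        mu = (r * d[:, None]).sum(axis=0) / nk
        var = (r * (d[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        pi = nk / nk.sum()
    return r.argmax(axis=1)  # hard layer assignment per point

def layered_bow(descriptors, layers, codebook):
    """Per-layer bag-of-words histograms, concatenated into S = (H1,..,HN)."""
    hists = []
    for k in range(layers.max() + 1):
        desc = descriptors[layers == k]
        h = np.zeros(len(codebook))
        if len(desc):
            # Nearest codeword for each descriptor in this layer.
            idx = np.linalg.norm(desc[:, None] - codebook, axis=2).argmin(1)
            h = np.bincount(idx, minlength=len(codebook)).astype(float)
            h /= h.sum()
        hists.append(h)
    return np.concatenate(hists)  # the feature S = (H1, ..., HN)
```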



Abstract

The invention relates to the technical field of pattern recognition, and specifically to a human motion recognition feature expression method based on depth mapping. The method is easy to implement, improves the accuracy of motion recognition, and comprises the steps of: extracting a human-motion spatiotemporal interest point p from the color image sequence of a motion-sensing (somatosensory) camera; computing an optical flow (HOF) feature and a gradient (HOG) feature in the color image sequence centered on p; finding the corresponding spatiotemporal interest point p' in the depth image sequence of the camera and acquiring its depth value; dividing the points p into N layers according to the depth value of p' on the basis of a Gaussian mixture model, constructing a multi-channel expression based on depth mapping, clustering the spatiotemporal interest points of each layer with a clustering algorithm, and expressing each layer with a bag-of-words model to obtain a histogram vector; and concatenating the feature expression of each layer to form a feature S = (H1, ..., Hi, ..., HN). The method is mainly used for human motion recognition.

Description

Technical Field

[0001] The present invention relates to the technical field of pattern recognition, and more specifically to a feature expression method for human activity recognition based on depth mapping.

Technical Background

[0002] Human activity understanding has been widely applied in sports training, rehabilitation engineering, ergonomics, game and animation production, security monitoring, and other fields. Its core technology is the activity recognition algorithm. So far, recognition algorithms based on 2D color information and on 3D depth information have both achieved some success. However, algorithms based on 2D color information are vulnerable to illumination changes and complex backgrounds, while algorithms based on 3D depth information are limited by depth-image noise and the accuracy of skeleton-point extraction. The invention patent with the publication number CN104091167A and the name of the feature extraction method...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/46
CPC: G06V40/23, G06V10/464
Inventor: 赵润林, 赵洋
Owner: YUNCHENG UNIVERSITY