Three-dimensional gesture action recognition method based on depth images

A technology for recognizing gesture actions from depth images, applicable to image analysis, image data processing, and character and pattern recognition. It addresses problems such as action-template libraries being hard to extend, gesture recognition applications being overly complex, and recognition accuracy depending on the number of training samples. Claimed effects include an improved recognition rate, disambiguation, and higher accuracy in the recognition process.

Inactive Publication Date: 2014-03-26
INST OF AUTOMATION CHINESE ACAD OF SCI
Cites: 3 | Cited by: 35

AI Technical Summary

Problems solved by technology

Such methods often require collecting large amounts of sample data with manual labeling, and their recognition accuracy depends heavily on the number of samples.
This complicates gesture recognition applications and hinders the expansion of action templates and other extensions.

Method used



Embodiment Construction

[0023] The invention proposes a three-dimensional gesture action recognition method based on depth images. The invention is described below with reference to the drawings.

[0024] Figure 1 is a flow chart of a method for recognizing three-dimensional gesture actions based on depth images according to an embodiment of the present invention.

[0025] As shown in Figure 1, the method for recognizing three-dimensional gesture actions based on depth images includes the following steps.

[0026] Step 1: Acquire a sequence of depth images containing depth images of hand gestures.

[0027] That is, a sequence of depth images containing gesture depth images is obtained through a camera. The invention is not limited to obtaining the depth images from a camera; they may also be depth images saved in memory.
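Step 1 can be sketched as a minimal sequence loader. This is an illustrative assumption, not the patent's implementation: the function name, the 4x4 frame resolution, and the uint16 millimeter depth units are all hypothetical.

```python
import numpy as np

def load_depth_sequence(frames):
    """Step 1 (sketch): normalize a depth image sequence, whether it came
    from a camera driver or from depth images saved in memory.  Each frame
    is a 2-D array of per-pixel depths (assumed uint16 millimeters)."""
    seq = [np.asarray(f, dtype=np.uint16) for f in frames]
    if not seq:
        raise ValueError("empty depth sequence")
    shape = seq[0].shape
    if any(f.shape != shape for f in seq):
        raise ValueError("inconsistent frame sizes in sequence")
    return seq

# Synthetic 3-frame sequence at 4x4 resolution, roughly 1.5 m from the sensor.
frames = [np.full((4, 4), 1500 + 10 * i) for i in range(3)]
seq = load_depth_sequence(frames)
print(len(seq), seq[0].dtype)  # 3 uint16
```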

[0028] Step 2: Background removal.

[0029] In this step, background removal is performed on each depth image of the depth imag...
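According to the abstract, the patent segments the human body via fast template tracking and oblique-plane matching; as a much simpler stand-in, the sketch below removes background with a plain depth-band threshold. The function name and the 500–1200 mm working band are assumptions for illustration only.

```python
import numpy as np

def remove_background(depth, near_mm=500, far_mm=1200):
    """Step 2 (simplified stand-in): zero out pixels outside a working
    depth band.  The patent itself segments the body by template tracking
    and oblique-plane matching; this threshold only illustrates the idea."""
    mask = (depth >= near_mm) & (depth <= far_mm)
    return np.where(mask, depth, 0).astype(depth.dtype), mask

depth = np.array([[400, 800], [1000, 2000]], dtype=np.uint16)
fg, mask = remove_background(depth)
print(fg.tolist())  # [[0, 800], [1000, 0]]
```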



Abstract

The invention provides a three-dimensional gesture action recognition method based on depth images. The method comprises: acquiring depth images containing gesture actions; segmenting the human body region corresponding to the gesture action from each image, via tracking and positioning based on fast template tracking and oblique-plane matching, to obtain a background-removed depth image sequence; extracting the useful frames of the gesture action from the background-removed depth images; computing, from the extracted useful frames, three-view motion history images of the gesture action in the front-view, top-view, and side-view projection directions; extracting the histogram-of-oriented-gradients features corresponding to the three motion history images; computing the correlation between the resulting combined feature of the gesture action and the gesture action templates stored in a predefined gesture action library; and taking the template with the largest correlation as the recognition result for the current gesture action. The method thus achieves three-dimensional gesture action recognition and can also be applied to recognizing the movement of simple objects.
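The pipeline in the abstract (motion history images per projection view → histogram-of-oriented-gradients features → correlation against stored templates) can be sketched roughly as follows. For brevity only the front view is computed; the MHI decay parameters, the coarse whole-image HOG, the synthetic moving-block sequences, and cosine similarity as the correlation measure are all illustrative assumptions, not the patent's actual implementation.

```python
import numpy as np

def motion_history(frames, tau=255, delta=32):
    """Motion history image: recently moved pixels are bright, older motion fades."""
    mhi = np.zeros(frames[0].shape, dtype=np.float32)
    prev = frames[0]
    for cur in frames[1:]:
        moved = np.abs(cur.astype(np.int32) - prev.astype(np.int32)) > 0
        mhi = np.where(moved, float(tau), np.maximum(mhi - delta, 0.0))
        prev = cur
    return mhi

def hog_feature(img, bins=9):
    """Coarse whole-image histogram of oriented gradients, L2-normalized."""
    gy, gx = np.gradient(img.astype(np.float32))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)  # unsigned orientation in [0, pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    n = np.linalg.norm(hist)
    return hist / n if n > 0 else hist

def recognize(frames, templates):
    """Pick the template whose feature correlates best (cosine) with the query."""
    feat = hog_feature(motion_history(frames))
    return max(templates, key=lambda name: float(np.dot(feat, templates[name])))

def block_seq(direction, n=4, size=8):
    """Synthetic frames: a 3x3 block sliding right or down across the image."""
    frames = []
    for i in range(n):
        f = np.zeros((size, size), dtype=np.uint8)
        if direction == "right":
            f[2:5, i:i + 3] = 200
        else:
            f[i:i + 3, 2:5] = 200
        frames.append(f)
    return frames

# Build a tiny two-entry gesture template library, then recognize a query.
templates = {d: hog_feature(motion_history(block_seq(d))) for d in ("right", "down")}
result = recognize(block_seq("right"), templates)
print(result)  # right
```

In the patent the same feature extraction would be repeated for the top-view and side-view projections and the three HOG vectors concatenated before matching; the single-view version above only shows the shape of the computation.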

Description

Technical Field

[0001] The invention relates to the fields of computer vision and digital image processing, and in particular to a method for recognizing three-dimensional gesture actions based on depth images.

Background

[0002] Gesture recognition is currently one of the most active research directions in computer vision and pattern recognition, and one of the most widely applied research points in the field. Recognition of three-dimensional gesture actions in particular, because of its broad applicability in production and daily life, has drawn strong attention from major research institutions worldwide, which reflects its research value and significance. Three-dimensional gesture recognition covers dynamic gestures and human body movements, recognized in three-dimensional space. A dynamic gesture expresses not only the state of a body part at a given moment but also includes the ti...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00, G06T 7/00
Inventors: 蒋永实, 秦树鑫
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI