
Three-dimensional depth data based dynamic gesture recognition method

A dynamic gesture recognition technology based on depth data, applied in the field of human-computer interaction. It addresses problems such as limited recognition ability and poor dynamic gesture recognition results, and achieves the effect of improved recognition accuracy.

Publication status: Inactive
Publication date: 2018-10-16
Applicant: BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0004] The existing technical solution uses the skeletal joint point information of the hand to extract feature vectors for recognizing different dynamic gestures. It stores the three-dimensional positions of the skeletal nodes of the hand in the form of a six-dimensional vector, obtains a six-dimensional velocity vector from that vector and the frame rate, and performs feature extraction on the stored list of six-dimensional velocity vectors to obtain the time series of feature values corresponding to the gesture to be recognized. The features extracted by this solution mainly describe the direction information of the dynamic gesture, so for dynamic gestures in which the direction of hand movement changes little while the appearance and shape of the hand change greatly, the recognition results are poor.
At the same time, the feature time series of this scheme describes only the global temporal information of a dynamic gesture. Because the local temporal information of the gesture sequence is not described, the scheme's ability to distinguish similar dynamic gestures is limited.




Detailed Description of the Embodiments

[0050] The present invention will be described in detail below with reference to the accompanying drawings and examples.

[0051] The present invention provides a dynamic gesture recognition method based on three-dimensional depth data. Its process is shown in Figure 1 and includes the following steps:

[0052] S1. Use the Leap Motion somatosensory controller to acquire the dynamic gesture sequence to be recognized, obtain the position coordinates of each skeletal joint point in every effective frame of the sequence, and extract the hand shape change feature vector and the motion direction feature vector frame by frame.
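The excerpt does not define the two feature vectors precisely, so the following is only a minimal sketch of what such frame-by-frame extraction could look like. It assumes the joint coordinates arrive as a NumPy array of shape (F, J, 3), uses the palm joint as a reference so the hand shape change vector ignores whole-hand translation, and derives the motion direction vector from the normalized palm displacement; the array layout, the palm_index parameter, and both feature definitions are assumptions rather than the patent's formulas.

```python
# Minimal sketch of step S1's per-frame feature extraction (assumed
# definitions; the patent's exact formulas are not given in this excerpt).
import numpy as np

def extract_frame_features(joints, palm_index=0):
    """joints: (F, J, 3) positions of J skeletal joints over F frames.

    Returns a hand shape change vector per frame transition (joint motion
    expressed relative to the palm, so whole-hand translation cancels out)
    and a motion direction vector (normalized palm displacement).
    """
    palm = joints[:, palm_index, :]                 # (F, 3) palm trajectory
    relative = joints - palm[:, None, :]            # joints in palm coordinates
    shape_change = np.diff(relative, axis=0).reshape(len(joints) - 1, -1)

    disp = np.diff(palm, axis=0)                    # palm displacement per frame
    norms = np.linalg.norm(disp, axis=1, keepdims=True)
    motion_dir = disp / np.maximum(norms, 1e-8)     # unit direction, avoid /0
    return shape_change, motion_dir
```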

[0053] The hand skeletal structure provided by Leap Motion is shown in Figure 2. Before the experimental task formally begins, the procedure and operation method of the task are explained to the subjects to ensure that the recorded dynamic gesture database is valid.

[0054] Obtain t...



Abstract

The invention discloses a dynamic gesture recognition method based on three-dimensional depth data, which can improve the effect of dynamic gesture recognition. The method includes the following steps: obtaining the dynamic gesture sequence to be recognized with a Leap Motion somatosensory controller; obtaining the position coordinates of each skeletal joint in every effective frame of the sequence and extracting a hand shape change feature vector and a motion direction feature vector frame by frame; computing the Fisher vector encoding of the hand shape change feature vector with a Gaussian mixture model; extracting feature vectors that contain the local temporal information of the gesture sequence using a temporal pyramid model; and classifying the feature vectors containing local temporal information with a support vector machine (SVM), matching them against known feature vectors with local temporal information in an existing dynamic gesture database, and taking the dynamic gesture corresponding to the best match as the recognition result.
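To make the later steps concrete, here is a hedged sketch of the encoding and classification pipeline the abstract describes: a diagonal-covariance Gaussian mixture model fitted to pooled frame features, a simplified Fisher vector (gradients with respect to the component means only, with the usual power and L2 normalizations), a temporal pyramid that splits each sequence into progressively finer segments so local temporal information is preserved, and a linear SVM over the concatenated codes. The pyramid levels, the means-only Fisher vector, and the scikit-learn classes are all assumptions; the patent does not disclose implementation details at this level.

```python
# Illustrative pipeline for the abstract's GMM/Fisher/pyramid/SVM steps
# (assumed details, not the patent's exact formulation).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

def fisher_vector(X, gmm):
    """Simplified Fisher vector: gradient w.r.t. the GMM means only."""
    gamma = gmm.predict_proba(X)                    # (N, K) posteriors
    sigma = np.sqrt(gmm.covariances_)               # (K, D) diagonal std devs
    parts = []
    for k in range(gmm.n_components):
        diff = (X - gmm.means_[k]) / sigma[k]       # whitened residuals
        g = (gamma[:, k:k + 1] * diff).sum(axis=0)
        parts.append(g / (len(X) * np.sqrt(gmm.weights_[k])))
    fv = np.concatenate(parts)
    fv = np.sign(fv) * np.sqrt(np.abs(fv))          # power normalization
    return fv / max(np.linalg.norm(fv), 1e-8)       # L2 normalization

def temporal_pyramid_encode(frames, gmm, levels=(1, 2, 4)):
    """Fisher-encode the whole sequence plus finer temporal segments and
    concatenate, so local temporal information is retained."""
    return np.concatenate([fisher_vector(seg, gmm)
                           for n in levels
                           for seg in np.array_split(frames, n)])

def train(train_sequences, labels, n_components=8):
    """Fit the GMM on pooled frame features, encode every gesture
    sequence, and train a linear SVM on the encodings."""
    gmm = GaussianMixture(n_components, covariance_type="diag",
                          random_state=0)
    gmm.fit(np.vstack(train_sequences))
    X = np.array([temporal_pyramid_encode(s, gmm) for s in train_sequences])
    return gmm, SVC(kernel="linear").fit(X, labels)

def recognize(sequence, gmm, clf):
    """Classify one gesture sequence of per-frame feature vectors."""
    return clf.predict(temporal_pyramid_encode(sequence, gmm)[None, :])[0]
```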

Description

Technical field

[0001] The invention relates to the technical field of human-computer interaction, and in particular to a dynamic gesture recognition method based on three-dimensional depth data.

Background technique

[0002] In the field of human-computer interaction, dynamic gesture recognition has become a popular interaction method in recent years, and the technology is developing rapidly.

[0003] In the prior art, a depth camera is used to collect the video skeleton stream and obtain the three-dimensional positions of the required skeletal nodes. The three-dimensional positions of the skeletal nodes of the hand are stored in the form of a six-dimensional vector, and a six-dimensional velocity vector is obtained from this vector and the frame rate. The six-dimensional velocity vector is dot-multiplied with preset special direction unit vectors, and the corresponding special direction unit vector with the largest i...
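To make the prior-art mechanism in paragraph [0003] concrete, here is a small illustrative sketch of the dot product step: the velocity vector is compared against a preset set of special direction unit vectors and mapped to the one with the largest dot product, which is why the resulting features capture movement direction well but say little about changes in hand shape. The six axis-aligned directions and the use of a 3D velocity (rather than the patent's six-dimensional vectors) are simplifying assumptions.

```python
# Illustrative sketch of the prior art's direction quantization; the actual
# scheme uses six-dimensional vectors and its own preset direction set.
import numpy as np

# Hypothetical preset "special direction" unit vectors: the 6 axis directions.
DIRECTIONS = np.array([
    [1, 0, 0], [-1, 0, 0],
    [0, 1, 0], [0, -1, 0],
    [0, 0, 1], [0, 0, -1],
], dtype=float)

def quantize_direction(velocity):
    """Return the index of the preset unit vector whose dot product with
    the velocity is largest, i.e. the closest special direction."""
    return int(np.argmax(DIRECTIONS @ velocity))

# A mostly-upward velocity maps to index 2, the [0, 1, 0] direction.
print(quantize_direction(np.array([0.1, 0.9, 0.05])))
```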


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06F3/01
CPC: G06F3/017; G06V40/28; G06F18/2411
Inventors: 刘越, 赵丹, 王涌天, 李广传
Owner: BEIJING INSTITUTE OF TECHNOLOGY