Dynamic gesture recognition method

A dynamic gesture recognition technology in the fields of computer vision and machine learning, designed to improve recognition accuracy.

Status: Inactive; Publication Date: 2015-12-30
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

Traditional feature representation methods usually rely on artificially predefined features, and such manual feature selection has significant limitations.

Method used




Embodiment Construction

[0010] The dynamic gesture recognition method comprises the following steps:

[0011] (1) Preprocess the dynamic gesture data: expand the data with an interval sampling method, compute the edges of the three RGB channels of the expanded data with the Canny edge detection operator, and generate a color edge image (see the preprocessing sketch after step (3));

[0012] (2) Extract the gesture feature sequence with a convolutional neural network model;

[0013] (3) Using the gesture feature sequence extracted in step (2) together with the hand direction feature, train hidden Markov models (HMMs) and select the HMM that best matches the gesture sample.
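
The following is a minimal sketch of step (1) in Python, assuming OpenCV. The sampling interval, the use of different start offsets to expand the data, and the Canny thresholds are illustrative assumptions, not values taken from the patent.

import cv2

def interval_sample(frames, interval=2, offset=0):
    # Keep every `interval`-th frame starting at `offset` (assumed sampling scheme).
    return frames[offset::interval]

def color_edge_image(frame_bgr, low=50, high=150):
    # Run the Canny operator on each of the three color channels separately
    # and stack the results, producing the color edge image of step (1).
    channels = cv2.split(frame_bgr)                      # B, G, R planes
    edges = [cv2.Canny(ch, low, high) for ch in channels]
    return cv2.merge(edges)                              # 3-channel edge map

def preprocess_sequence(frames, interval=2):
    # Expand one gesture sequence into `interval` sampled sequences (one per
    # start offset) and convert every sampled frame to its color edge image.
    expanded = [interval_sample(frames, interval, off) for off in range(interval)]
    return [[color_edge_image(f) for f in seq] for seq in expanded]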

[0014] The present invention extracts the gesture feature sequence with a convolutional neural network model and then trains hidden Markov models (HMMs) to obtain the HMM that best matches the gesture sample, thereby improving the accuracy of dynamic gesture recognition.
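
A minimal sketch of the feature-extraction stage of step (2), assuming PyTorch. The network layout (two convolution/pooling stages and a 64-dimensional output) is a placeholder, since the excerpt does not specify the architecture of the convolutional model.

import torch
import torch.nn as nn

class GestureFeatureCNN(nn.Module):
    # Placeholder architecture: the patent excerpt does not specify the layers
    # of its convolutional model, so this two-stage network is an assumption.
    def __init__(self, feature_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),          # global pooling -> 32 values per frame
        )
        self.fc = nn.Linear(32, feature_dim)  # per-frame feature vector

    def forward(self, frames):
        # frames: (T, 3, H, W) tensor of color edge images for one gesture
        x = self.features(frames).flatten(1)  # (T, 32)
        return self.fc(x)                     # (T, feature_dim) feature sequence

# Example: 20 frames of 64x64 color edge images -> a (20, 64) feature sequence.
# feats = GestureFeatureCNN()(torch.randn(20, 3, 64, 64))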

[0015] Preferably, the interval sampling met...



Abstract

The invention discloses a dynamic gesture recognition method which can improve the accuracy of dynamic gesture recognition. The method comprises the following steps: (1) pre-processing the dynamic gesture data: expanding the data based on an interval sampling method and computing the edges of the three RGB channels of the expanded data with a Canny edge detection operator, so as to generate a color edge image; (2) extracting a gesture feature sequence based on a convolutional neural network model; (3) carrying out HMM (hidden Markov model) training on the gesture feature sequence extracted in step (2) together with hand direction features, so as to obtain the HMM that best matches a gesture sample.
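
As a rough illustration of the HMM stage described above, the sketch below assumes the hmmlearn library: one Gaussian HMM is trained per gesture class on the CNN feature sequences (optionally concatenated with the hand direction feature), and a new sample is assigned to the model with the highest log-likelihood. The number of hidden states and the training iterations are assumptions.

import numpy as np
from hmmlearn import hmm

def train_hmms(sequences_by_class, n_states=5):
    # sequences_by_class: {gesture label: [feature sequences of shape (T_i, D)]}
    models = {}
    for label, seqs in sequences_by_class.items():
        X = np.concatenate(seqs)                 # stack all frames of this class
        lengths = [len(s) for s in seqs]         # sequence boundaries for hmmlearn
        model = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[label] = model
    return models

def classify(models, seq):
    # Return the label whose HMM gives the sample the highest log-likelihood,
    # i.e. the HMM "closest" to the gesture sample.
    return max(models, key=lambda label: models[label].score(seq))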

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision and machine learning, and in particular relates to a dynamic gesture recognition method.

Background

[0002] Gesture recognition is an important research direction in the field of computer vision. Gesture recognition research involves computer vision, pattern recognition, image processing, machine learning and other disciplines, and is a challenging subject. Because of its naturalness, vision-based gesture recognition has important applications in deaf-mute education and teaching, robot control, virtual reality, human-computer interaction, and smart home. Two key research topics in vision-based gesture recognition are feature representation and classification methods. Commonly used feature representation methods include SIFT (Scale-Invariant Feature Transform) and SURF (Speed-Up Robust Feature, accelerated scale-invar...

Claims


Application Information

IPC(8): G06K9/00, G06N3/08
CPC: G06N3/088, G06V40/107, G06V40/28
Inventors: 孙艳丰, 邢迎新, 李敬华, 孔德慧, 王立春, 王文通
Owner: BEIJING UNIV OF TECH