Dynamic gesture trajectory recognition method based on deep convolutional neural network

A dynamic gesture trajectory recognition method using deep convolution technology, applied in character and pattern recognition, instruments, and computing. It addresses problems such as inaccurate and coarse recognition results, adding recognition dimensions and yielding a more specific, fine-grained classification.

Publication Date: 2019-03-29 (Inactive)
BEIJING GAOKE ZHONGTIAN TECH DEV

AI Technical Summary

Problems solved by technology

However, this scheme is still based on the hidden Markov model and considers only the state-space relationship formed by the sequence of trajectory points. As a result, multiple distinct cases are grouped into the same type, and the recognition results are coarse and insufficiently accurate.



Examples


Embodiment Construction

[0055] Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the drawings, wherein the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0056] An embodiment of the present invention provides a dynamic gesture trajectory recognition method based on a deep convolutional neural network. After preprocessing the gesture trajectory point sequence, the method performs shape recognition and direction recognition on the sequence, then fuses the two recognition results to produce a more accurate and fine-grained dynamic gesture type discrimination result.
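This excerpt does not spell out how abnormal points are detected or how the normalized gesture trajectory map is rendered, so the following is only a minimal sketch under assumed choices: a hypothetical z-score test on step lengths for outlier removal and a 64x64 rasterized poly-line for the trajectory map. The function names (`remove_outliers`, `render_trajectory_map`) and all parameters are illustrative, not taken from the patent.

```python
import numpy as np

def remove_outliers(points, z_thresh=3.0):
    """Detect and drop abnormal trajectory points.

    Assumption: a point is 'abnormal' when the step distance from the
    previous point deviates from the mean step length by more than
    z_thresh standard deviations. The patent may use another criterion.
    """
    pts = np.array(points, dtype=float)              # shape (N, 2): (x, y)
    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    mu, sigma = steps.mean(), steps.std() + 1e-8
    keep = np.concatenate([[True], np.abs(steps - mu) / sigma <= z_thresh])
    return pts[keep]

def render_trajectory_map(points, size=64):
    """Rasterize a point sequence into a normalized size x size image.

    Assumption: translate the trajectory to the origin, scale it
    isotropically into the grid, and draw the poly-line with ones on a
    zero background. Resolution and rasterization are illustrative.
    """
    pts = np.array(points, dtype=float)
    pts -= pts.min(axis=0)                           # translate to origin
    pts = np.round(pts * (size - 1) / max(pts.max(), 1e-8)).astype(int)
    img = np.zeros((size, size), dtype=np.float32)
    for (x0, y0), (x1, y1) in zip(pts[:-1], pts[1:]):
        n = max(abs(x1 - x0), abs(y1 - y0), 1)       # naive line drawing
        for t in np.linspace(0.0, 1.0, n + 1):
            img[int(round(y0 + t * (y1 - y0))), int(round(x0 + t * (x1 - x0)))] = 1.0
    return img
```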

[0057] As shown in Figure 1 and Figure 2, the dynamic gesture trajectory recogni...



Abstract

The present invention proposes a dynamic gesture trajectory recognition method based on a deep convolutional neural network, comprising: collecting the original input gesture trajectory point sequence and preprocessing it to detect and eliminate abnormal points; marginalizing the trajectory point sequence to generate a normalized gesture trajectory map, and extracting deep features from the normalized gesture trajectory map with a trained deep convolutional neural network model; identifying the shape type of the corresponding gesture trajectory point sequence with a trained support vector machine; using a tree classifier to determine the unknown direction type according to the shape type of the gesture trajectory point sequence; and fusing the recognized shape type and direction type to generate the fused trajectory recognition result for the gesture trajectory point sequence. By combining shape recognition with direction recognition, the present invention provides direction-aware dynamic gesture recognition for gesture trajectory point sequences; the recognition is not affected by differences in time and space, and the classification is more fine-grained.
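As a rough illustration of the classification and fusion stages described above, the sketch below assumes a small PyTorch CNN as the deep-feature extractor, a scikit-learn SVM for the shape type, one small decision tree per shape for the direction type, and simple concatenation of the two labels as the fused result. The network architecture, the 128-dimensional feature size, the label values, and the `dir_feat` direction descriptor are assumptions made for illustration, not details disclosed in this excerpt.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

class TrajectoryCNN(nn.Module):
    """Hypothetical deep-feature extractor for 64x64 trajectory maps."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, x):                              # x: (B, 1, 64, 64)
        return self.fc(self.features(x).flatten(1))    # (B, feat_dim)

def extract_features(cnn, traj_maps):
    """Run the (already trained) CNN over a list of trajectory maps."""
    cnn.eval()
    with torch.no_grad():
        x = torch.from_numpy(np.stack(traj_maps)).unsqueeze(1).float()
        return cnn(x).numpy()

def train_classifiers(deep_feats, shape_labels, dir_feats, dir_labels):
    """Fit the shape SVM and one direction tree per shape type.

    dir_feats is a hypothetical array of directional descriptors
    (e.g. quantized net-displacement angles); the patent does not say
    what the tree classifier actually consumes.
    """
    shape_svm = SVC(kernel="rbf").fit(deep_feats, shape_labels)
    direction_trees = {}
    for shape in set(shape_labels):
        idx = [i for i, s in enumerate(shape_labels) if s == shape]
        direction_trees[shape] = DecisionTreeClassifier(max_depth=3).fit(
            dir_feats[idx], [dir_labels[i] for i in idx])
    return shape_svm, direction_trees

def recognize(cnn, shape_svm, direction_trees, traj_map, dir_feat):
    """Fuse the recognized shape and direction into the final result."""
    deep_feat = extract_features(cnn, [traj_map])
    shape = shape_svm.predict(deep_feat)[0]
    direction = direction_trees[shape].predict([dir_feat])[0]
    return f"{shape}:{direction}"                      # e.g. "circle:clockwise"
```

Fitting a separate direction tree for each recognized shape mirrors the idea of dividing the direction type according to the shape type; everything else about the training data and label sets is assumed here.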

Description

Technical Field

[0001] The invention relates to the technical fields of computer vision and pattern recognition, and in particular to a dynamic gesture trajectory recognition method based on a deep convolutional neural network.

Background Art

[0002] With the continuous emergence of new artificial intelligence technologies and new input/output devices, human-computer interaction technology is rapidly moving toward intelligent automation, evolving from the original computer-centered mechanical interaction to human-centered, multi-channel, multimedia intelligent interaction. These new technologies break free from the constraints of the old mechanical interaction and are increasingly popular, with examples such as skin displays, fingerprint or corneal recognition for security, and eye-movement interaction devices.

[0003] Gesture, as a common communication method for people, is a n...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (IPC8): G06K 9/00, G06K 9/62
CPC: G06V 40/28, G06F 18/2411, G06F 18/214
Inventors: 马俊杰, 赵晓轲, 牛建伟, 陈孟斌, 欧阳真超
Owner: BEIJING GAOKE ZHONGTIAN TECH DEV