
Gesture recognition method based on deep convolutional neural networks

A neural network and deep convolution technology, applied in the field of gesture recognition based on deep convolutional neural networks. It addresses the problems of large data volume, difficult processing, and high computational complexity, and achieves the effect of improving recognition accuracy and efficiency.

Inactive Publication Date: 2017-12-15
CHINA JILIANG UNIV

AI Technical Summary

Problems solved by technology

The input to gait recognition is a sequence of walking video images, which involves a large amount of data and therefore leads to high computational complexity and difficult processing.

Embodiment Construction

[0018] The present invention will be described in detail below in conjunction with the accompanying drawings.

[0019] As shown in Figure 1, the gesture recognition method based on deep convolutional neural networks of the present invention is divided into a training stage and a recognition stage.

[0020] In the training stage, a deep convolutional neural network is constructed, and the values of its weight and bias parameters are determined using the training-set data. This stage specifically includes the following sub-steps:

[0021] 1. Extract the hand image from the training data: for a frame in the gesture interaction, as shown in Figure 2(a), the present invention first extracts a rough hand-region image according to the skin color feature, as shown in Figure 2(b); secondly, the hand-region image is refined by filtering and a fast morphological dilation and erosion algorithm to obtain a cleaner hand image, as shown in ...
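As a hedged illustration of this sub-step, the sketch below shows one common way to implement skin-color segmentation followed by filtering and morphological dilation/erosion using OpenCV. The YCrCb color space, the threshold values, the median filter, the kernel size, and the largest-contour crop are assumptions for illustration only; the patent text shown here does not disclose these specifics.

    import cv2
    import numpy as np

    def extract_hand_region(frame_bgr):
        """Rough hand-region extraction by skin color, then cleanup by
        filtering and morphological dilation/erosion (a sketch of sub-step 1)."""
        # Assumed skin-color model: fixed thresholds in YCrCb space
        # (not specified in the patent text shown here).
        ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
        skin_mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

        # Filtering step: smooth the mask and remove speckle noise.
        skin_mask = cv2.medianBlur(skin_mask, 5)

        # Morphological dilation and erosion to fill holes and trim spurs;
        # the 5x5 elliptical kernel is an assumption.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        skin_mask = cv2.dilate(skin_mask, kernel, iterations=1)
        skin_mask = cv2.erode(skin_mask, kernel, iterations=1)

        # Keep only the largest connected region as the hand and crop it.
        contours, _ = cv2.findContours(skin_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        hand = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(hand)
        return frame_bgr[y:y + h, x:x + w]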


Abstract

The invention discloses a gesture recognition method based on deep convolutional neural networks. The method comprises the steps of: (1) performing edge detection and sample-set partitioning on the sample images of a training set; (2) constructing a deep convolutional neural network; (3) determining an activation function and a loss function; (4) training the deep convolutional neural network; (5) performing gesture recognition with the trained deep convolutional neural network, wherein this step comprises: a) extracting a hand-shape image from the gesture data to be recognized; b) performing edge detection and size normalization on the hand-shape image; c) inputting the normalized hand-shape image into the deep convolutional neural network and determining the class of the current gesture from the output value of the output layer. According to the invention, multiple downsampling layers are used to construct the deep convolutional neural network, and a hyperbolic tangent function is adopted as the activation function for training the network, which improves both the efficiency and the accuracy of gesture recognition.
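The network described in the abstract (repeated downsampling, hyperbolic tangent activations, and classification from the output-layer values) can be illustrated with a minimal PyTorch sketch. The layer counts, channel widths, 64x64 single-channel input size, and ten-class output below are illustrative assumptions; only the use of repeated downsampling, the tanh activation, and classification by the largest output value come from the abstract.

    import torch
    import torch.nn as nn

    class GestureCNN(nn.Module):
        """Small CNN with repeated downsampling and tanh activations.
        Layer counts, channel widths, and the 64x64 input size are
        illustrative assumptions, not values taken from the patent."""
        def __init__(self, num_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=5, padding=2),
                nn.Tanh(),            # hyperbolic tangent activation
                nn.MaxPool2d(2),      # first downsampling: 64 -> 32
                nn.Conv2d(16, 32, kernel_size=5, padding=2),
                nn.Tanh(),
                nn.MaxPool2d(2),      # second downsampling: 32 -> 16
                nn.Conv2d(32, 64, kernel_size=3, padding=1),
                nn.Tanh(),
                nn.MaxPool2d(2),      # third downsampling: 16 -> 8
            )
            self.classifier = nn.Linear(64 * 8 * 8, num_classes)

        def forward(self, x):
            x = self.features(x)
            x = torch.flatten(x, 1)
            return self.classifier(x)

    # Recognition step c): the predicted gesture class is the index of the
    # largest value in the output layer.
    model = GestureCNN(num_classes=10)
    normalized_hand = torch.randn(1, 1, 64, 64)   # placeholder normalized hand image
    gesture_class = model(normalized_hand).argmax(dim=1)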

Description

Technical field

[0001] The invention relates to the field of biological feature recognition, in particular to a gesture recognition method based on a deep convolutional neural network.

Background technique

[0002] Biometric identification is one of the key technologies in the fields of video surveillance and security authentication. Biological characteristics can be divided into physiological characteristics and behavioral characteristics. Physiological features mainly include the face, fingerprints, and irises, while behavioral features include gait, gestures, etc. Typical recognition methods based on physiological characteristics include fingerprint recognition, palm shape and contour recognition, face recognition, iris recognition, etc. Fingerprint recognition is currently one of the most widely used biometric-based identification methods. Fingerprint identification has the advantages of mature technology and low cost. Its disadvantage is that it is contact-type, invasiv...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06K9/62G06N3/04
CPCG06V40/113G06N3/045G06F18/214
Inventor 王修晖
Owner CHINA JILIANG UNIV