
Gesture identification method based on deep residual error network

A gesture recognition technology based on deep residual networks, applied in biometric recognition, character and pattern recognition, and the input/output processes of data processing; it addresses problems such as gradient dispersion.

Inactive Publication Date: 2017-07-28
HANGZHOU DIANZI UNIV

Problems solved by technology

[0004] The purpose of the present invention is to provide a gesture recognition method based on a deep residual network, which can effectively solve the problems of gradient dispersion and declining network accuracy, thereby reducing the training difficulty of deep networks. It greatly improves the accuracy of gesture recognition and provides a new solution for gesture recognition, as well as for image detection and object recognition. At the same time, the method ensures recognition accuracy by accepting multi-dimensional input data, making it more universal with respect to input data and data formats, and effectively removing the limitation that input gesture data can only be one-dimensional.
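The gradient-dispersion claim above can be illustrated with a toy calculation (not taken from the patent): in a plain chain of layers, the backpropagated gradient is a product of per-layer derivative factors and can vanish with depth, whereas a residual block y = x + F(x) adds an identity term to each factor.

```python
# Toy illustration (not the patent's network) of why identity shortcuts
# in a residual network mitigate gradient dispersion: in a plain chain
# the gradient is a product of per-layer derivatives, while a residual
# block y = x + F(x) contributes a factor of (1 + F'(x)) instead.

def chain_gradient(depth, layer_grad, residual):
    """Product of per-layer gradient factors through `depth` layers."""
    g = 1.0
    for _ in range(depth):
        g *= layer_grad + (1.0 if residual else 0.0)
    return g

plain = chain_gradient(depth=30, layer_grad=0.5, residual=False)  # vanishes
res = chain_gradient(depth=30, layer_grad=0.5, residual=True)     # survives
```

With 30 layers each contributing a factor of 0.5, the plain chain's gradient collapses toward zero, while the identity path guarantees every residual factor stays above 1, so the signal reaches the early layers.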



Embodiment Construction

[0034] In order to make the above-mentioned features and advantages of the present invention more comprehensible, specific embodiments are described in detail below in conjunction with the accompanying drawings.

[0035] The present invention provides a gesture recognition method based on a deep residual network. As shown in Figure 1, the method includes a training phase and a recognition phase; the training phase includes the following steps:

[0036] In the first step, the original gesture data are acquired; the present invention collects an initial database of 5,000 gesture images for recognition. After the various gesture pictures are collected, each picture is marked with N points to obtain 2N-dimensional label data, and the picture is saved in JPG format, where N≥1 and the value of N is mainly determined by the marking personnel. The specific schematic diagram of the gesture picture marking of the prese...
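As a concrete illustration of the labeling in this step (the helper name below is hypothetical, not from the patent), N marked (x, y) points flatten into a 2N-dimensional label vector:

```python
# Hypothetical sketch of step 1's labeling: N marked (x, y) points on a
# gesture image are flattened into a 2N-dimensional label vector.
def points_to_label(points):
    """points: list of N (x, y) tuples -> flat 2N-dimensional list."""
    label = []
    for x, y in points:
        label.extend([x, y])
    return label

# e.g. N = 3 marked points give a 6-dimensional label
marks = [(120, 85), (142, 60), (165, 88)]
label = points_to_label(marks)  # [120, 85, 142, 60, 165, 88]
```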



Abstract

The invention discloses a gesture identification method based on a deep residual error network, comprising the steps of: 1) acquiring the original gesture data and marking N points to obtain the 2N-dimensional original tag data; 2) preprocessing the original data and the 2N-dimensional tag data; 3) taking the preprocessed original data and 2N-dimensional tag data, converted into HDF5 format, as the training data, inputting them to a deep residual error network to train the network parameters, and obtaining a gesture identification model; 4) performing the same N-point marking as in step 1 on the gesture to be identified, to obtain the 2N-dimensional tag data to be identified; and 5) preprocessing the gesture data and tag data to be identified, converting them into HDF5 format, and inputting them to the gesture identification model to obtain the identification result. The method effectively solves the gradient dispersion problem and the network accuracy problem.
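The preprocessing in steps 2 and 5 is not specified in this excerpt; the following is a minimal sketch under assumed conventions (pixel scaling to [0, 1] and normalization of label coordinates by image width and height) — the function and normalization scheme are illustrative assumptions, not the patent's specification.

```python
# Minimal sketch of a preprocessing step consistent with the abstract's
# pipeline; the normalization scheme here is an assumption, not the
# patent's specification.
def preprocess(image, label):
    """Scale pixels to [0, 1]; normalize (x, y) label coords by (w, h)."""
    h, w = len(image), len(image[0])
    norm_img = [[p / 255.0 for p in row] for row in image]
    # even indices hold x (divide by width), odd indices hold y (height)
    norm_lbl = [v / (w if i % 2 == 0 else h) for i, v in enumerate(label)]
    return norm_img, norm_lbl

img = [[0, 255], [128, 64]]   # tiny 2x2 grayscale example
lbl = [1, 1]                  # one marked point (N = 1)
p_img, p_lbl = preprocess(img, lbl)
```

In practice, the normalized arrays and labels would then be written out as HDF5 datasets before being fed to the network, as steps 3 and 5 describe.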

Description

technical field

[0001] The invention relates to the fields of image processing and object retrieval, and in particular to a gesture recognition method based on a deep residual network.

Background technique

[0002] At present, vision-based gesture interaction has been extensively researched at home and abroad. However, due to the ambiguity and spatio-temporal variability of gestures themselves, the high dimensionality and many degrees of freedom of the hand, and the ill-posed nature of the vision problem itself, vision-based human-computer interaction platforms for gesture recognition have always been a challenging research topic. The core of this work is the recognition of hand shape and the tracking of gestures. Traditional methods fall into two main directions: (1) methods based on hidden Markov chains: GRobel extracts features from recorded video of an actor wearing colored gloves and uses the hidden Markov model (Hidden Markov Model, HM...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06F3/01; G06K9/62; G06K9/46; G06N99/00
CPC: G06F3/017; G06N20/00; G06V40/10; G06V40/117; G06V10/40; G06F18/214
Inventors: 谢益峰, 颜成钢, 王雁刚, 邵碧尧, 项露萱
Owner: HANGZHOU DIANZI UNIV