Gesture recognition method based on regional full convolutional network

A fully convolutional network and gesture recognition technology, applied in neural learning methods, character and pattern recognition, biological neural network models, etc. It addresses the problems of complex processing and low efficiency, and achieves the effect of a high rejection rate on negative samples while maintaining a high recognition rate.

Active Publication Date: 2019-10-15
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

Traditional gesture recognition methods require manually designed features, which must then be extracted before recognition can be performed; this leads to complex processing and low efficiency.




Embodiment Construction

[0036] The present invention provides a gesture recognition method based on a regional full convolutional network, comprising the following steps:

[0037] Step 1, build a fully convolutional network

[0038] In this solution, the residual network ResNet-34 architecture is used as the skeleton: the stride of the ResNet-34 network is changed from 32 pixels to 16 pixels, the average pooling layer and the fully connected layer of the ResNet-34 architecture are deleted, and a fully convolutional network is then built from the convolutional layers of the ResNet-34 architecture to extract features from the input image.

[0039] As shown in Figure 1, the fully convolutional network in this scheme consists of two parts. The first part is a convolutional layer with a 7*7 kernel that processes the input image, and the second part is four groups of deep residual blocks built from 3*3 convolution kernels; the residual block is...
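A minimal PyTorch sketch of Step 1 is given below, assuming torchvision's ResNet-34 implementation: the average-pooling and fully connected layers are dropped, and the overall stride is reduced from 32 to 16 by cancelling the stride-2 downsampling of the last residual stage. The attribute names (conv1, layer1-layer4) and the specific way the stride is reduced are illustrative assumptions, not details taken from the patent.

    # Sketch only: fully convolutional backbone from ResNet-34, stride 16 instead of 32.
    import torch
    import torch.nn as nn
    from torchvision.models import resnet34

    class FullyConvBackbone(nn.Module):
        def __init__(self):
            super().__init__()
            net = resnet34(weights=None)      # 7x7 conv stem + four groups of 3x3 residual blocks
            self.stem = nn.Sequential(net.conv1, net.bn1, net.relu, net.maxpool)  # stride 4
            self.layer1 = net.layer1          # stride 4
            self.layer2 = net.layer2          # stride 8
            self.layer3 = net.layer3          # stride 16
            self.layer4 = net.layer4          # originally stride 32
            # Cancel the stride-2 downsampling in the first block of layer4 so the
            # network's overall stride becomes 16; avgpool and fc are simply not used.
            self.layer4[0].conv1.stride = (1, 1)
            self.layer4[0].downsample[0].stride = (1, 1)

        def forward(self, x):
            x = self.stem(x)
            x = self.layer1(x)
            x = self.layer2(x)
            x = self.layer3(x)
            return self.layer4(x)             # feature map at 1/16 of the input resolution

    if __name__ == "__main__":
        feats = FullyConvBackbone()(torch.randn(1, 3, 224, 224))
        print(feats.shape)                    # torch.Size([1, 512, 14, 14])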



Abstract

The invention discloses a gesture recognition method based on a regional full convolutional network. For an input gesture image, feature extraction is performed by a fully convolutional network to obtain a set of feature maps and generate candidate boxes; a position-sensitive sub-network generates position-sensitive score maps, and each gesture category is scored through a pooling layer, so that target gestures are localized and classified. The method is mainly characterized in that the whole regional fully convolutional network is a shared fully convolutional structure. The whole structure is learned end to end, which avoids complex computation while achieving a high-precision recognition rate, and it is combined with the OHEM technique. The network model therefore has a higher rejection rate on negative samples, which facilitates practical application, and the method is of significance in the field of human-computer interaction.
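The scoring step described above can be illustrated with the following sketch, assuming a position-sensitive pooling operator such as torchvision.ops.ps_roi_pool: a 1x1 convolution produces k*k groups of score maps per gesture class, each candidate box is pooled into k*k position-sensitive bins, and the bins are averaged into one score per class. The values of k, the number of classes, and the example boxes are assumptions for illustration, not parameters stated in the patent.

    # Sketch only: position-sensitive score maps and per-box gesture scoring.
    import torch
    import torch.nn as nn
    from torchvision.ops import ps_roi_pool

    k, num_classes = 3, 5                     # k*k spatial bins, 5 gesture classes + background
    feats = torch.randn(1, 512, 14, 14)       # shared backbone feature map (stride 16)

    # Position-sensitive score maps: one group of k*k maps per class (incl. background).
    score_conv = nn.Conv2d(512, k * k * (num_classes + 1), kernel_size=1)
    score_maps = score_conv(feats)            # [1, k*k*(C+1), 14, 14]

    # Candidate boxes in image coordinates: (batch_index, x1, y1, x2, y2).
    rois = torch.tensor([[0, 32.0, 48.0, 128.0, 160.0],
                         [0, 64.0, 16.0, 200.0, 120.0]])

    # Position-sensitive RoI pooling: bin (i, j) of a box is pooled only from the
    # (i, j)-th group of score maps; spatial_scale maps image coords to the feature map.
    pooled = ps_roi_pool(score_maps, rois, output_size=(k, k), spatial_scale=1.0 / 16)
    scores = pooled.mean(dim=(2, 3))          # vote over the k*k bins -> [num_rois, C+1]
    print(scores.softmax(dim=1))              # per-box gesture class probabilities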

Description

Technical field

[0001] The invention relates to the technical fields of computer vision, machine learning and pattern recognition, and in particular to an end-to-end gesture recognition method using a regional full convolutional network.

Background technique

[0002] At present, as VR (Virtual Reality) and AR (Augmented Reality) become more and more popular, more and more people pay attention to human-computer interaction technology. Gesture, as the most direct and convenient way of human-computer interaction, has attracted the attention of many researchers, and gesture recognition has gradually become an important research direction in the field of computer vision. How a computer accurately recognizes the meaning of gestures is an important part of a gesture-based human-computer interaction system. Since the human hand is a complex deformable object, gestures are diverse, ambiguous and vary over time, and they usually occur in complex scenes. Gestur...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/32; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/084; G06V40/28; G06V10/25; G06N3/045; G06F18/2414
Inventor: 杨锦
Owner: GUANGDONG UNIV OF TECH