
Regional convolutional neural network-based efficient gesture detection and recognition method

A convolutional neural network and gesture detection technology, applied in the field of efficient gesture detection and recognition based on regional convolutional neural networks, which addresses the problems of insufficient precision, limited input image size, and low accuracy, and thereby improves recognition speed and accuracy.

Status: Inactive; Publication Date: 2018-09-07
DONGHUA UNIV

AI Technical Summary

Problems solved by technology

[0005] In 2014, Girshick R. proposed R-CNN, a region-based convolutional neural network model that generates candidate regions with Selective Search or Edge Boxes and then uses a convolutional neural network to extract features from each candidate region. Although it suffered from insufficient precision and a restriction on the input image size, it laid the foundation for the RPN+CNN approach to object detection.
Then, in 2015, Girshick R. proposed the Fast R-CNN model, which introduced the Region of Interest (RoI) pooling layer and remedied several shortcomings of R-CNN; however, because the candidate regions are still produced by a hand-designed method and that computation runs only on the CPU, the model's limited accuracy and the long computation time for candidate regions remain drawbacks of the network.
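
To make the Fast R-CNN improvement concrete, the following is a minimal sketch of RoI pooling, the layer that crops a fixed-size feature from a shared feature map for each candidate region so that the backbone CNN runs only once per image. It uses torchvision's roi_pool; the tensor shapes, proposal coordinates, and 1/16 feature stride are illustrative assumptions, not values taken from the patent.

import torch
from torchvision.ops import roi_pool

# Shared feature map from the backbone CNN: (batch, channels, height, width).
# The sizes here are made up for the example.
feature_map = torch.randn(1, 256, 38, 50)

# Candidate regions in image coordinates, one row per proposal:
# (batch_index, x1, y1, x2, y2).
proposals = torch.tensor([
    [0.0,  48.0,  80.0, 176.0, 240.0],
    [0.0, 320.0,  64.0, 480.0, 288.0],
])

# spatial_scale maps image coordinates onto the downsampled feature map
# (assumed stride of 16); every proposal comes out as a fixed 7x7 feature.
pooled = roi_pool(feature_map, proposals, output_size=(7, 7), spatial_scale=1.0 / 16)
print(pooled.shape)  # torch.Size([2, 256, 7, 7])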




Embodiment Construction

[0026] The present invention is further illustrated below in conjunction with specific embodiments. It should be understood that these examples are intended only to illustrate the present invention and not to limit its scope. In addition, it should be understood that, after reading the teachings of the present invention, those skilled in the art may make various changes or modifications to the invention, and such equivalent forms likewise fall within the scope defined by the appended claims of the present application.

[0027] An embodiment of the present invention relates to a real-time static sign language recognition method based on an improved single multi-target detector. As shown in figure 1, it includes the following steps: preprocessing sample images of Chinese gesture letters; constructing and augmenting the gesture image data set and dividing it into a training set and a test set; and using the Faster R-CNN network based on regional convolutional neural networks to...
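
As a rough illustration of this step, the sketch below fine-tunes a stock torchvision Faster R-CNN detector for gesture-letter classes. The ResNet-50 FPN backbone, the class count (30 gesture letters plus background), and the single training step shown are assumptions made for the example; the patent's own feature extraction network and training procedure are not reproduced here.

import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Assumed class count: 30 manual-alphabet gesture letters plus background.
num_classes = 30 + 1

# Start from a COCO-pretrained Faster R-CNN and swap in a new detection head
# sized for the gesture classes.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

# One illustrative training step: images are CHW tensors in [0, 1], targets
# hold the ground-truth boxes and gesture labels for each image.
images = [torch.rand(3, 480, 640)]
targets = [{
    "boxes": torch.tensor([[100.0, 120.0, 260.0, 360.0]]),
    "labels": torch.tensor([5]),
}]
model.train()
loss_dict = model(images, targets)   # RPN + detection head losses
total_loss = sum(loss_dict.values())
total_loss.backward()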



Abstract

The invention relates to an efficient gesture detection and recognition method based on a regional convolutional neural network. The method includes the following steps: sample images of Chinese character gesture letters are preprocessed; a gesture image data set is constructed and augmented; and a Faster R-CNN based on a regional convolutional neural network is used to perform gesture detection and recognition. A feature extraction network extracts gesture features, and the extracted feature map is split into two branches: the first branch enters the Fast R-CNN directly for deep convolution, while the second branch enters the RPN, which generates region proposals. The region proposals are input into the Fast R-CNN, where they enter the RoI pooling layer together with the feature map from the first branch and then pass through a fully connected layer, yielding bounding-box regression and gesture category scores, so that gesture detection and recognition is realized. After the network model is trained, detection and recognition of Chinese character gesture letters can be achieved. The method of the invention improves recognition speed and accuracy.
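
For orientation, a self-contained inference sketch of the flow described above (shared feature map, RPN region proposals, RoI pooling, fully connected head, box regression and class scores) might look like the following. The COCO-pretrained torchvision model and the 0.7 score threshold are illustrative assumptions rather than the patented network.

import torch
import torchvision

# Pretrained model used only to illustrate the output format; a gesture
# detector would use a head fine-tuned on the gesture data set instead.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = torch.rand(3, 480, 640)      # placeholder for a preprocessed gesture image
with torch.no_grad():
    det = model([image])[0]          # dict with 'boxes', 'labels', 'scores'

keep = det["scores"] > 0.7           # assumed confidence threshold
for box, label, score in zip(det["boxes"][keep], det["labels"][keep], det["scores"][keep]):
    print(int(label), float(score), [round(v, 1) for v in box.tolist()])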

Description

Technical field

[0001] The invention relates to the technical field of gesture detection and recognition, and in particular to an efficient gesture detection and recognition method based on a regional convolutional neural network.

Background technique

[0002] In recent years, gesture recognition has been applied in a wide range of fields, such as sign-language translation for deaf-mute people, gesture-based robot control and photography, and smart home control of doors, windows, and appliances. According to the way gestures are captured, gesture recognition methods fall into two categories: those based on wearable technology and those based on machine vision. Although gesture recognition based on wearable devices offers accurate gesture positioning, relatively simple data, and fast response, it cannot overcome the disadvantages of high cost, inconvenient operation, high learning cost, and limited operating distance...


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/08
CPC: G06N3/084; G06V40/113; G06F18/24
Inventors: 张勋, 陈亮, 朱雪婷
Owner: DONGHUA UNIV