
CNN-BiGRU neural network fusion-based sign language recognition method

A sign language recognition method based on CNN-BiGRU neural network fusion. It addresses problems such as difficult training, neglect of future sequence information, and insufficient feature extraction, thereby improving recognition accuracy, reducing network parameters, and lowering equipment cost.

Active Publication Date: 2021-06-01
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, the complexity of the background in captured images increases the difficulty of identifying sign language gestures and hand positions, and it is difficult to extract sufficient, effective depth information from a single image, which affects the accuracy of sign language recognition.
Moreover, both myoelectric sensors and data gloves must be worn, which is inconvenient to use; under epidemic conditions, the hygiene of sharing such devices in public places must also be considered, so they face considerable limitations in practical deployment.
[0004] At present, with the rise of artificial intelligence, deep learning has gradually penetrated many fields. Sign language recognition has likewise turned to deep learning and begun to achieve good results, but the technology remains relatively underdeveloped and immature.
Traditional deep learning methods for sign language recognition mainly use convolutional neural networks (CNN) and long short-term memory networks (LSTM). CNN-based sign language recognition systems are mostly limited to local features and cannot learn deeper representations from pooled features; LSTM networks consider only past feature sequences while ignoring future sequence information, and their complex network structure makes training difficult.
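The parameter savings of GRU over LSTM mentioned above can be illustrated concretely. A minimal sketch (not from the patent; layer sizes are arbitrary assumptions) comparing trainable parameter counts of same-sized LSTM and GRU layers in PyTorch:

```python
# Sketch: a GRU has 3 gates to the LSTM's 4, so a GRU layer of the same
# input/hidden size carries roughly 3/4 the parameters.
import torch.nn as nn

def num_params(module: nn.Module) -> int:
    """Total number of trainable parameters in a module."""
    return sum(p.numel() for p in module.parameters())

lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
gru = nn.GRU(input_size=32, hidden_size=64, batch_first=True)

print(num_params(lstm))  # 4 * (32*64 + 64*64 + 64 + 64) = 25088
print(num_params(gru))   # 3 * (32*64 + 64*64 + 64 + 64) = 18816
```

The same ratio holds per direction of a bidirectional GRU, which is one reason a BiGRU can use both past and future context while still keeping the parameter count moderate.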




Embodiment Construction

[0046] In this embodiment, a sign language recognition method based on CNN-BiGRU neural network fusion, as shown in figure 1, includes the following steps:

[0047] Step 1: Use the Leap Motion depth camera to acquire the position coordinates of the thumb F1, index finger F2, middle finger F3, ring finger F4, and little finger F5, the palm center position PC, the stable palm position PS, the palm velocity v, the palm pitch angle Pitch, yaw angle Yaw, and roll angle Roll, the hand sphere radius r, and the palm width PW, forming multiple items of sign language data. Each item of sign language data carries a corresponding category label; the sign language data set consists of these sign language data items and their category labels.
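The features listed in Step 1 can be packed into one fixed-length vector per frame. A hypothetical sketch (function names and the exact layout are assumptions, not the patent's disclosed format): the five fingertip coordinates contribute 15 values, the three palm vectors 9 values, the three orientation angles 3 values, and the sphere radius and palm width 2 values, for 29 values per time step.

```python
# Sketch: flatten one Leap Motion frame into a 29-dimensional feature vector.
import numpy as np

def frame_to_features(fingertips, palm_center, palm_stable, palm_velocity,
                      pitch, yaw, roll, sphere_radius, palm_width):
    """fingertips: five (x, y, z) tuples for thumb..little finger (F1..F5);
    palm_center / palm_stable / palm_velocity: (x, y, z); the rest scalars."""
    return np.concatenate([
        np.asarray(fingertips, dtype=np.float32).ravel(),            # 5*3 = 15
        np.asarray(palm_center, dtype=np.float32),                   # PC: 3
        np.asarray(palm_stable, dtype=np.float32),                   # PS: 3
        np.asarray(palm_velocity, dtype=np.float32),                 # v: 3
        np.asarray([pitch, yaw, roll], dtype=np.float32),            # angles: 3
        np.asarray([sphere_radius, palm_width], dtype=np.float32),   # r, PW: 2
    ])  # shape (29,): one time step of the sign language sequence

vec = frame_to_features([(0, 0, 0)] * 5, (0, 0, 0), (0, 0, 0), (0, 0, 0),
                        0.1, 0.2, 0.3, 50.0, 80.0)
print(vec.shape)  # (29,)
```

A sequence of such frames then forms the time series that the downstream network classifies.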

[0048] In a specific implementation, the sign language data set is acquired as follows: 10 acquisition subjects, 1...


Abstract

The invention discloses a CNN-BiGRU neural network fusion-based sign language recognition method. The method comprises the steps of 1, collecting sign language data and adding labels to make a sign language data set; 2, performing data preprocessing on the sign language data set; 3, dividing the enhanced feature data into a training data set, a verification data set and a test data set; 4, establishing a CNN-BiGRU deep neural network model that fuses a one-dimensional CNN with a BiGRU; and 5, collecting sign language data in real time, preprocessing it, and inputting the preprocessed data into the final model to obtain a sign language classification result. The spatio-temporal information of the sign language feature sequence is fully utilized and the recognition precision of the whole model is improved, so that sign language recognition and classification can be realized effectively and accurately.
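Step 4 above fuses a one-dimensional CNN with a BiGRU. A minimal PyTorch sketch of such a fusion classifier, assuming the 29-dimensional per-frame features from Step 1; all layer sizes are illustrative assumptions, as the abstract does not disclose the exact architecture:

```python
# Sketch: 1-D CNN extracts local patterns along the time axis, then a BiGRU
# reads the pooled feature sequence in both directions so that past and
# future context both contribute to the classification.
import torch
import torch.nn as nn

class CNNBiGRU(nn.Module):
    def __init__(self, n_features=29, n_classes=10, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(2),          # halves the time dimension
        )
        self.bigru = nn.GRU(32, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)  # 2x: forward + backward

    def forward(self, x):                      # x: (batch, time, n_features)
        h = self.conv(x.transpose(1, 2))       # (batch, 32, time // 2)
        out, _ = self.bigru(h.transpose(1, 2)) # (batch, time // 2, 2 * hidden)
        return self.fc(out[:, -1])             # logits: (batch, n_classes)

model = CNNBiGRU()
logits = model(torch.randn(4, 30, 29))  # 4 sequences of 30 frames each
print(logits.shape)  # torch.Size([4, 10])
```

At inference time (Step 5), each preprocessed real-time sequence is passed through the same forward pass and the argmax of the logits gives the predicted sign class.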

Description

technical field [0001] The invention relates to the field of sign language recognition, in particular to a sign language recognition method based on CNN-BiGRU neural network fusion. Background technique [0002] Against the background of increasingly diversified information transmission in today's intelligent human-computer interaction, sign language recognition captures the semantic reconstruction of dynamic spatial information in sign language interaction and addresses a pressing need: there are more than 20 million deaf-mute people in China. Given this large population and a low literacy rate, sign language is widely used as a means of daily communication, creating a large demand for translation. However, the slow development of the sign language translation industry, weak social training, and a lack of infrastructure have led to a shortage of skilled sign language interpreters. Secondly, online sign language translation has ...


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V40/28, G06N3/045, G06F18/25, G06F18/214
Inventors: 李桢旻, 祝东疆, 苏彦博, 贺子珊, 鲁杰, 彭靖宇, 杜高明, 王晓蕾
Owner HEFEI UNIV OF TECH