
Sign language translation method based on BP neural network

A BP neural network and sign language translation technology, applied in the field of sign language translation. It addresses the communication barrier that limits the social circle of deaf-mute people and the shortcomings of existing aids, which are either prohibitively expensive or technically immature. The claimed effects are removal of communication barriers, low hardware cost, and high translation accuracy.

Active Publication Date: 2020-07-17
SHANGHAI INST OF MEASUREMENT & TESTING TECH +1

AI Technical Summary

Problems solved by technology

[0002] In the process of communication between deaf-mute people and hearing people in society, because hearing people cannot understand sign language, a gap exists between the two groups. This limits the social circle of deaf-mute people and greatly restricts their life and development.
There are two types of assistive devices for deaf-mute people on the market today. One is the electronic larynx, which dates from the 1950s. It is worn at the throat to sense vocal-cord vibration and amplify it to aid vocalization. However, these devices are expensive, and disabled people without social-welfare support generally cannot afford them.
The other is sign language translation equipment based on computer vision, which has appeared in recent years. Such equipment is not expensive, but body-movement recognition technology is still in its infancy, and the image processing imposes strict requirements on the acquisition environment.


Figures 1–3: Sign language translation method based on BP neural network


Embodiment Construction

[0027] As shown in attached Figure 1, the present invention is a sign language translation method based on a BP neural network, comprising the following steps:

[0028] Step 1: A Raspberry Pi 3B single-board computer collects gesture voltage signals through the flexible sensors and acceleration sensors on a wearable data glove, filters and amplifies the signals, and transmits them to memory through its integrated Bluetooth module.
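The patent gives no code for the acquisition step. The sketch below illustrates only the filtering idea in Python, assuming a simple moving-average low-pass filter over simulated voltages; the actual sensor reads, amplification, and Bluetooth transfer are hardware-specific and omitted.

```python
from collections import deque

def moving_average_filter(samples, window=5):
    """Smooth noisy sensor voltages with a sliding-window mean."""
    buf = deque(maxlen=window)  # keeps only the last `window` samples
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out

# Simulated flex-sensor readings (volts); the 3.00 spike is noise
raw = [1.00, 1.10, 3.00, 1.00, 0.90, 1.05]
filtered = moving_average_filter(raw, window=3)
```

The filter attenuates the transient spike while preserving the slow voltage changes that encode finger bending.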

[0029] Step 2: Use the signal screening program to compile the sign language words and common sign language sentences corresponding to each group of signals into a sign language library and sign language sentence library, and divide the repeatedly collected gesture voltage signals and their corresponding sign language words into a training set and a test set at a ratio of 7:3.
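The 7:3 split described above can be sketched as follows. This is an illustrative Python version with hypothetical gesture-voltage vectors and labels; the patent's own screening program is written in C# and not reproduced here.

```python
import random

def split_dataset(signals, labels, train_ratio=0.7, seed=42):
    """Shuffle paired (signal, label) samples and split them into train/test sets."""
    pairs = list(zip(signals, labels))
    random.Random(seed).shuffle(pairs)  # seeded for reproducibility
    cut = int(len(pairs) * train_ratio)
    return pairs[:cut], pairs[cut:]

# Ten hypothetical gesture-voltage vectors with their sign-language words
signals = [[0.1 * i, 0.2 * i] for i in range(10)]
labels = [f"word_{i}" for i in range(10)]
train_set, test_set = split_dataset(signals, labels)
```

Shuffling before the cut matters: gestures recorded consecutively tend to be correlated, so a raw positional split would bias both sets.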

[0030] Step 2.1: Write sign language library recording software based on C# language, which can record each received gesture vo...



Abstract

The invention discloses a sign language translation method based on a BP neural network, comprising the following steps: 1, acquiring gesture voltage signals with a Raspberry Pi 3B through a wearable data glove; 2, compiling the sign language words and common sign language sentences corresponding to each group of gesture voltage signals into a sign language sentence library using a signal screening program; 3, writing a neural network classification program comprising a BP neural network structural framework model, a data transmission module and a storage module, wherein the BP neural network model adopts a three-layer network comprising an input layer, a hidden layer and an output layer; 4, converting each received gesture voltage signal into a sign language word through the BP neural network framework model; and 5, grouping the sign language words obtained in step 4 over a period of time into sign language word groups, matching them against the sign language sentence library, associating and filling them into sentences, and outputting the result. The invention realizes automatic real-time translation and recognition of sign language by combining a neural network with sensing technology.
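The abstract does not include an implementation of the three-layer BP network. Below is a minimal pure-Python sketch of such a network trained by backpropagation on toy two-channel "voltage" data; the layer sizes, learning rate, epoch count, and dataset are illustrative assumptions, not the patent's parameters.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class BPNetwork:
    """Three-layer BP (backpropagation) network: input -> hidden -> output."""

    def __init__(self, n_in, n_hid, n_out, seed=0):
        rnd = random.Random(seed)
        # Each row holds one neuron's weights; the extra +1 column is its bias.
        self.w1 = [[rnd.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
        self.w2 = [[rnd.uniform(-1, 1) for _ in range(n_hid + 1)] for _ in range(n_out)]

    def forward(self, x):
        xb = list(x) + [1.0]  # append constant bias input
        h = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in self.w1]
        hb = h + [1.0]
        o = [sigmoid(sum(w * v for w, v in zip(row, hb))) for row in self.w2]
        return h, o

    def train(self, data, lr=0.5, epochs=3000):
        for _ in range(epochs):
            for x, target in data:
                xb = list(x) + [1.0]
                h, o = self.forward(x)
                hb = h + [1.0]
                # Output deltas: derivative of squared error through the sigmoid
                do = [(oi - ti) * oi * (1 - oi) for oi, ti in zip(o, target)]
                # Hidden deltas: backpropagate output deltas through w2
                dh = [h[j] * (1 - h[j]) * sum(do[k] * self.w2[k][j] for k in range(len(do)))
                      for j in range(len(h))]
                for k in range(len(self.w2)):
                    for j in range(len(hb)):
                        self.w2[k][j] -= lr * do[k] * hb[j]
                for j in range(len(self.w1)):
                    for i in range(len(xb)):
                        self.w1[j][i] -= lr * dh[j] * xb[i]

# Toy data: two-channel "voltages", class 1 when both channels are high
data = [([0.0, 0.1], [0.0]), ([0.1, 0.0], [0.0]),
        ([0.9, 1.0], [1.0]), ([1.0, 0.9], [1.0])]
net = BPNetwork(n_in=2, n_hid=4, n_out=1)
net.train(data)
preds = [round(net.forward(x)[1][0]) for x, _ in data]
```

In the patent's setting the input vector would hold the filtered flex- and acceleration-sensor voltages, and the output layer would have one unit per sign-language word class.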

Description

technical field
[0001] The invention relates to a sign language translation method, and in particular discloses a BP neural network-based sign language translation method, which realizes automatic sign language translation and recognition by combining a neural network with sensing technology.
Background technique
[0002] In the process of communication between deaf-mute people and hearing people in today's society, because hearing people cannot understand sign language, a gap exists between the two groups, which limits the social circle of deaf-mute people and greatly restricts their life and development. There are two types of assistive devices for deaf-mute people on the market today. One is the electronic larynx, which dates from the 1950s. It is worn at the throat to sense vocal-cord vibration and amplify it to aid vocalization. However, these devices are expensive, and generally ...


Application Information

IPC(8): G06N3/063, G06N3/08, G06K9/00, G06F3/01
CPC: G06N3/084, G06N3/063, G06F3/014, G06V40/28
Inventor 谢张宁朱惠臣孙晓光吴俊杰李智玮傅云霞雷李华孔明管钰晴刘娜王道档
Owner SHANGHAI INST OF MEASUREMENT & TESTING TECH