
Deaf-mute sign language interaction method and deaf-mute sign language interaction device

An interaction method and device for deaf-mute people, applied in the field of educational appliances, instruments, teaching aids, etc. It addresses the problems that existing approaches cannot fully meet the communication needs of deaf-mute people, that existing devices are heavy, and that real-time, face-to-face, barrier-free communication is difficult to achieve.

Pending Publication Date: 2018-11-23
深圳市漫牛医疗有限公司

AI Technical Summary

Problems solved by technology

[0006] However, existing deaf-mute sign language interaction systems, such as robots, are large and heavy, which makes real-time, face-to-face, barrier-free communication difficult to achieve. The existing social methods available to deaf-mute people therefore cannot fully meet their communication needs.



Examples


Embodiment 1

[0041] Please refer to Figure 1 and Figure 4. The sign language interaction device for deaf-mute people in this embodiment includes a first wearable device 10 and a second wearable device 20. The first wearable device 10 is provided with a first local connection module 32, and the second wearable device 20 is provided with a second local connection module 55. The first local connection module 32 and the second local connection module 55 may be Wi-Fi modules, or other local wireless network modules such as Bluetooth modules or Zigbee modules.

[0042] The first wearable device 10 also includes an image acquisition module 14, a first wireless connection module 34, a first identification module 35, a first communication module 36, a feature learning module 37 and a first system update unit 38.

[0043] The image acquisition module 14 can be a camera module whose pixel count meets the needs of image analysis. As shown in Figure 1, the...
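Under the assumption of a simple Python data model, the module composition described in paragraphs [0041] to [0043] can be sketched as follows. The class, enum and field names (and the example camera resolution) are illustrative assumptions rather than identifiers from the patent; the reference numerals 10, 20, 14, 32, 34 to 38 and 55 are kept only in comments for cross-reference.

from dataclasses import dataclass
from enum import Enum


class LocalLinkType(Enum):
    # Local wireless link options named in paragraph [0041].
    WIFI = "wifi"
    BLUETOOTH = "bluetooth"
    ZIGBEE = "zigbee"


@dataclass
class FirstWearableDevice:
    # First wearable device 10 and its modules from paragraphs [0041]-[0043].
    local_connection: LocalLinkType        # first local connection module 32
    camera_resolution: tuple               # image acquisition module 14 (camera)
    has_wireless_connection: bool = True   # first wireless connection module 34
    has_identification: bool = True        # first identification module 35
    has_communication: bool = True         # first communication module 36
    has_feature_learning: bool = True      # feature learning module 37
    has_system_update: bool = True         # first system update unit 38


@dataclass
class SecondWearableDevice:
    # Second wearable device 20 with its own local connection module 55.
    local_connection: LocalLinkType


# Example: the two devices are paired over the same kind of local wireless link.
first = FirstWearableDevice(LocalLinkType.BLUETOOTH, camera_resolution=(1920, 1080))
second = SecondWearableDevice(LocalLinkType.BLUETOOTH)
assert first.local_connection is second.local_connection

Whether the paired modules are Wi-Fi, Bluetooth or Zigbee is left as a configuration choice, mirroring the interchangeable options listed in paragraph [0041].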

Embodiment 2

[0056] Please refer to Figure 2. The deaf-mute sign language interaction method of this embodiment mainly includes the following steps (a short code sketch follows the step list):

[0057] The processing steps on the side of the first wearable device 10 are:

[0058] Step 111: Obtain sign language image information from the first wearable device 10, and extract sign language features from the sign language image information;

[0059] Step 112: According to the sign language text mapping database and the sign language voice mapping database, determine the text information and voice information corresponding to the sign language feature;

[0060] Step 113: Display the text information and/or voice information corresponding to the sign language image information on the first wearable device 10;

[0061] The processing steps on the side of the second wearable device 20 are:

[0062] Step 121: Obtain ambient voice from the second wearable device 20;

[0063] Step 122: Determine the environmental text information and...
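The two pipelines above (steps 111 to 113 on the first wearable device 10 and steps 121 to 122 on the second wearable device 20) can be sketched in Python as follows, assuming dictionary stand-ins for the mapping libraries and a placeholder in place of real sign language recognition and speech recognition; all function and variable names are illustrative, not taken from the patent.

# Hypothetical mapping libraries: sign feature -> text / voice, voice -> text,
# and text -> sign language information. A real system would back these with
# trained models or databases.
SIGN_TEXT_MAP = {"hello_sign": "hello"}
SIGN_VOICE_MAP = {"hello_sign": "hello.wav"}
VOICE_TEXT_MAP = {"ambient_hello.wav": "hello"}
TEXT_SIGN_MAP = {"hello": "hello_sign_clip"}


def extract_sign_features(sign_image: bytes) -> str:
    # Step 111: extract sign language features from the captured image data
    # (placeholder for a real recognition model).
    return "hello_sign"


def first_device_pipeline(sign_image: bytes) -> tuple:
    # Steps 111-113: image -> features -> text and voice -> display on device 10.
    feature = extract_sign_features(sign_image)            # step 111
    text = SIGN_TEXT_MAP.get(feature, "<unknown>")         # step 112
    voice = SIGN_VOICE_MAP.get(feature, "<unknown>")       # step 112
    print(f"[device 10] text: {text}, voice: {voice}")     # step 113
    return text, voice


def second_device_pipeline(ambient_voice: str) -> tuple:
    # Steps 121-122: ambient voice -> environment text and sign language
    # information -> display on device 20.
    text = VOICE_TEXT_MAP.get(ambient_voice, "<unknown>")   # step 122
    sign_info = TEXT_SIGN_MAP.get(text, "<unknown>")        # step 122
    print(f"[device 20] text: {text}, sign: {sign_info}")   # display
    return text, sign_info


first_device_pipeline(b"raw camera frame")
second_device_pipeline("ambient_hello.wav")

The dictionaries and the fixed return value only mark where the recognition models and mapping libraries described in the patent would plug in.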



Abstract

The invention discloses a deaf-mute sign language interaction method and a deaf-mute sign language interaction device. The method comprises the following steps: sign language image information is acquired from a first wearable device, and sign language features are extracted from the sign language image information; according to a sign language word mapping library and a sign language speech mapping library, the word information and the speech information corresponding to the sign language features are determined; the word information and/or the speech information corresponding to the sign language image information are displayed on the first wearable device; environment speech is acquired from a second wearable device, and environment word information and environment sign language information corresponding to the environment speech are determined according to a speech word mapping library and a sign language speech mapping library; and the environment word information and/or the environment sign language information are displayed on the second wearable device. The self-learning method and device enable a deaf-mute person to communicate with other people in a barrier-free and smooth manner.

Description

Technical Field

[0001] The invention relates to the technical field of wearable devices for special groups, and in particular to a sign language interaction method and a sign language interaction device for deaf-mute people.

Background Technique

[0002] With the development and progress of science and technology in society, deaf-mute people need to communicate with normal people more efficiently to meet the needs of life and work, and normal people also need to better understand what deaf-mute people express during communication.

[0003] Existing deaf-mute sign language interaction devices do not recognize sign language with high accuracy.

[0004] With the rapid development of science and technology, the core algorithms and computing capabilities of chips have seen sufficient improvements and breakthroughs, and the manufacturing capabilities of hardware have also developed greatly. Image recognition, speech recognition, softwa...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G09B21/00
CPC: G09B21/00
Inventor: 魏尚利
Owner: 深圳市漫牛医疗有限公司