
Motion capture-based virtual reality sign language learning, testing and evaluating method

A technology combining motion capture and virtual reality, applied to the input/output processes of data processing, user/computer interaction input/output, image data processing, etc. It addresses the absence of motion capture technology in existing sign language teaching and achieves accurate sign language movements, convenient production, and cost savings.

Pending Publication Date: 2019-11-05
QUANZHOU NORMAL UNIV

AI Technical Summary

Problems solved by technology

[0002] Existing sign language teaching is mainly aimed at helping the deaf communicate; it does not use motion capture technology and offers no related functions for sign language learning, testing, or sign language movement evaluation.

Method used


Image

  • Motion capture-based virtual reality sign language learning, testing and evaluating method (patent drawings)

Embodiment Construction

[0032] As shown in Figures 1-4, the motion capture-based virtual reality sign language learning, testing and evaluation method uses a motion comparison device to convert sign language conversation scenes into data. The motion comparison device comprises an inertial motion capture device, a data collector and a data comparison device. The inertial motion capture device includes a plurality of data sensors, each of which is bound and affixed to a body part of a conversation participant. While the sign language conversation takes place, the data sensors record the spatial movement information of the participants' body parts to form conversation scene data. The conversation scene data is then used to build a virtual reality scene through a 3D scene development platform, and sign language students learn sign language through the virtual reality scene.
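
To make the data flow in [0032] concrete, here is a minimal Python sketch of how the inertial data sensors' spatial movement information might be collected into conversation scene data. The class and field names (SensorSample, ConversationSceneData, the participant/sensor labels) are illustrative assumptions; the patent does not specify any data format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Illustrative only: the patent does not define a data format, so these
# class names, fields and the participant/sensor naming are assumptions.

@dataclass
class SensorSample:
    participant: str                       # which conversation participant
    sensor_id: str                         # body part the sensor is bound to
    timestamp: float                       # seconds since the scene started
    position: Tuple[float, float, float]   # spatial position (x, y, z)
    orientation: Tuple[float, float, float, float]  # quaternion (w, x, y, z)

@dataclass
class ConversationSceneData:
    participants: List[str]
    samples: List[SensorSample] = field(default_factory=list)

    def record(self, sample: SensorSample) -> None:
        """Data collector: append one spatial movement sample."""
        self.samples.append(sample)

# Example: one sample from a sensor bound to a participant's right hand.
scene = ConversationSceneData(participants=["speaker_A", "speaker_B"])
scene.record(SensorSample("speaker_A", "right_hand", 0.02,
                          (0.10, 1.20, 0.30), (1.0, 0.0, 0.0, 0.0)))
```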

[0033] The conversation scene can be a scene involving more than ...



Abstract

The invention provides a motion capture-based virtual reality sign language learning, testing and evaluating method. In the method, a motion comparison device converts a sign language conversation scene into data. The motion comparison device comprises an inertial motion capture device, a data collector and a data comparison device. The inertial motion capture device comprises a plurality of data sensors, each of which is bound and attached to a body part of a conversation participant. While the sign language conversation takes place, the data sensors record the participants' spatial body movement to form conversation scene data, and a virtual reality scene is built from the conversation scene data through a three-dimensional scene development platform. A sign language learner learns sign language through the virtual reality scene. The invention can use motion capture technology to record the sign language conversation scene and form a virtual scene, so that the virtual scene can be used for the related functions of sign language learning, testing and sign language movement evaluation.
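
The abstract does not describe how the data comparison device scores a learner's signing, so the sketch below only illustrates one plausible rule: comparing the learner's captured sensor positions against the recorded reference frames using a mean Euclidean distance and an assumed tolerance. The function names and the 5 cm threshold are assumptions, not the patent's method.

```python
import math
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

def frame_distance(reference: Dict[str, Vec3], learner: Dict[str, Vec3]) -> float:
    """Mean Euclidean distance between corresponding sensor positions in one
    reference frame and one learner frame (assumed comparison metric)."""
    shared = reference.keys() & learner.keys()
    if not shared:
        return float("inf")
    total = sum(math.dist(reference[s], learner[s]) for s in shared)
    return total / len(shared)

def evaluate_sign(reference_frames: List[Dict[str, Vec3]],
                  learner_frames: List[Dict[str, Vec3]],
                  threshold: float = 0.05) -> float:
    """Score a learner's sign as the fraction of frames whose mean sensor
    deviation stays within the (assumed) threshold, in metres."""
    n = min(len(reference_frames), len(learner_frames))
    if n == 0:
        return 0.0
    ok = sum(1 for i in range(n)
             if frame_distance(reference_frames[i], learner_frames[i]) <= threshold)
    return ok / n
```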

Description

Technical field

[0001] The invention relates to the technical field of teaching facilities, in particular to a virtual reality sign language learning, testing and evaluation method based on motion capture.

Background technique

[0002] Existing sign language teaching is mainly aimed at helping the deaf communicate; it does not use motion capture technology and offers no related functions for sign language learning, testing or sign language movement evaluation.

[0003] If sign language teaching can be carried out in conjunction with dialogue scenes, the quality of teaching will be effectively improved.

Contents of the invention

[0004] The present invention proposes a virtual reality sign language learning, testing and evaluation method based on motion capture, which can use motion capture technology to record sign language conversation scenes and form a virtual scene, thereby using the virtual scene to perform the related functions of sign language learning, testing and sign lang...
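
The description names a three-dimensional scene development platform but not a specific one, so the following sketch uses a hypothetical Scene3D stand-in to show how recorded conversation scene data (as in the earlier sketch) could be replayed as avatars in a virtual reality scene for a sign language learner.

```python
# "Scene3D" is a hypothetical stand-in for whichever three-dimensional scene
# development platform is used; the patent does not name one.

class Scene3D:
    """Minimal placeholder for a 3D scene development platform."""
    def __init__(self) -> None:
        self.avatars = {}

    def add_avatar(self, participant: str) -> None:
        self.avatars[participant] = {}

    def set_joint(self, participant: str, sensor_id: str,
                  position, orientation) -> None:
        # A real engine would move the corresponding avatar joint here.
        self.avatars[participant][sensor_id] = (position, orientation)

def build_virtual_scene(scene_data) -> Scene3D:
    """Replay recorded conversation scene data (see the earlier sketch)
    as avatars in a virtual reality scene for a sign language learner."""
    scene = Scene3D()
    for participant in scene_data.participants:
        scene.add_avatar(participant)
    for sample in scene_data.samples:
        if sample.participant in scene.avatars:
            scene.set_joint(sample.participant, sample.sensor_id,
                            sample.position, sample.orientation)
    return scene
```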

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06T19/00; G06Q50/20
CPC: G06T19/006; G06F3/012; G06Q50/205; Y02D10/00
Inventor: 王鸿伟, 徐剑杰, 焦莹, 赖雪平, 杨锦龙, 黄梅, 郑童冰, 陈梓荣, 王荣海, 吴伊萍, 曾蔚
Owner: QUANZHOU NORMAL UNIV