Multi-mode fusion gesture keyboard input method, device and system and storage medium

A gesture keyboard and input method technology, applied in the fields of computer vision, gesture recognition, and human-computer interaction, which can solve problems of physical keyboards such as inconvenient use, large space occupation, and poor portability.

Pending Publication Date: 2020-09-29
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0003] Although physical keyboard input is the most commonly used input method, some complex environments impose restrictions on the keyboard. For example, when working outdoors an external keyboard can be carried, but without a fixed support surface it is troublesome to use; moreover, it occupies a large space and is very inconvenient to carry.



Examples


Embodiment 1

[0035] An embodiment of the present invention proposes a multi-modal fusion gesture keyboard input method; see Figures 1 and 2. The method includes the following steps:

[0036] Acquire the IMU sensor data, EMG sensor data, and bending sensor data of the user's keystrokes through a worn data glove;

[0037] The IMU sensor data, EMG sensor data, and bending sensor data are filtered and denoised by the preprocessing module and then sent through the Bluetooth module to the head-mounted device for processing. The preprocessed IMU sensor data, EMG sensor data, deformation data (i.e., bending sensor data), and hand image data are fed into the corresponding classifiers for feature extraction, and decision fusion is then performed to recognize the corresponding gesture keystroke input signal.
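The embodiment does not fix the classifier types or the fusion rule beyond "decision fusion", so the following Python sketch only illustrates the overall flow: each preprocessed modality is scored by its own classifier, and the per-key probabilities are combined by an assumed weighted vote. All function names, the moving-average filter, and the weighting scheme are illustrative, not part of the patent.

```python
# Minimal sketch of per-modality classification followed by decision fusion.
# The concrete filter, classifiers, and fusion rule are assumptions.
import numpy as np

MODALITIES = ["imu", "emg", "bend", "image"]

def preprocess(raw):
    """Stand-in for the filtering/denoising step (here a moving average)."""
    x = np.asarray(raw, dtype=np.float32)
    kernel = np.ones(5) / 5.0
    return np.convolve(x, kernel, mode="same")

def decision_fusion(per_modality_probs, weights=None):
    """Fuse per-modality probability vectors over the key classes.

    per_modality_probs: dict of modality name -> probability vector
                        (one entry per virtual key).
    weights:            optional reliability weight per modality.
    Returns the index of the recognized key.
    """
    if weights is None:
        weights = {name: 1.0 for name in per_modality_probs}
    fused = None
    for name, probs in per_modality_probs.items():
        contribution = weights[name] * np.asarray(probs, dtype=np.float32)
        fused = contribution if fused is None else fused + contribution
    return int(np.argmax(fused))
```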

Embodiment 2

[0039] A multimodal fusion gesture keyboard input device, comprising: a data glove,

[0040] As shown in Figure 3, the data glove includes: an IMU sensor module, an electromyography (EMG) sensor module, a bending sensor module, a first preprocessing module, a first Bluetooth module, and a wireless charging module;

[0041] The IMU sensor module uses six-axis inertial measurement unit motion sensors to record the postures of both hands and the motion information produced when keystroke actions are made. Each sensor includes a three-axis accelerometer that records acceleration information and a three-axis gyroscope (x, y, and z axes) that records angular velocity information. There are five such sensors in total, located on the backs of the five fingers, and they are connected to the micro-control unit on the back of the hand through a flexible circuit board;
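As a rough illustration of how one sampling instant from the five finger-mounted six-axis IMUs might be represented in software, the sketch below defines a simple data structure; the field names and units are assumptions, since the patent does not define a data format.

```python
# Illustrative in-memory layout for one glove IMU frame: five six-axis
# samples (three-axis accelerometer + three-axis gyroscope per finger).
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ImuSample:
    accel: Tuple[float, float, float]   # acceleration on x, y, z (assumed m/s^2)
    gyro: Tuple[float, float, float]    # angular velocity on x, y, z (assumed deg/s)

@dataclass
class GloveImuFrame:
    timestamp_ms: int
    # One sample per finger, thumb to little finger.
    fingers: Tuple[ImuSample, ImuSample, ImuSample, ImuSample, ImuSample]
```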

[0042] The EMG sensor module is composed of six muscle pulse detection modules surrounded by a metal contact, which is ...

Embodiment 3

[0052] A multi-modal fusion gesture keyboard input device, comprising: a head-mounted device,

[0053] As shown in Figure 5, the head-mounted device includes: a binocular camera module, a second preprocessing module, a feature extraction module, a decision fusion module, a second Bluetooth module, a display module, and a power supply module;

[0054] The binocular camera module is located on the underside of the head-mounted device and is used to obtain gesture images of the finger movements of the user wearing the data glove. A binocular camera running at 50 frames per second records multi-frame image information of the keystrokes of both hands;

[0055] The second preprocessing module is used to perform noise reduction processing on the collected gesture images;
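A minimal sketch of the capture-and-denoise path formed by the binocular camera module and the second preprocessing module, assuming an OpenCV-style camera interface; the device indices, the 50 fps setting, and the Gaussian-blur denoiser are illustrative choices, as the patent does not name a specific camera API or filter.

```python
# Sketch: grab a stereo frame pair and apply simple noise reduction.
import cv2

left_cam = cv2.VideoCapture(0)    # left sensor of the stereo pair (assumed index)
right_cam = cv2.VideoCapture(1)   # right sensor of the stereo pair (assumed index)
for cam in (left_cam, right_cam):
    cam.set(cv2.CAP_PROP_FPS, 50)

def grab_denoised_pair():
    ok_l, left = left_cam.read()
    ok_r, right = right_cam.read()
    if not (ok_l and ok_r):
        return None
    # Light Gaussian smoothing as a stand-in for the noise-reduction step.
    left = cv2.GaussianBlur(left, (5, 5), 0)
    right = cv2.GaussianBlur(right, (5, 5), 0)
    return left, right
```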

[0056] The feature extraction module is used to perform neural network classification training on the inertial data information, myoelectric data information, and bending deformation data information collected by the data glove,...
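The text states that neural-network classification is applied to the glove's inertial, myoelectric, and bending data but does not fix an architecture; the sketch below assumes a small fully connected network per modality (written in PyTorch) whose per-key probabilities would feed the decision fusion module. The layer sizes and the example dimensions are hypothetical.

```python
# Illustrative per-modality classifier; architecture and sizes are assumptions.
import torch
import torch.nn as nn

class ModalityClassifier(nn.Module):
    def __init__(self, input_dim: int, num_keys: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_keys),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-key class probabilities, ready for downstream decision fusion.
        return torch.softmax(self.net(x), dim=-1)

# Example: a classifier over a 30-dimensional IMU feature vector
# (5 fingers x 6 axes) predicting one of 26 letter keys.
imu_classifier = ModalityClassifier(input_dim=30, num_keys=26)
probs = imu_classifier(torch.randn(1, 30))
```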


Abstract

The invention discloses a multi-mode fusion gesture keyboard input method, device, system, and storage medium. The method comprises the steps of: acquiring IMU sensor data, myoelectricity (EMG) sensor data, and bending sensor data of the user's keystrokes; shooting the user's hand area to obtain hand image data; and inputting the four kinds of preprocessed data into corresponding classifiers for feature extraction, then performing decision fusion to recognize them as the corresponding gesture keystroke input signals. The device comprises a data glove or a head-mounted device; the system comprises a data glove and a head-mounted device. The storage medium comprises a processor and a memory. Interaction between gestures and virtual keyboard characters is achieved in the head-mounted device, character images are displayed in real time, and a more three-dimensional, information-rich, natural, and friendly interactive interface is presented for keyboard input.

Description

Technical Field

[0001] The invention relates to the fields of gesture recognition, computer vision, and human-computer interaction, and in particular to a multimodal fusion gesture keyboard input method, device, system, and storage medium.

Background Technique

[0002] Since the birth of the computer, it has been necessary to input various kinds of information, such as operation instructions and data. The most basic function of an input device is to convert information of various forms into a form suitable for computer processing. The keyboard is a commonly used input device. It consists of a set of switch matrices and includes numeric keys, letter keys, symbol keys, function keys, and control keys. Each key has a unique code in the computer. When a key is pressed, the keyboard interface sends the binary code of that key to the computer host, and the key's character is shown on the display. The keyboard interface circuit mostly uses a single-chip microprocessor, which controls...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01, G06K9/00, G06K9/62, G06N3/08, G06N3/04
CPC: G06F3/014, G06F3/017, G06F3/015, G06N3/08, G06V40/28, G06N3/045, G06F2218/04, G06F2218/12, G06F18/214
Inventors: 刘璇恒, 陶文源, 闫野, 邓宝松, 马权智, 赵涛, 印二威, 谢良
Owner: TIANJIN UNIV