
Gesture control man-machine interactive system based on WiFi

A human-computer interaction and gesture control technology applied in the field of gesture-controlled human-computer interaction systems. It addresses the problems of inconvenient use, unsatisfactory performance, and unusability by deaf-mute users, and achieves low cost and a simple method.

Publication Date: 2016-07-27 (Inactive)
Applicant: SUZHOU INST FOR ADVANCED STUDY USTC

AI Technical Summary

Problems solved by technology

Sensor-based human-computer interaction technology requires users to wear sensor devices at all times, which is inconvenient in actual use.
Another similar product is a motion control system based on speech recognition, but such a system performs poorly in noisy environments, and deaf-mute users are essentially unable to use it.



Examples


Embodiment

[0047] A WiFi-based gesture control human-computer interaction system, whose system architecture diagram is shown in Figure 3 and whose processing flow chart is shown in Figure 4. After the signal acquisition module obtains the physical-layer channel state information (CSI) in the wireless network environment, it hands the data to the signal processing module, which performs signal processing operations such as denoising, filtering and smoothing on the original signal. The processed CSI data then enter the gesture action extraction module, which uses an appropriate segmentation algorithm to extract, according to the characteristics of the CSI waveform, the action segments of the target user contained in the data. The action segments extracted by the gesture action extraction module are passed to the action instruction mapping module, which uses a classification algorithm to classify and identify the target user's actions and map them to the corresponding computer operation instructions.
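The paragraph above only names the processing stages; the patent does not fix particular algorithms ("an appropriate segmentation algorithm", "a classification algorithm"). The Python sketch below shows one plausible way to realise the four-stage flow under assumed choices that are not from the patent: CSI amplitude arriving as a NumPy array of shape (time, subcarriers), a Butterworth low-pass filter for denoising and smoothing, a moving-variance threshold for action segmentation, and a pre-trained classifier whose labels index a command table. All names, parameters and thresholds are illustrative.

import numpy as np
from scipy.signal import butter, filtfilt


def process_signal(csi_amplitude, fs=100.0, cutoff=10.0):
    """Signal processing module: denoise/filter/smooth raw CSI amplitude.

    csi_amplitude: array of shape (time, subcarriers); fs is the assumed
    sampling rate in Hz, cutoff the low-pass corner frequency.
    """
    b, a = butter(4, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, csi_amplitude, axis=0)


def extract_actions(csi, window=50, threshold=0.5):
    """Gesture action extraction module: cut out spans whose moving variance
    exceeds a threshold (a simple stand-in for the segmentation algorithm)."""
    segments, start = [], None
    n_windows = len(csi) // window
    for i in range(n_windows):
        active = csi[i * window:(i + 1) * window].var() > threshold
        if active and start is None:
            start = i * window                      # activity begins
        elif not active and start is not None:
            segments.append(csi[start:i * window])  # activity ends
            start = None
    if start is not None:
        segments.append(csi[start:])
    return segments


def map_to_commands(segments, classifier, command_table):
    """Action instruction mapping module: classify each segment with simple
    statistical features and look up the corresponding computer operation."""
    features = np.array([[s.mean(), s.var(), s.max() - s.min()] for s in segments])
    return [command_table[label] for label in classifier.predict(features)]

In this sketch, classifier is any pre-trained object exposing a scikit-learn-style predict method, and command_table maps class labels to operations such as "volume_up" or "next_slide"; the patent leaves both choices open.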



Abstract

The invention discloses a gesture control man-machine interactive system based on WiFi. The system comprises a signal collecting module, a signal processing module, a gesture action extraction module and an action command mapping module. The signal collecting module obtains physical-layer channel state information (CSI) containing the user's actions from WiFi signals; the signal processing module processes the collected signals, including denoising, filtering and smoothing operations; the gesture action extraction module extracts gesture action information from the signals provided by the signal processing module; and the action command mapping module obtains gesture action data from the gesture action extraction module and maps the data to corresponding computer operation instructions. Because the system relies on WiFi signals, users do not need to wear any special-purpose equipment; human-computer interaction is performed by collecting the physical-layer channel state information, extracting and identifying the gesture information contained in it, and mapping it to the corresponding computer operation instructions. The cost is low, and the system has low coupling and good scalability.
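As an illustration of the "low coupling and good scalability" claim, the sketch below expresses the four modules named in the abstract as independent interfaces joined only by the data they pass along; the class and method names are mine, not the patent's.

from abc import ABC, abstractmethod

import numpy as np


class SignalCollector(ABC):
    """Signal collecting module: obtains physical-layer CSI from WiFi signals."""
    @abstractmethod
    def collect(self) -> np.ndarray: ...


class SignalProcessor(ABC):
    """Signal processing module: denoising, filtering and smoothing."""
    @abstractmethod
    def process(self, csi: np.ndarray) -> np.ndarray: ...


class GestureExtractor(ABC):
    """Gesture action extraction module: isolates per-gesture CSI segments."""
    @abstractmethod
    def extract(self, csi: np.ndarray) -> list[np.ndarray]: ...


class CommandMapper(ABC):
    """Action command mapping module: turns segments into operation instructions."""
    @abstractmethod
    def map(self, segments: list[np.ndarray]) -> list[str]: ...


def run_pipeline(collector: SignalCollector, processor: SignalProcessor,
                 extractor: GestureExtractor, mapper: CommandMapper) -> list[str]:
    """Each stage depends only on the previous stage's output, so any single
    module can be replaced without touching the others."""
    return mapper.map(extractor.extract(processor.process(collector.collect())))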

Description

Technical field

[0001] The invention relates to a gesture control human-computer interaction system, and in particular to a WiFi-based gesture control human-computer interaction system.

Background technique

[0002] Most current mainstream human-computer interaction technologies are based on sensors and cameras. For example, Microsoft's Xbox Kinect and Leap's LeapMotion are human-computer interaction technologies based on camera recognition of human body movements. The Institute of Computing Technology, Chinese Academy of Sciences has developed a Chinese sign language recognition and synthesis system based on multifunctional perception; it uses data gloves equipped with multiple sensors to recognize sign language words from a large vocabulary (5,177). Both of these mainstream human-computer interaction technologies require additional special equipment, such as cameras and multifunctional sensors, and the equipment costs are relatively high.

[0003] Camera-ba...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06K9/00
CPC: G06F3/017; G06V40/113; G06F2218/02; G06F2218/04
Inventors: 杨威, 黎宏, 黄刘生, 王建新, 许杨
Owner: SUZHOU INST FOR ADVANCED STUDY USTC