
Real-time gesture recognition method and system

A gesture recognition technology in the field of human-computer interaction, addressing the problems that gesture recognition accuracy is not high, similar gestures are poorly distinguished, and recognition takes a long time to process; the effect is improved accuracy, speed, and robustness.

Active Publication Date: 2020-12-29
CHONGQING UNIV

Problems solved by technology

The main problems of this method are: (1) the accuracy of gesture recognition is not high, the recognition of similar gestures is particularly poor, and recognition is easily affected by noise gestures; (2) gesture amplitude and strength differ between people, so recognition results vary from person to person; (3) the real-time performance of gesture recognition still falls short of people's needs for real-time interaction, and gesture recognition takes a long time to process.



Examples


Embodiment 1

[0053] See Figures 1 to 3. A real-time gesture recognition method mainly includes the following steps:

[0054] 1) Establish a gesture classification model and store it on the server side.

[0055] The main steps of building a gesture classification model are as follows:

[0056] 1.1) Obtain training sample data, the main steps are as follows:

[0057] 1.1.1) Use the motion sensor of the smart terminal to collect the motion sensor data of n testers, recorded as data set B = [B₁, B₂, …, Bₕ, …, Bₘ], where Bₕ represents one set of motion sensor data.

[0058] The motion sensor mainly includes a three-axis acceleration sensor and a three-axis gyro sensor. The motion sensing data mainly includes three-axis acceleration data, three-axis gyroscope data and motion time.
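As a rough illustration of the data described above, one motion-sensor sample and the nested data set B could be modeled as follows. The field and variable names are assumptions for illustration; the patent only specifies that each sample carries three-axis acceleration, three-axis gyroscope readings, and motion time.

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    """One motion-sensor reading: three-axis acceleration,
    three-axis gyroscope, and a timestamp (names assumed)."""
    ax: float  # acceleration, x axis
    ay: float  # acceleration, y axis
    az: float  # acceleration, z axis
    gx: float  # angular velocity, x axis (gyroscope)
    gy: float  # angular velocity, y axis
    gz: float  # angular velocity, z axis
    t: float   # motion time, e.g. seconds since recording started

# A "set of motion sensor data" B_h is a sequence of samples,
# and the data set B collects m such sets.
B_h = [
    MotionSample(0.1, 0.0, 9.8, 0.0, 0.0, 0.0, 0.00),
    MotionSample(0.2, 0.1, 9.7, 0.1, 0.0, 0.0, 0.02),
]
B = [B_h]
```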

[0059] 1.1.2) The smart terminal sends the collected motion sensor data to the server via Bluetooth. The server side is a mobile phone or a computer with data storage and processing functio...

Embodiment 2

[0103] The system based on the real-time gesture recognition method mainly includes a smart terminal, a camera, and a server.

[0104] The smart terminal has a motion sensor.

[0105] The smart terminal mainly includes smart bracelets, smart watches and smart gloves.

[0106] The motion sensor collects the motion sensor data of the user in real time and uploads it to the server.

[0107] While the gesture sensor data is being collected, the camera records video of the gesture and uploads it to the server.

[0108] Gesture classification models are stored on the server side.

[0109] The server labels the gesture data by matching the video information against the motion sensor data.

[0110] The server side preprocesses the motion sensor data to obtain the resultant acceleration data a.

[0111] The resultant acceleration a is as follows:

[0112] a = √(x² + y² + z²)

[0113] In the formula, x, y and z are the data of the three axes of the three-axis acceleration sensor, respectively.
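This preprocessing step can be sketched in a few lines; the function name is an assumption, the formula is from the text above.

```python
import math

def resultant_acceleration(x: float, y: float, z: float) -> float:
    """Combine the three accelerometer axes into the single
    resultant acceleration a = sqrt(x^2 + y^2 + z^2)."""
    return math.sqrt(x * x + y * y + z * z)

# Example: a device at rest mostly measures gravity on one axis,
# so the resultant acceleration is close to 9.8 m/s^2.
a = resultant_acceleration(0.0, 0.0, 9.8)
```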

[0114] ...



Abstract

The invention discloses a real-time gesture recognition method and system. The main steps of the method are: 1) establish a gesture classification model; 2) collect motion sensor data of gestures in real time from the smart terminal; 3) the server preprocesses the motion sensor data to obtain resultant acceleration data; 4) perform gesture segmentation on the preprocessed resultant acceleration data, segmenting it into gesture segments in real time, and extract gesture feature data; 5) input the gesture feature data into the gesture classification model to recognize the gesture in real time and obtain the recognition result. The system mainly includes smart terminals and servers. The gesture segmentation method based on an adaptive sliding window proposed by the invention can quickly and accurately divide continuous gestures into individual effective gestures, thereby improving the accuracy and speed of gesture recognition.
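The segmentation idea in step 4) can be illustrated with a minimal sketch. The patent's actual adaptive sliding-window rule is not disclosed in this excerpt, so the threshold logic, default values, and function name below are assumptions: a segment opens when the resultant acceleration deviates from the resting level and closes when it returns.

```python
def segment_gestures(accel, rest=9.8, threshold=1.5, min_len=3):
    """Illustrative gesture segmentation over a stream of resultant
    acceleration values. A gesture segment starts when the signal
    deviates from the resting level `rest` by more than `threshold`,
    and ends when it falls back; segments shorter than `min_len`
    samples are discarded as noise. Returns (start, end) index pairs."""
    segments, start = [], None
    for i, a in enumerate(accel):
        active = abs(a - rest) > threshold
        if active and start is None:
            start = i                      # segment opens
        elif not active and start is not None:
            if i - start >= min_len:
                segments.append((start, i))  # segment closes
            start = None
    if start is not None and len(accel) - start >= min_len:
        segments.append((start, len(accel)))  # stream ended mid-gesture
    return segments

# One burst of movement between two rest periods yields one segment.
stream = [9.8, 9.8, 14.0, 15.5, 13.2, 12.0, 9.8, 9.8]
# segment_gestures(stream) -> [(2, 6)]
```

Feature extraction and classification (step 5) would then operate on each returned index range independently.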

Description

Technical field

[0001] The invention relates to the field of human-computer interaction, in particular to a real-time gesture recognition method and system.

Background technique

[0002] In modern life, people interact with various electronic devices all the time. However, current interaction methods still have certain disadvantages and cannot fully meet people's growing demands for interactive experience. Early human-computer interaction required special hardware control devices such as remote controls, mice, and keyboards; without them, interaction could not take place, which greatly limited human-computer interaction. Controlling devices by voice commands has risen in recent years, but it runs into the difficulty of dialect recognition.

[0003] Interacting with machines by recognizing gestures is a research hotspot. At present, the most popular method is gesture recognition through image informa...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V40/28, G06F18/2431
Inventor 李艳德刘礼王泰乾
Owner CHONGQING UNIV