
Gesture recognition method, device, intelligent equipment and computer readable storage medium

A gesture recognition technology applied in the field of human-computer interaction. It addresses the problem that the DTW algorithm lacks adaptability to changes in spatial scale, and achieves the effect of narrowing the differences in gesture data caused by changes in speed or force, enabling accurate gesture recognition.

Pending Publication Date: 2019-10-18
ZTE CORP +1

AI Technical Summary

Problems solved by technology

[0005] In view of this, the purpose of the embodiments of the present invention is to provide a gesture recognition method, device, smart device, and computer-readable storage medium, so as to solve the technical problem that the DTW algorithm lacks adaptability to spatial scale changes, thereby narrowing the differences caused by changes in speed or force.

Method used

Figure 1 is a flow chart of the gesture recognition method provided by an embodiment of the present invention; Figure 2 is a flow chart of the gesture template matching method; Figure 3 is a structural diagram of the gesture recognition device.

Examples


Embodiment 1

[0030] As shown in Figure 1, a gesture recognition method provided by an embodiment of the present invention includes:

[0031] S101. Acquire acceleration and angular velocity data of a gesture to be recognized, and perform filtering processing.

[0032] Specifically, the acceleration and angular velocity data are generally collected by an acceleration sensor and a gyroscope built into the smart device, and are used to describe the gesture behavior. However, non-target gesture signals are inevitably mixed in during acquisition, so filtering is required to suppress their influence. Many filtering methods exist; experiments and research show that the frequency of gesture motion is generally below 3.5 Hz, so it is preferable to filter the acceleration and angular velocity data with a low-pass filter whose cutoff frequency is 3.5 Hz.
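The 3.5 Hz low-pass stage of S101 can be sketched as follows. This is an illustrative implementation, not the patent's own code: the Butterworth design, filter order, and the 100 Hz sampling rate are assumptions, and the simulated signal exists only to exercise the filter.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_filter(data, cutoff_hz=3.5, fs=100.0, order=4):
    """Zero-phase low-pass filter for IMU samples.

    data: array of shape (n_samples,) or (n_samples, n_axes)
    fs:   sensor sampling rate in Hz (100 Hz is an assumed value,
          not stated in the patent text)
    """
    nyquist = fs / 2.0
    b, a = butter(order, cutoff_hz / nyquist, btype="low")
    # filtfilt runs the filter forward and backward, avoiding phase lag
    return filtfilt(b, a, data, axis=0)

# Simulated 3-axis accelerometer data: a 2 Hz gesture component,
# 30 Hz sensor noise, and a constant gravity axis.
t = np.linspace(0, 2, 200)
accel = np.stack([np.sin(2 * np.pi * 2 * t),
                  0.1 * np.sin(2 * np.pi * 30 * t),
                  np.full_like(t, 9.81)], axis=1)
filtered = lowpass_filter(accel, fs=100.0)
```

The 2 Hz component passes almost unchanged while the 30 Hz noise axis is strongly attenuated, matching the stated observation that gesture motion lives below 3.5 Hz.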

[0033] S102. Calculate the time series of acceleration resultant force according to...
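Step S102 is truncated in this extract, but a natural reading of "acceleration resultant force time series" is the per-sample Euclidean norm of the three acceleration axes, which removes dependence on device orientation. A minimal sketch under that assumption:

```python
import numpy as np

def resultant_acceleration(accel):
    """Collapse 3-axis acceleration samples (n, 3) into a 1-D
    magnitude series: sqrt(ax^2 + ay^2 + az^2) per sample.

    Assumption: the patent's 'resultant force' is this Euclidean
    norm; the truncated text does not confirm the exact formula.
    """
    accel = np.asarray(accel, dtype=float)
    return np.sqrt(np.sum(accel ** 2, axis=1))

a = resultant_acceleration([[3.0, 4.0, 0.0],   # 3-4-5 triangle -> 5.0
                            [0.0, 0.0, 9.81]]) # gravity only -> 9.81
```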

Embodiment 2

[0060] As shown in Figure 2, the gesture template matching method provided by an embodiment of the present invention includes:

[0061] S1051. Compare the binned acceleration resultant force time series and angular velocity time series with the target gesture data of each category in the preset gesture template library, and select the category to which the target gesture template with the smallest difference belongs as the target set.

[0062] The preset gesture template library is a two-layer classified template library pre-stored on the smart device. To ensure recognition accuracy, a large number of samples are usually generated for each gesture in advance and processed through steps S101 to S104 of the first embodiment; the DTW algorithm is then used to find, for each gesture, the sample with the smallest average Euclidean distance to the other samples of the same gesture, and that sample is stored in the template library as the template. Multiple gesture templates can be ge...
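The DTW-based template selection and matching described above can be sketched with the classic dynamic programming recurrence. The function names and the toy template library are illustrative; the patent's two-layer library structure and its exact distance measure are not reproduced here.

```python
import numpy as np

def dtw_distance(s, t):
    """Classic dynamic-time-warping distance between two 1-D sequences,
    using absolute difference as the local cost."""
    n, m = len(s), len(t)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            # best of insertion, deletion, and match moves
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(sample, template_library):
    """Return the gesture label whose stored template has the smallest
    DTW distance to the sample (hypothetical dict-based library)."""
    return min(template_library,
               key=lambda g: dtw_distance(sample, template_library[g]))

templates = {"wave": [0, 1, 2, 1, 0], "push": [0, 0, 3, 3, 0]}
label = classify([0, 1, 1, 2, 1, 0], templates)
```

Because DTW warps the time axis, the sample matches "wave" even though it is one step longer than the stored template, which is exactly the temporal flexibility the patent relies on (its stated weakness, spatial scale sensitivity, is what the binning step compensates for).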

Embodiment 3

[0068] As shown in Figure 3, a gesture recognition device provided by an embodiment of the present invention includes a gesture acquisition module 10, a filtering module 20, a resultant force sequence calculation module 30, a dimension unification module 40, a binning module 50, and a template matching module 60, wherein:

[0069] The gesture collection module 10 is used to acquire the acceleration and angular velocity data of the gesture to be recognized.

[0070] The filtering module 20 is used for filtering the acceleration and angular velocity data.

[0071] The resultant force sequence calculation module 30 is configured to calculate the acceleration resultant force time series according to the acceleration data.

[0072] The dimensional unification module 40 is used for unifying the dimensions of the acceleration resultant force time series and the angular velocity time series.

[0073] The binning module 50 is configured to bin the acceleration resultant force time ...
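The modules of Embodiment 3 can be wired into a single feature pipeline. This is a minimal sketch with placeholder implementations: z-score normalization stands in for "dimension unification" and fixed-width mean pooling stands in for "binning", since the patent's preset binning rule is not given in this extract.

```python
import numpy as np

class GestureRecognizer:
    """Illustrative pipeline mirroring modules 30-50 of Embodiment 3.
    All method bodies are assumptions, not the patented algorithms."""

    def __init__(self, n_bins=8):
        self.n_bins = n_bins

    def resultant(self, accel):
        # module 30: per-sample magnitude of 3-axis acceleration
        return np.linalg.norm(accel, axis=1)

    def unify_dims(self, seq):
        # module 40 (assumed): z-score so acceleration and angular
        # velocity are comparable despite different physical units
        return (seq - seq.mean()) / (seq.std() + 1e-9)

    def bin(self, seq):
        # module 50 (assumed): mean-pool into n_bins equal-width bins,
        # giving every gesture a fixed-length representation
        edges = np.linspace(0, len(seq), self.n_bins + 1, dtype=int)
        return np.array([seq[a:b].mean() for a, b in zip(edges[:-1], edges[1:])])

    def features(self, accel, gyro_mag):
        a = self.bin(self.unify_dims(self.resultant(accel)))
        w = self.bin(self.unify_dims(gyro_mag))
        return np.concatenate([a, w])

r = GestureRecognizer(n_bins=4)
accel = np.column_stack([np.arange(16.0), np.zeros(16), np.zeros(16)])
gyro = np.arange(16.0)
feats = r.features(accel, gyro)  # fixed-length vector for module 60
```

The fixed-length output is what makes the subsequent template matching (module 60) insensitive to gesture duration and amplitude, which is the effect the abstract attributes to the dimension-unification and binning steps.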



Abstract

The embodiment of the invention discloses a gesture recognition method, intelligent equipment and a storage medium, belonging to the technical field of human-computer interaction. The method comprises the steps of: obtaining acceleration and angular velocity data of a to-be-recognized gesture and performing filtering processing; calculating an acceleration resultant force time sequence according to the acceleration data; performing dimension-unification processing on the acceleration resultant force time sequence and the angular velocity time sequence; binning, according to a preset rule, the acceleration resultant force time sequence and the angular velocity time sequence after the dimension-unification processing; and matching the binned acceleration resultant force and angular velocity time sequences against a pre-stored gesture template library to determine the type of the gesture to be recognized. According to the embodiments of the invention, without increasing sensor cost, the spatial differences in the gesture data can be effectively regularized and the differences caused by changes in speed or force are reduced, so that efficient and accurate gesture recognition is realized. The method and device can be widely applied to intelligent equipment requiring a high-quality user interaction experience.

Description

Technical Field

[0001] The present invention relates to the technical field of human-computer interaction, and in particular to an IMU (Inertial Measurement Unit)-based gesture recognition method, device, smart device, and computer-readable storage medium.

Background

[0002] Gesture interaction is a natural and convenient means of human-computer interaction. It is simple, intuitive, easy to learn and use, and can express people's intentions directly. The growing popularity of smart devices brings new uses and directions for gesture recognition, and the various high-precision, low-power sensing devices they carry make gesture recognition approaches more diverse. At present, commonly used gesture recognition relies on photoelectric sensing devices, infrared detection devices, IMUs, and acceleration sensors. Among them, the IMU is a device for measuring the three-axis attitude angle (or angular rate) and acceleration of an object. It...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06F3/01
CPC: G06F3/017, G06V40/20
Inventors: 王海鹏, 蔡亚菲, 龚岩, 刘武, 李泽
Owner: ZTE CORP