Gesture detection method and system based on space-time sequence diagram

A gesture detection technology based on spatio-temporal sequence graphs, applied in the field of computer vision recognition. It addresses the problems that existing models perform poorly on abnormal gesture detection and ignore the connection between the two hands, which degrades the final detection and recognition results, and it achieves a good clustering effect.

Pending Publication Date: 2021-09-03
SHANDONG NORMAL UNIV

AI Technical Summary

Problems solved by technology

At present, there are few models for abnormal gesture detection, and the existing models perform relatively poorly; in particular, they ignore the connection between the two hands, which to some extent degrades the final detection and recognition results.



Examples


Embodiment 1

[0042] As shown in Figure 1, Embodiment 1 of the present invention provides a gesture detection system based on a spatio-temporal sequence graph. The system includes:

[0043] A building module for constructing a spatio-temporal sequence graph of hand joint points;

[0044] An extraction module is used to extract the feature relationship between each joint point and its adjacent joint points;

[0045] The first encoding module is used to perform a position encoding operation on the feature relationship to obtain a position encoding vector;

[0046] The second encoding module is used to combine the position encoding vector and the feature relationship and encode them to obtain an action vector;

[0047] The third encoding module is used to perform time-series encoding on the action vector to obtain the space-time relationship vector between the joint point and other joint points;

[0048] The clustering module is used to perform cluster analysis on the action vector and the spatio-temporal relationship vector, so as to realize classification and recognition of gestures.
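
The patent text above only names the modules; the following is a minimal illustrative sketch of the first two steps (the building module of paragraph [0043] and the extraction module of paragraph [0044]). The 21-keypoint MediaPipe-style hand layout, the edge set, and the relative-offset feature are assumptions made for illustration, not details taken from the patent.

# Hypothetical sketch: spatio-temporal graph of hand joints and
# adjacent-joint feature relations. Joint layout, edge set and the
# relative-offset feature are assumptions, not taken from the patent.
import numpy as np

# 21 keypoints per hand (MediaPipe-style); edges follow the finger bones.
HAND_EDGES = [
    (0, 1), (1, 2), (2, 3), (3, 4),        # thumb
    (0, 5), (5, 6), (6, 7), (7, 8),        # index
    (0, 9), (9, 10), (10, 11), (11, 12),   # middle
    (0, 13), (13, 14), (14, 15), (15, 16), # ring
    (0, 17), (17, 18), (18, 19), (19, 20), # little
]
NUM_JOINTS = 21

def build_adjacency(num_joints=NUM_JOINTS, edges=HAND_EDGES):
    """Symmetric spatial adjacency matrix with self-loops."""
    adj = np.eye(num_joints)
    for i, j in edges:
        adj[i, j] = adj[j, i] = 1.0
    return adj

def adjacent_joint_features(seq, adj):
    """seq: (T, V, C) joint coordinates over T frames.
    Returns (T, V, V, C) relative offsets to adjacent joints (zero elsewhere),
    one simple choice of 'feature relationship' between neighbouring joints."""
    rel = seq[:, None, :, :] - seq[:, :, None, :]   # offset joint_j - joint_i
    mask = adj[None, :, :, None]                    # keep only graph edges
    return rel * mask

# Example: two seconds of video at 30 fps, 3-D coordinates per joint.
seq = np.random.rand(60, NUM_JOINTS, 3)
feat = adjacent_joint_features(seq, build_adjacency())
print(feat.shape)  # (60, 21, 21, 3)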

Embodiment 2

[0074] Embodiment 2 of the present invention provides a non-transitory computer-readable storage medium that includes instructions for executing a gesture detection method based on a spatio-temporal sequence graph, the method comprising:

[0075] Constructing a spatio-temporal sequence graph of the hand joint points and extracting the feature relationship between each joint point and its adjacent joint points; performing a position encoding operation on the feature relationship to obtain a position encoding vector;

[0076] Combining the position encoding vector and the feature relationship and encoding them to obtain an action vector; performing time-series encoding on the action vector to obtain a spatio-temporal relationship vector between the joint point and the other joint points;

[0077] Performing cluster analysis on the action vector and the spatio-temporal relationship vector to realize classification and recognition of gestures.
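
Paragraphs [0075]-[0076] do not specify the concrete position encoding or how it is combined with the feature relationship. The sketch below assumes a transformer-style sinusoidal encoding of the joint index, attached to the per-joint features by concatenation; both choices are assumptions made for illustration.

# Hypothetical sketch of the position-encoding step: a transformer-style
# sinusoidal encoding of the joint index, concatenated with the joint's
# feature relation to form an "action vector". The sinusoidal form and the
# concatenation are assumptions; the patent only names the operations.
import numpy as np

def sinusoidal_position_encoding(num_positions, dim):
    """Standard sine/cosine position encoding, shape (num_positions, dim)."""
    pos = np.arange(num_positions)[:, None]                 # joint index
    i = np.arange(dim)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / dim)
    enc = np.zeros((num_positions, dim))
    enc[:, 0::2] = np.sin(angle[:, 0::2])
    enc[:, 1::2] = np.cos(angle[:, 1::2])
    return enc

def encode_action_vectors(joint_features, pe_dim=16):
    """joint_features: (T, V, F) per-joint feature relations.
    Returns (T, V, F + pe_dim) 'action vectors' with position encoding attached."""
    T, V, F = joint_features.shape
    pe = sinusoidal_position_encoding(V, pe_dim)            # (V, pe_dim)
    pe = np.broadcast_to(pe, (T, V, pe_dim))
    return np.concatenate([joint_features, pe], axis=-1)

# Usage: per-joint features pooled from the neighbour offsets of the previous sketch.
feats = np.random.rand(60, 21, 8)
action_vectors = encode_action_vectors(feats)
print(action_vectors.shape)  # (60, 21, 24)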

Embodiment 3

[0079] Embodiment 3 of the present invention provides an electronic device, which includes a non-transitory computer-readable storage medium and one or more processors capable of executing the instructions of the non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium includes instructions for performing a gesture detection method based on a spatio-temporal sequence graph, the method comprising:

[0080] Constructing a spatio-temporal sequence graph of the hand joint points and extracting the feature relationship between each joint point and its adjacent joint points; performing a position encoding operation on the feature relationship to obtain a position encoding vector;

[0081] Combining the position encoding vector and the feature relationship and encoding them to obtain an action vector; performing time-series encoding on the action vector to obtain a spatio-temporal relationship vector between the joint point and other joint points;
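
The time-series encoding step is not detailed further in the text above; one common way to realize it is a recurrent encoder scanned over the frames. The sketch below uses a single GRU cell written directly in numpy and treats its final hidden state as the spatio-temporal relationship vector of each joint; the choice of a GRU and the output dimensions are assumptions made for illustration.

# Hypothetical sketch of the time-series encoding step: a single GRU cell
# scanned over the frames of each joint's action vectors. Using a GRU here
# is an assumption; the patent only states that the action vectors are
# encoded along the time axis.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_dim)
        self.Wz = rng.uniform(-s, s, (input_dim + hidden_dim, hidden_dim))
        self.Wr = rng.uniform(-s, s, (input_dim + hidden_dim, hidden_dim))
        self.Wh = rng.uniform(-s, s, (input_dim + hidden_dim, hidden_dim))

    def step(self, x, h):
        xh = np.concatenate([x, h], axis=-1)
        z = sigmoid(xh @ self.Wz)                     # update gate
        r = sigmoid(xh @ self.Wr)                     # reset gate
        xrh = np.concatenate([x, r * h], axis=-1)
        h_tilde = np.tanh(xrh @ self.Wh)              # candidate state
        return (1.0 - z) * h + z * h_tilde

def temporal_encode(action_vectors, hidden_dim=32):
    """action_vectors: (T, V, F). Returns (V, hidden_dim): one
    spatio-temporal relationship vector per joint (final hidden state)."""
    T, V, F = action_vectors.shape
    cell = GRUCell(F, hidden_dim)
    h = np.zeros((V, hidden_dim))
    for t in range(T):
        h = cell.step(action_vectors[t], h)
    return h

st_vectors = temporal_encode(np.random.rand(60, 21, 24))
print(st_vectors.shape)  # (21, 32)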


Abstract

The invention provides a gesture detection method based on a spatio-temporal sequence graph, belonging to the technical field of computer vision, and comprising the following steps: constructing a spatio-temporal sequence graph of hand joint points and extracting the feature relationship between each joint point and its adjacent joint points; performing a position encoding operation on the feature relationship to obtain a position encoding vector; combining the position encoding vector and the feature relationship and encoding them to obtain an action vector; performing time-series encoding on the action vector to obtain a spatio-temporal relationship vector between the joint point and the other joint points; and performing cluster analysis on the action vector and the spatio-temporal relationship vector to realize classification and recognition of gestures. Hand key points are extracted from the input video frames independently of irrelevant factors such as viewpoint and illumination, so that abnormal behaviors in gesture actions are effectively detected; the model is trained in a semi-supervised manner, behaviors are grouped and judged by clustering, a good clustering effect is obtained, and recognition and anomaly detection of gesture actions in different environments are achieved.
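
The abstract describes semi-supervised training and clustering-based judgment without naming a specific algorithm. The sketch below assumes k-means over per-sample gesture descriptors (for example, pooled action and spatio-temporal relationship vectors) with a distance-to-centroid threshold used to flag abnormal gestures; both the use of k-means and the threshold rule are assumptions made for illustration.

# Hypothetical sketch of the clustering step described in the abstract:
# k-means over gesture descriptors, with samples far from every cluster
# centre flagged as abnormal. The algorithm choice and the anomaly
# threshold are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def fit_gesture_clusters(descriptors, n_clusters=5, seed=0):
    """descriptors: (N, D) one vector per training gesture sample."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    km.fit(descriptors)
    # Distance of each training sample to its nearest cluster centre.
    dist = np.min(km.transform(descriptors), axis=1)
    threshold = dist.mean() + 3.0 * dist.std()      # simple anomaly cut-off
    return km, threshold

def classify(km, threshold, descriptor):
    """Return (cluster_id, is_abnormal) for a new gesture descriptor."""
    d = km.transform(descriptor[None, :])[0]
    cluster_id = int(np.argmin(d))
    return cluster_id, bool(d[cluster_id] > threshold)

# Usage with synthetic descriptors (N samples, D dimensions).
train = np.random.rand(200, 64)
km, thr = fit_gesture_clusters(train)
print(classify(km, thr, np.random.rand(64)))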

Description

Technical Field

[0001] The invention relates to the technical field of computer vision recognition, and in particular to a gesture detection method and system based on a spatio-temporal sequence graph.

Background Technique

[0002] Natural interaction technology has become a hot topic in the field of human-computer interaction, and gestures are a natural and intuitive way of human-computer interaction. Therefore, gesture interaction technology based on visual tracking has attracted the attention of many researchers. However, owing to problems such as differing palm shapes, complex and changeable gesture movements, and susceptibility to interference from the external environment, effective abnormal gesture detection still faces great challenges.

[0003] Video-based abnormal behavior detection methods can be divided into three categories: unsupervised, semi-supervised, and supervised. Unsupervised methods do not require label information and assume that abnormal behavior is rare and irregular...


Application Information

IPC(8): G06K 9/00; G06K 9/46; G06K 9/62; G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06N 3/045; G06F 18/23; G06F 18/214
Inventor 刘一良, 亓延鹏, 代丽, 吕蕾
Owner SHANDONG NORMAL UNIV