Dynamic gesture recognition method and system based on two-dimensional convolutional network

A dynamic gesture recognition technology based on two-dimensional convolution, applied in neural learning methods, character and pattern recognition, biological neural network models, etc. It addresses the problem that conventional approaches increase the difficulty of the recognition task, and has the effects of reducing feature redundancy, reducing the amount of computation, and improving recognition accuracy.

Active Publication Date: 2019-07-02
SHANDONG UNIV
7 Cites · 11 Cited by

AI Technical Summary

Problems solved by technology

Moreover, traditional methods such as hidden Markov models require several manually designed feature descriptors, which undoubtedly increases the difficulty of the recognition task.

Method used



Examples


Embodiment 1

[0076] As shown in Figure 1 and Figure 3, let the input be a video sequence W.

[0077] S1: Frame sampling

[0078] Because of the continuity of video, the difference between adjacent frames is small. If the video sequence is not frame-sampled, the resulting action features will be highly redundant, which increases the amount of computation and reduces recognition accuracy.

[0079] For the input video sequence W, we divide it equally into K segments {S1, S2, S3, ..., SK}, each segment containing the same number of frames. Then, from each video segment Sk, k = 1, 2, ..., K, we extract one frame in a certain way, denoted Tk; note that Sk and Tk are in one-to-one correspondence. Through frame sampling, the sampled image sequence {T1, T2, T3, ..., TK} represents the original video W. In this way, the amount of computation is greatly reduced, while at the same time the ability to mod...
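The segment-based sampling described above can be sketched as follows. Note that the "certain way" of picking a frame within each segment is not specified in the text, so drawing a random frame per segment is an assumption of this sketch:

```python
import random

def sample_frames(video, k):
    """Split a video (a list of frames) into K equal segments and draw
    one frame from each segment, giving the sequence {T1, ..., TK}.
    Random within-segment selection is an assumption; the text only
    says a frame is extracted 'in a certain way'."""
    seg_len = len(video) // k  # the text assumes equal-length segments
    samples = []
    for i in range(k):
        start = i * seg_len
        # Tk is drawn from segment Sk, preserving one-to-one correspondence
        samples.append(video[start + random.randrange(seg_len)])
    return samples
```

With K = 8 and a 32-frame clip, the network then sees 8 frames instead of 32, which is where the reduction in computation comes from.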

Embodiment 2

[0112] Embodiment 2: as shown in Figure 4,

[0113] A dynamic gesture recognition system based on a two-dimensional convolutional network, including:

[0114] a frame sampling module, which collects an actual dynamic gesture video, splits the video into frames, and performs frame sampling on the resulting images;

[0115] an image encoding module, which encodes each sampled image to obtain the actual feature vector of that image;

[0116] a feature vector fusion module, which fuses the actual feature vectors to obtain an actual feature matrix; and

[0117] a gesture recognition module, which inputs the actual feature matrix into the trained two-dimensional convolutional neural network and outputs the gesture recognition result.

[0118] Therefore, in this embodiment of the application, the source video stream is processed into a single image and sent to the two-dimensional convolutional network to obtain the classification result of the gesture action.
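The four modules above can be sketched as plain functions. The encoding used by the image encoding module is not specified in this excerpt, so a simple flatten-and-normalize encoder stands in for it, and the trained 2D CNN is treated as an opaque callable returning class scores:

```python
import math

def encode_frame(frame):
    # Hypothetical encoder: flatten a 2D frame (list of pixel rows) into
    # an L2-normalized feature vector. A stand-in for the patent's
    # image-encoding module, whose actual encoding is not given here.
    v = [float(p) for row in frame for p in row]
    norm = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / norm for x in v]

def fuse(vectors):
    # Feature-vector fusion module: stack the K per-frame vectors into a
    # K x D "actual feature matrix" (here, a list of rows), which serves
    # as a single image-like input for the 2D CNN.
    return list(vectors)

def recognize(feature_matrix, cnn):
    # Gesture recognition module: the trained 2D CNN (passed in as a
    # callable returning per-class scores) maps the feature matrix to a
    # gesture label via argmax.
    scores = cnn(feature_matrix)
    return scores.index(max(scores))
```

Stacking per-frame vectors row by row is what lets a purely two-dimensional convolution see temporal structure: each row is one sampled instant, so convolving across rows mixes information over time.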

Embodiment 3

[0120] The present disclosure also provides an electronic device, including a memory, a processor, and computer instructions stored in the memory and executed on the processor. When the computer instructions are executed by the processor, each operation of the method is performed. For brevity, details are not repeated here.



Abstract

The invention discloses a dynamic gesture recognition method and system based on a two-dimensional convolutional network. The method comprises the steps of: collecting an actual dynamic gesture video and splitting the video into frames; carrying out frame sampling on the resulting images; encoding each sampled image to obtain an actual feature vector of that image; fusing the actual feature vectors to obtain an actual feature matrix; and inputting the actual feature matrix into the trained two-dimensional convolutional neural network and outputting a gesture recognition result. By processing the source video stream into a single image and sending that image into a two-dimensional convolutional network, a classification result of the gesture action is obtained; the image generated from the video contains both the spatial feature information and the temporal information of the video. The method effectively reduces the computational complexity of gesture recognition.

Description

Technical field

[0001] The present disclosure relates to a dynamic gesture recognition method and system based on a two-dimensional convolutional network.

Background technique

[0002] The statements in this section merely provide background related to the present disclosure and do not necessarily constitute prior art.

[0003] Gestures can be said to be another important communication tool for human beings besides language. They contain rich semantic information and have a wide range of applications, such as human-computer interaction, augmented reality, affective computing, and sign language recognition. Early gesture recognition mainly used wearable devices to directly detect the angles and spatial positions of the joints of the hand and arm. Most of these devices connect the computer system and the user through wired technology, so that the user's gesture information can be transmitted to the recognition system completely and without error. Typical dev...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/28, G06V20/41, G06V20/46, G06N3/045, G06F18/214
Inventor 杨明强刘玉鹏王德强李杰程琦
Owner SHANDONG UNIV