Three-dimensional gesture recognition method based on depth image and interaction system

A three-dimensional gesture technology and recognition method, applied in character and pattern recognition, user/computer interaction input/output, graphic reading, etc. It addresses the limited degree of freedom and generalization, the complexity, and the low accuracy of existing gesture recognition, and achieves the effects of richer uses, fewer limitations, and high recognition accuracy.

Active Publication Date: 2018-11-09
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, because of the high degree of freedom and self-occlusion of the hand, three-dimensional gesture recognition is still a challenging problem. As a result, many gesture recognition systems merely track the trajectory of the hand to achieve very simple human-computer interaction. Another approach is to predefine multiple gesture templates and match the feature information obtained by the sensor against them to judge gestures, which greatly limits the degree of freedom and generalization of gesture recognition.
[0003] At present, there are three main approaches to gesture recognition. First, gesture recognition based on traditional graphics algorithms, which identifies the key points of the hand through various complex graphics algorithms; such methods are not only complicated but also low in accuracy. Second, gesture recognition based on a hand model, in which a 3D model of the hand is predefined and the transformed 3D model is matched against the image; this method is not only complicated but also needs to be adapted to the user's hand image in advance. Third, data-driven gesture recognition, which uses labeled data to train a deep network; images are fed into the trained network, and gestures are detected automatically.
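
As a rough illustration of the third, data-driven approach (not the network proposed by the invention), the sketch below trains a small CNN classifier on labeled depth images; the 96x96 single-channel input, the 10 gesture classes, and the use of PyTorch are assumptions made only for this example.

```python
# Minimal sketch (not the patent's network): a small CNN classifier trained on
# labeled depth images, illustrating the data-driven approach described above.
# The input size (96x96), channel count, and number of gesture classes are assumptions.
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 96 -> 48
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 48 -> 24
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
        )
        self.classifier = nn.Linear(64 * 12 * 12, num_classes)

    def forward(self, depth: torch.Tensor) -> torch.Tensor:
        # depth: (N, 1, 96, 96) normalized depth images
        x = self.features(depth)
        return self.classifier(x.flatten(1))

if __name__ == "__main__":
    model = GestureCNN(num_classes=10)
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # One dummy training step on random tensors in place of a labeled depth-image dataset.
    depth_batch = torch.randn(8, 1, 96, 96)
    labels = torch.randint(0, 10, (8,))
    loss = criterion(model(depth_batch), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"training loss: {loss.item():.4f}")
```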



Examples


Embodiment 1

[0033] To address the low accuracy of gesture recognition, the rigidity of template-matching methods, and recognition results that are limited and cannot be modified by the user, the present invention follows the third approach, that is, training a deep neural network for gesture recognition. A deep convolutional neural network first detects the position of the hand, then ResNet and an autoencoder produce the final three-dimensional coordinates of each joint point of the palm, and finally the gesture interaction system reconstructs the entire palm and performs the corresponding interactive actions, thereby achieving better three-dimensional gesture recognition and interaction.
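
To make the pipeline concrete, the following is a minimal inference sketch of the two-stage structure described in paragraph [0033]. It is only an illustration, not the patent's implementation: the 21-joint hand model, the 128x128 crop size, the torchvision ResNet-18 backbone standing in for the unspecified ResNet, and the stubbed-out `detect_hand` bounding box are all assumptions.

```python
# Illustrative two-stage inference sketch: hand detection, crop, then 3D joint regression.
# Not the patent's implementation; see the assumptions listed in the lead-in above.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18

NUM_JOINTS = 21  # assumption: a common 21-joint hand model

class JointRegressor(nn.Module):
    """ResNet backbone followed by a small bottleneck head that regresses
    (x, y, z) for each joint from a cropped hand depth image."""
    def __init__(self):
        super().__init__()
        backbone = resnet18(weights=None)
        backbone.conv1 = nn.Conv2d(1, 64, 7, stride=2, padding=3, bias=False)  # 1-channel depth input
        backbone.fc = nn.Identity()
        self.backbone = backbone
        self.head = nn.Sequential(
            nn.Linear(512, 64), nn.ReLU(),   # low-dimensional pose embedding
            nn.Linear(64, NUM_JOINTS * 3),   # decode to 3D joint coordinates
        )

    def forward(self, crop: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(crop)).view(-1, NUM_JOINTS, 3)

def detect_hand(depth: torch.Tensor) -> tuple[int, int, int, int]:
    # Placeholder for the detection CNN: returns a hypothetical (x0, y0, x1, y1) box.
    return (80, 60, 208, 188)

if __name__ == "__main__":
    depth_map = torch.rand(1, 1, 480, 640)   # stand-in for a sensor depth map
    x0, y0, x1, y1 = detect_hand(depth_map)
    crop = depth_map[:, :, y0:y1, x0:x1]
    crop = F.interpolate(crop, size=(128, 128), mode="bilinear", align_corners=False)
    joints = JointRegressor()(crop)          # (1, 21, 3) joint coordinates
    print(joints.shape)
```

In the method described above, the detection stage is itself a deep convolutional network and the joint regressor ends in an autoencoder; here the autoencoder is only suggested by the small 64-dimensional hidden layer in the head.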

[0034] As shown in Figure 1, the gesture recognition method based on the depth information map and its interaction system provided in this embodiment include the following steps:

[0035] 1) Use the depth sensor to obtain the depth information map;

[0036] Th...



Abstract

The invention discloses a three-dimensional gesture recognition method based on a depth image, and an interaction system. The method comprises the following steps: a depth information map is obtained with a depth sensor; a CNN recognizes the area of interest (AOI) in the obtained depth information map and crops the hand depth information map to be recognized; the hand depth information map obtained from the CNN is inputted into a gesture recognition network, and the three-dimensional coordinates of the recognized hand key nodes are obtained; the hand key node coordinates are subjected to a coordinate transformation that adapts them to the world coordinates of the interaction system; the transformed hand key node coordinate sequence is transmitted to the interaction system via Socket communication; and the interaction system obtains the hand key node information, combines it with the semantic judgment conditions set by the user, and uses a physics engine to display the interaction result. The method can recognize the three-dimensional coordinates of hand joint points well, can also judge gesture semantic actions, and has broad application prospects.
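
The last two steps of the abstract (coordinate transformation and Socket transmission) could look roughly like the sketch below. The 4x4 camera-to-world matrix, the host/port, and the JSON wire format are assumptions made for illustration; the patent does not specify them.

```python
# Illustrative sketch: transform recognized joint coordinates into the interaction
# system's world frame and push them over a socket. All constants are assumptions.
import json
import socket
import numpy as np

def camera_to_world(joints_cam: np.ndarray, extrinsic: np.ndarray) -> np.ndarray:
    """Apply a 4x4 rigid transform to (N, 3) camera-space joint coordinates."""
    homogeneous = np.hstack([joints_cam, np.ones((joints_cam.shape[0], 1))])
    return (homogeneous @ extrinsic.T)[:, :3]

def send_joints(joints_world: np.ndarray, host: str = "127.0.0.1", port: int = 9000) -> None:
    """Serialize the joint sequence as JSON and send it to the interaction system.
    Assumes a listener (the interaction system) is already running on host:port."""
    payload = json.dumps({"joints": joints_world.tolist()}).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)

if __name__ == "__main__":
    joints_cam = np.random.rand(21, 3)   # stand-in for the recognized joint coordinates
    extrinsic = np.eye(4)                # placeholder camera-to-world transform
    send_joints(camera_to_world(joints_cam, extrinsic))
```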

Description

technical field

[0001] The invention relates to the technical fields of computer graphics, deep learning and human-computer interaction, and in particular to a three-dimensional gesture recognition method based on depth images and deep learning, and an interaction system thereof.

Background technique

[0002] Natural human-computer interaction has always been an important research direction in the fields of computer graphics and human-computer interaction. As an important component of natural human-computer interaction, 3D gesture recognition has naturally received great attention. At the same time, in recent years some relatively mature depth cameras have been launched one after another, such as Microsoft’s Kinect. The depth images obtained by depth cameras avoid the shortcoming of traditional RGB images of being easily affected by lighting and complex backgrounds, which brings convenience to gesture recognition. What needs to be achieved in 3D gesture recognition should ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/00; G06K 9/62; G06F 3/01
CPC: G06F 3/017; G06V 40/117; G06V 40/113; G06F 18/214
Inventors: 彭昊, 李拥军, 冼楚华, 吴煜林, 冯嘉昌
Owner: SOUTH CHINA UNIV OF TECH