A static gesture recognition method combining depth information and skin color information

A gesture recognition and depth information technology, applied in the field of image recognition, which addresses the problems that existing gesture features are too limited, that classifier training takes a long time, and that it is difficult to meet the real-time requirements of a gesture recognition system; the claimed effects are improved recognition accuracy, strong robustness, and removal of arm interference.

Status: Inactive. Publication date: 2019-01-15
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

The gesture features extracted by existing gesture recognition methods tend to be of a single type, whereas combining multiple features can effectively improve the accuracy and robustness of gesture recognition.
[0006] To achieve high classification accuracy, the classifiers selected by many gesture recognition methods must be trained on a large sample set, which takes a long time.
In addition, classification itself is slow, making it difficult to meet the real-time requirements of a gesture recognition system.



Examples


Embodiment

[0054] As shown in Figure 1, a static gesture recognition method combining depth information and skin color information comprises the following steps: an image acquisition step, a hand segmentation step, an arm removal step, a feature extraction step, and a gesture recognition step.

[0055] Image acquisition step:

[0056] The RGB image and its corresponding depth image are collected simultaneously by the Kinect; that is, the depth information corresponding to every pixel in the RGB image is obtained.

[0057] The Kinect should be located 200 mm to 3500 mm in front of the human body, and the hand should be the part of the body closest to the Kinect.
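As a minimal sketch of this acquisition step, the snippet below grabs one synchronized RGB frame and depth frame. It assumes a Kinect v1 driven through the libfreenect Python bindings (`freenect`); the patent only states that the two images are captured simultaneously, so the specific capture API and the units of the raw depth values are assumptions here, not part of the disclosed method.

```python
# Hedged sketch: grab one RGB frame and one raw depth frame from a Kinect v1.
# Assumes the libfreenect Python bindings ("freenect") and NumPy are installed.
import freenect
import numpy as np

def grab_rgb_and_depth():
    """Return one RGB frame (480 x 640 x 3, uint8) and one raw depth frame (480 x 640)."""
    rgb, _ = freenect.sync_get_video()    # RGB image
    depth, _ = freenect.sync_get_depth()  # raw depth map (driver-specific units)
    return np.asarray(rgb), np.asarray(depth)

rgb, depth = grab_rgb_and_depth()
# Per the text, the sensor sits 200 mm to 3500 mm in front of the body,
# with the hand as the closest object to the camera.
```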

[0058] Hand segmentation step:

[0059] As shown in Figure 2, the hand segmentation step includes the following sub-steps:

[0060] (1) Combine the depth image and the RGB image, and use depth threshold segmentation to obtain an RGB image containing the hand. The specific process is as follows:

[0061] Locate the pixel closest to the Kinect in the depth image, and c...
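A minimal sketch of this depth-threshold sub-step is given below: it locates the closest valid depth pixel and keeps only pixels within a narrow depth band above it, masking the RGB image accordingly. The band width of 150 (in the depth map's units) is an illustrative assumption, not a value taken from the patent.

```python
# Hedged sketch of depth threshold segmentation: keep only pixels whose depth
# lies within a small band above the closest point, assumed to be the hand.
import numpy as np

def segment_hand_by_depth(rgb, depth, band=150, invalid=0):
    """Return the RGB image with everything outside the near-depth band zeroed,
    plus the boolean hand mask. `band` is an illustrative threshold."""
    valid = depth > invalid                       # Kinect reports 0 for unknown depth
    d_min = depth[valid].min()                    # pixel closest to the sensor
    mask = valid & (depth <= d_min + band)        # depth band containing the hand
    hand_rgb = np.where(mask[..., None], rgb, 0)  # zero out the background
    return hand_rgb, mask
```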



Abstract

The invention discloses a static gesture recognition method combining depth information and skin color information. The method comprises the following steps: acquiring an RGB image and a depth image with a Kinect; using a depth threshold and human skin color information to obtain a binary image of the hand; applying a distance transform combined with a palm cutting circle and a threshold method to judge whether an arm region is present in the hand image; removing the arm region by an exclusive-OR operation between images to obtain a binary gesture image; computing the Fourier descriptors and the number of fingertips to form the feature vector of the gesture; and using a support vector machine for gesture classification to achieve gesture recognition. The invention realizes hand segmentation by combining depth information and skin color information, overcoming the influence of skin-colored regions in a complex background. By removing the arm region, the disturbance of the arm to the classification accuracy of the system is overcome. The number of fingertips and the Fourier descriptor features are computed and input to a support vector machine for gesture recognition.
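A minimal sketch of the feature-extraction and classification stage described above is shown below, using OpenCV and scikit-learn. The number of Fourier descriptors kept (16), the convexity-defect fingertip counter, and the SVM kernel are illustrative assumptions; the patent does not specify these parameters, and the depth/skin-color segmentation and arm removal are assumed to have already produced the binary hand image.

```python
# Hedged sketch of the feature vector (Fourier descriptors + fingertip count)
# and SVM classification. Assumes OpenCV 4.x and scikit-learn.
import cv2
import numpy as np
from sklearn.svm import SVC

def fourier_descriptors(binary_hand, n_coeffs=16):
    """Translation/scale-normalized Fourier descriptors of the largest contour."""
    contours, _ = cv2.findContours(binary_hand, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea).squeeze(1)   # (N, 2) boundary points
    z = contour[:, 0] + 1j * contour[:, 1]                    # complex boundary signal
    f = np.fft.fft(z)
    mags = np.abs(f[1:n_coeffs + 1])                          # drop DC -> translation invariance
    return mags / (mags[0] + 1e-9)                            # normalize -> scale invariance

def count_fingertips(binary_hand, min_defect_px=20):
    """Crude fingertip count from convexity defects (illustrative heuristic only)."""
    contours, _ = cv2.findContours(binary_hand, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(contour, returnPoints=False)
    defects = cv2.convexityDefects(contour, hull)
    if defects is None:
        return 0
    deep = (defects[:, 0, 3] / 256.0) > min_defect_px         # defects deeper than threshold
    return int(deep.sum()) + 1

def gesture_feature(binary_hand):
    """Feature vector: Fourier descriptor magnitudes plus the fingertip count."""
    return np.hstack([fourier_descriptors(binary_hand), count_fingertips(binary_hand)])

# Training and prediction (training images and labels assumed available):
# clf = SVC(kernel="rbf").fit([gesture_feature(img) for img in train_imgs], train_labels)
# pred = clf.predict([gesture_feature(test_img)])
```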

Description

Technical field

[0001] The invention relates to the technical field of image recognition, and in particular to a static gesture recognition method combining depth information and skin color information.

Background technique

[0002] With the development of related technologies such as computer vision and pattern recognition, research on human-computer interaction technology has advanced greatly, and people can communicate with computers more naturally without being limited to keyboards and mice. Vision-based static gesture recognition, as a natural mode of human-computer interaction, has gradually become a research hotspot and has been widely applied in smart homes, robot control, sign language recognition and other fields.

[0003] In static gesture recognition based on a monocular camera, it is difficult to separate the human hand from the background of the image. In early gesture recognition systems, by wearing marked gloves ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00, G06K 9/62, G06T 7/136, G06F 3/01
CPC: G06F 3/017, G06T 7/136, G06T 2207/10024, G06T 2207/10028, G06V 40/113, G06F 18/2411, G06F 18/214
Inventors: 周智恒, 许冰媛
Owner: SOUTH CHINA UNIV OF TECH