
Static gesture recognition method based on watershed transformation

A static gesture recognition technology based on the watershed transformation, applied in the fields of image processing and human-computer interaction. It addresses the problems of over-segmentation and matching error that affect the accuracy of gesture segmentation, and achieves good adaptability and high recognition accuracy.

Active Publication Date: 2021-06-04
HARBIN UNIV OF SCI & TECH +1

AI Technical Summary

Problems solved by technology

As a method of image segmentation, the watershed algorithm can segment gesture images, but the traditional watershed algorithm is prone to over-segmentation. Although Gaussian filtering can reduce the impact of over-segmentation, a single filter kernel causes loss of image edge information, which ultimately affects the accuracy of gesture segmentation. The selection of gesture features directly affects the final recognition accuracy, and using the traditional area-perimeter ratio and Euclidean distance as matching features introduces certain errors.



Examples


Specific Embodiment 1

[0083] The present invention proposes a static gesture recognition method based on the watershed transform. As shown in Figure 1, the specific method is as follows:

[0084] Step 1: a CMOS image sensor collects an RGB image of the user's hand, denoted M;

[0085] Step 2: the collected RGB image M is converted to the YCbCr color space; the converted image is denoted N;

[0086] Step 3: illumination compensation is applied to the image N in the YCbCr color space by an adaptive brightness adjustment method; the compensated image is denoted O;

[0087] Step 4: the skin-color-like region of the image O is extracted by threshold segmentation; the extracted image is denoted U;

[0088] Step 5: the image U is segmented by the watershed transform; the segmented image is denoted P;

[0089] Step 6: two Gaussian filter kernels are used to perform Gaussian filtering on the watershed-segmented image P; the filtered image is denoted Q;

[0090] Ste...
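The double-Gaussian-kernel filtering of Step 6 can be sketched as follows. The excerpt does not state the patent's exact rule for combining the two kernels, so the blend below — a weighted average of a weak blur (edge-preserving) and a strong blur (suppressing the spurious minima that drive watershed over-segmentation) — together with the parameters `sigma_small`, `sigma_large`, and `alpha`, is an illustrative assumption, written in pure NumPy:

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    """1-D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def gaussian_filter2d(img, sigma):
    """Separable Gaussian blur with edge padding (pure NumPy)."""
    radius = int(3 * sigma + 0.5)
    k = gaussian_kernel1d(sigma, radius)
    pad = np.pad(img, radius, mode='edge')
    # Filter rows, then columns; 'valid' trims the padding back off.
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode='valid'), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='valid'), 0, tmp)

def dual_gaussian_filter(img, sigma_small=1.0, sigma_large=3.0, alpha=0.5):
    """Blend a weak and a strong Gaussian blur. The strong kernel suppresses
    the noise minima that cause watershed over-segmentation, while the weak
    kernel preserves edge detail. The blending rule (weighted average) and
    all three parameters are assumptions, not taken from the patent text."""
    return (alpha * gaussian_filter2d(img, sigma_small)
            + (1 - alpha) * gaussian_filter2d(img, sigma_large))
```

Because both kernels are normalized, a constant image passes through unchanged, which is a quick sanity check on the implementation.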

Specific Embodiment 2

[0094] On the basis of Specific Embodiment 1, in a static gesture recognition method based on watershed transformation, in Step 1 the CMOS image sensor collects the user's hand image; the center of the palm or the center of the back of the hand must face the camera so that a complete hand image can be collected.

Specific Embodiment 3

[0096] On the basis of Specific Embodiment 1, in a static gesture recognition method based on watershed transformation, in Step 2 the collected RGB image is converted to the YCbCr color space. The good clustering characteristics of skin color in the YCbCr color space allow skin-color-like areas to be segmented more effectively. The color space conversion is performed according to formula (1):

[0097] Y = 0.299R + 0.587G + 0.114B
Cb = -0.1687R - 0.3313G + 0.5B + 128
Cr = 0.5R - 0.4187G - 0.0813B + 128    (1)

[0098] where Y represents luminance, and Cb and Cr represent the blue-difference and red-difference chrominance components.
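Formula (1) is not reproduced in this excerpt; the sketch below assumes the standard BT.601 full-range RGB-to-YCbCr conversion, which is consistent with the description of Y as luminance and Cb/Cr as blue- and red-difference chrominance offsets:

```python
import numpy as np

# Standard BT.601 full-range RGB -> YCbCr matrix (an assumption: the patent's
# formula (1) is not shown in this excerpt, but this is the conventional form).
_M = np.array([[ 0.299,     0.587,     0.114   ],   # Y
               [-0.168736, -0.331264,  0.5     ],   # Cb (before +128 offset)
               [ 0.5,      -0.418688, -0.081312]])  # Cr (before +128 offset)

def rgb_to_ycbcr(rgb):
    """Convert an (H, W, 3) RGB image with values in 0-255 to YCbCr."""
    ycc = rgb.astype(float) @ _M.T
    ycc[..., 1:] += 128.0  # shift Cb and Cr into the 0-255 range
    return ycc
```

A commonly used skin-color threshold window in this space (not stated in the excerpt, and so an assumption) is roughly 77 ≤ Cb ≤ 127 and 133 ≤ Cr ≤ 173.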



Abstract

The invention discloses a static gesture recognition method based on watershed transformation, belonging to the field of image processing. The method comprises the steps of gesture image collection, color space conversion, adaptive brightness adjustment, skin-color threshold segmentation, watershed transformation, gray-threshold merging, gesture feature extraction, and template matching. The adaptive brightness adjustment algorithm greatly improves the accuracy of skin-color-like region extraction; the double Gaussian filtering kernel better resolves the over-segmentation problem of the watershed transformation while better preserving the edge information of the image; and Fourier-correlation discrimination of gesture instructions makes better use of the characteristics of the gesture to be matched, improving the accuracy of gesture recognition.
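The Fourier-correlation matching mentioned in the abstract can be sketched as follows. The exact descriptor normalization and correlation measure are not given in this excerpt, so the choices below — FFT of the complex boundary signal, scale normalization by the first harmonic, and cosine similarity — are illustrative assumptions:

```python
import numpy as np

def fourier_descriptors(contour, n=16):
    """Normalized Fourier descriptors of a closed contour.

    `contour` is an (N, 2) array of boundary points. Dropping the DC term
    removes translation; dividing by the first harmonic's magnitude removes
    scale. (This normalization scheme is a common choice, assumed here.)"""
    z = contour[:, 0] + 1j * contour[:, 1]   # boundary as a complex signal
    F = np.fft.fft(z)
    mags = np.abs(F[1:n + 1])                # magnitudes drop rotation phase too
    return mags / mags[0]

def correlation_score(d1, d2):
    """Similarity of two descriptor vectors via normalized correlation."""
    return float(np.dot(d1, d2)
                 / (np.linalg.norm(d1) * np.linalg.norm(d2) + 1e-12))
```

With this normalization, a scaled and translated copy of a contour yields the same descriptors, so its correlation score against the original template is 1.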

Description

Technical Field

[0001] The invention relates to the fields of image processing and human-computer interaction, and in particular to a static gesture recognition method based on watershed transformation.

Background

[0002] Gesture recognition technology, as an application mode of natural human-computer interaction, recognizes the user's operation instructions through technical means such as sensors, radar, and video images, and its scope of application is gradually extending into many areas of social life. Among these, gesture interaction based on video images has particularly good application and development prospects: for example, in robot control and remote control, in special situations where direct control is inconvenient, such as hazardous areas; in assisting deaf-mute people, improving their quality of life through gesture communication; and in the field of smart home control, where it can give users a better interactive exp...

Claims


Application Information

IPC(8): G06K9/00; G06K9/34; G06K9/46
CPC: G06V40/113; G06V10/267; G06V10/56
Inventors: 于天河, 张海珍, 王鹏, 季盛, 李翰堂, 秦梦娇
Owner HARBIN UNIV OF SCI & TECH