
Visual sense based static gesture identification method

A vision-based gesture-recognition technology in the field of human-computer interaction. It addresses the high gesture-segmentation requirements of shape features, the unsatisfactory classification results of single features, and the high computational complexity of support vector machines, improving the recognition rate and classification effect.

Inactive Publication Date: 2016-07-13
SHANDONG UNIV
Cites: 3 · Cited by: 7

AI Technical Summary

Problems solved by technology

However, a single feature often cannot describe the image well, so the classification results are not satisfactory.
[0004] In March 2013, Zhang Hanling, Li Hongying, and Zhou Min published the article "Gesture Recognition Fusion of Multi-features and Compressed Sensing" in the Journal of Hunan University (Natural Science Edition). The article proposed a gesture recognition method that extracts Zernike moment and HOG features and classifies them with a CS (Compressive Sensing) algorithm. However, neither of these two features describes the local texture of the gesture well, and the two features are given equal weights when fused, which fails to reflect which feature has the greater impact on the recognition result.
However, that patent has the following defects: the fused multi-features (the Hu moment feature, the defect feature, and the six proportional features) are all essentially shape features of the gesture, and extracting shape features places relatively high demands on gesture segmentation during preprocessing; moreover, the patent uses support vector machines for training and recognition, and the computational complexity of support vector machines is relatively high.

Method used



Examples


Embodiment

[0088] This embodiment is carried out on the Jochen Triesch gesture database, which contains 720 images in total: 10 gestures performed by 24 different people, each photographed against 3 different backgrounds (a uniform light background, a uniform dark background, and a complex background). This embodiment uses only the 480 images taken against the uniform light and dark backgrounds, each cropped to 80×80 pixels.

[0089] A vision-based static gesture recognition method, the specific steps comprising:

[0090] A. Training phase

[0091] (1) Collect training gesture image samples;

[0092] (2) Preprocess the training gesture images collected in step (1);

[0093] (3) Extract the LBP feature of the grayscale training gesture image preprocessed in step (2);

[0094] (4) Extract the CSS corner feature of the training gesture contour image preprocessed in step (2);

[0095] (5) the LBP feature of the training gest...
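The patent text truncates before the fusion step, but step (3) above describes standard LBP extraction on the preprocessed grayscale image. As a minimal sketch only — the patent does not publish its exact LBP parameters, so the 3×3 neighbourhood and 256-bin histogram here are illustrative assumptions:

```python
import numpy as np

def lbp_histogram(gray, n_bins=256):
    """Basic 3x3 LBP: threshold the 8 neighbours of each pixel against
    the centre pixel, read the resulting bits as a byte code, then
    histogram the codes over the image."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]  # centre pixels (border excluded)
    # 8 neighbours, ordered clockwise from the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        codes |= (nb >= c).astype(np.int32) << bit
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins))
    return hist / hist.sum()  # L1-normalised LBP histogram
```

On an 80×80 cropped gesture image this yields a 256-dimensional texture descriptor, which is the kind of local-texture feature the patent contrasts with shape-only features.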



Abstract

The invention relates to a visual sense based static gesture identification method comprising a training stage and a testing stage. In the training stage, the training images are preprocessed, an LBP feature and a CSS corner feature are extracted from each image, the two features are fused, and a classifier designed on the basis of compressive sensing theory is trained. In the testing stage, a captured gesture image is preprocessed, the same LBP and CSS corner features are extracted and fused, and the trained classifier performs classification. By fusing two features and designing the classifier via compressive sensing theory, the method overcomes the disadvantages of a single feature and improves the gesture recognition rate.
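The abstract describes fusing the two features and classifying with a compressive-sensing-based classifier, but does not reproduce the classifier itself. The following is a hedged sketch of sparse-representation classification, a common compressed-sensing-style classifier: fused training feature vectors form the dictionary columns, a test vector is sparsely decomposed (here with orthogonal matching pursuit), and the class whose atoms give the smallest reconstruction residual wins. The `fuse` weighting, the OMP solver, and all function names are illustrative assumptions, not the patent's specification:

```python
import numpy as np

def fuse(lbp_hist, css_feat, w=0.5):
    # Illustrative weighted concatenation of the two feature vectors;
    # the patent criticises equal weighting, so w is a tunable parameter.
    return np.concatenate([w * lbp_hist, (1 - w) * css_feat])

def omp(A, y, k):
    """Orthogonal matching pursuit: greedy k-sparse solve of A x ~= y."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

def classify(A, labels, y, k=5):
    """Sparse-representation classification: keep only each class's
    coefficients in turn and pick the class with the smallest residual."""
    x = omp(A, y, k)
    best, best_err = None, np.inf
    for c in set(labels):
        mask = np.array([l == c for l in labels])
        err = np.linalg.norm(y - A[:, mask] @ x[mask])
        if err < best_err:
            best, best_err = c, err
    return best
```

A test image whose fused feature vector is well approximated by a sparse combination of one class's training columns is assigned to that class, which matches the residual-based decision rule typical of compressed-sensing classifiers.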

Description

Technical field

[0001] The invention relates to a vision-based static gesture recognition method, belonging to the technical field of human-computer interaction.

Background technique

[0002] With the rapid development of computer technology, human-computer interaction has become a research hotspot for researchers all over the world. Traditional human-computer interaction relies mainly on devices such as the keyboard and mouse, which is inconvenient and increasingly fails to meet people's needs. Gestures play a vital role in human-computer interaction because they are vivid and intuitive. Gesture recognition is generally divided into data-glove-based and vision-based approaches. Data-glove-based gesture recognition requires wearing sensing devices on the hand and analyses signals of physical quantities such as the speed and acceleration of the hand...

Claims


Application Information

Patent Timeline
IPC(8): G06K9/00 G06K9/46 G06K9/62
CPC: G06V40/113 G06V10/44 G06F18/241 G06F18/214
Inventor 杨明强庄会伟贲晛烨
Owner SHANDONG UNIV