Gesture identification method and apparatus

A gesture recognition method and apparatus, applied in the field of target recognition, solving the problems of complex computation, low recognition efficiency, and failure to achieve real-time performance, and achieving high recognition accuracy and efficiency with reduced computational complexity.

Inactive Publication Date: 2016-10-12
SUZHOU UNIV

AI Technical Summary

Problems solved by technology

The first category comprises statistics-based methods: a dynamic gesture is treated as the output of a random process, and recognition is determined from statistical models such as PCA, HMMs, particle filters, and the enrichment algorithm; however, the computation is complex and the recognition efficiency is low, so real-time requirements cannot be met.
The second category comprises rule-based methods: a series of templates is first preset according to the input features; when a gesture is to be recognized, its features are extracted and matched against the preset template features, and the template that best matches the input gesture is output as the category of the gesture to be recognized, as in Shape contexts, Thresholding+FEMD, and Near-convex+FEMD; however, recognition accuracy and recognition efficiency cannot both be guaranteed at the same time.
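The rule-based matching described above can be sketched as nearest-template selection over feature vectors. This is a minimal illustration, assuming plain numeric feature vectors and Euclidean distance; the gesture labels and feature values below are hypothetical placeholders, not the Shape-context or FEMD descriptors named in the text.

```python
# Minimal sketch of rule-based gesture matching: the input gesture's
# features are compared against every preset template, and the label of
# the closest template is output as the recognized category.
import math

def match_gesture(input_features, templates):
    """Return the label of the template whose feature vector is closest
    (Euclidean distance) to the input gesture's feature vector."""
    best_label, best_dist = None, math.inf
    for label, feats in templates.items():
        d = math.dist(input_features, feats)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Hypothetical template base: label -> feature vector.
templates = {
    "fist":  [0.2, 0.9, 0.1],
    "palm":  [0.8, 0.3, 0.7],
    "point": [0.5, 0.5, 0.9],
}
print(match_gesture([0.75, 0.35, 0.65], templates))  # closest to "palm"
```

A real system would replace the toy vectors with the multi-layer contour descriptors the patent describes, but the selection loop is the same shape.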

Method used



Examples


Embodiment 1

[0049] Embodiment 1: As shown in Figure 1, a gesture recognition method comprises the following steps:

[0050] S1. Obtain the gesture shape to be recognized, extract a closed contour from the edge of the gesture shape, and obtain all contour points on the contour and the coordinates of each contour point;

[0051] It should be noted that the target shape addressed by the present invention may be any shape with a closed contour; Figure 2 shows a specific example of such a target shape. In addition, the number of contour points is the number of all points on the contour; its specific value is determined by the actual situation, the criterion being that the contour features completely represent the gesture shape.

[0052] In a digital image, the edge of a shape can be represented by a series of contour points with coordinate information, and the set S of contour points of the ...
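Step S1's collection of the contour-point set S can be sketched on a binary gesture mask. This is a minimal illustration under the assumption that a contour point is a foreground pixel with at least one 4-connected background neighbour; the patent does not specify the extraction algorithm, and a production system would use a proper boundary-following routine (for example, OpenCV's `findContours`) that returns the points in order.

```python
# Sketch of step S1: collect the set S of contour points (boundary pixels)
# of a binary gesture mask. A contour point is taken to be any foreground
# pixel with at least one 4-connected background neighbour; pixels outside
# the image are treated as background.

def contour_points(mask):
    """mask: 2-D list of 0/1 values.
    Returns the set S of (row, col) coordinates of boundary pixels."""
    rows, cols = len(mask), len(mask[0])
    S = set()
    for r in range(rows):
        for c in range(cols):
            if not mask[r][c]:
                continue  # background pixel, cannot be a contour point
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if not (0 <= nr < rows and 0 <= nc < cols) or not mask[nr][nc]:
                    S.add((r, c))  # touches background: boundary pixel
                    break
    return S

# Toy 3x3 foreground block: all pixels except the centre are boundary points.
mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(sorted(contour_points(mask)))
```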



Abstract

The invention discloses a gesture identification method. The method comprises: obtaining the shape of a gesture to be identified, extracting a closed profile from the edge of the gesture shape, and obtaining all profile points on the profile and the coordinates of each profile point; determining the number of layers of the profile, and calculating, from the coordinates of each profile point, the area parameter, arc-length parameter, and centre-of-gravity parameter corresponding to each profile point at each layer, these serving as the feature parameters of the profile point; and, using the feature parameters of each profile point, matching the gesture to be identified against the templates in a preset template base to obtain the best-matching template, which is then determined to be the gesture to be identified. The invention describes the global feature, the local feature, and the relation between them; carries out multi-scale, omnidirectional analysis and expression; achieves effective extraction and expression of the global and local features of the gesture shape to be identified; and avoids the low identification accuracy caused by relying on a single feature.
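The three descriptors named in the abstract (area, arc length, and centre of gravity) can be sketched for a whole closed contour using the standard shoelace formula; the patent's per-layer, per-profile-point variants are not reproduced here, so this is only an illustrative simplification over ordered contour coordinates.

```python
# Sketch of the three contour descriptors the abstract names: area
# (shoelace formula), arc length (perimeter), and centre of gravity
# (polygon centroid), computed from ordered (x, y) contour coordinates.

def shape_parameters(points):
    """points: ordered (x, y) vertices of a closed contour.
    Returns (area, perimeter, centroid)."""
    n = len(points)
    area2 = perimeter = cx = cy = 0.0
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]        # wrap around to close the contour
        cross = x0 * y1 - x1 * y0           # shoelace cross term
        area2 += cross
        cx += (x0 + x1) * cross             # centroid accumulators
        cy += (y0 + y1) * cross
        perimeter += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    area = area2 / 2.0
    centroid = (cx / (6.0 * area), cy / (6.0 * area))
    return abs(area), perimeter, centroid

# Unit square: area 1, perimeter 4, centroid (0.5, 0.5).
print(shape_parameters([(0, 0), (1, 0), (1, 1), (0, 1)]))
```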

Description

technical field

[0001] The invention relates to a gesture recognition method and device, belonging to the technical field of target recognition.

Background technique

[0002] Gesture recognition is of great significance in the field of human-computer interaction and has a wide range of applications in virtual reality, sign language recognition, and human-computer games.

[0003] The difficulty of earlier gesture recognition technology lay in the acquisition of gestures. With the development of depth cameras, the Kinect sensor has solved this problem well, and the difficulty of gesture recognition has since shifted to recognition efficiency and accuracy.

[0004] Generally, gesture recognition methods can be divided into two categories. The first category is based on statistical methods: a dynamic gesture is regarded as the output of a random process, and recognition is determined from statistical models, such as PCA, HMMs, parti...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/46, G06K9/40
CPC: G06V40/113, G06V40/117, G06V10/30, G06V10/44
Inventor: 杨剑宇, 何溢文, 徐浩然
Owner: SUZHOU UNIV