
A Gesture Recognition Method Based on Depth Sensor

A depth-sensor-based gesture recognition technology, applied in the field of gesture recognition based on depth sensors, which addresses the defects of existing methods in algorithm efficiency, recognition accuracy and stability, and model data packet size that restrict the application of gesture recognition technology.

Active Publication Date: 2020-03-24
BEIJING HUAJIE IMI TECH CO LTD

AI Technical Summary

Problems solved by technology

The existing gesture recognition methods based on depth information have defects in algorithm efficiency, recognition accuracy and stability, and model data packet size, which restrict the application of gesture recognition technology.




Embodiment Construction

[0060] The technical solutions of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0061] A gesture recognition method based on a depth sensor, as shown in figure 1, comprises the following specific steps.

[0062] 1. Obtain depth stream information and human skeleton nodes through a 3D depth sensor
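As a rough illustration of what this step yields per frame, the following is a minimal Python sketch of the two data structures involved, a raw depth map plus the tracked skeleton nodes; the field and joint names are assumptions for illustration, not taken from the patent.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class SkeletonJoint:
    """One tracked human skeleton node (e.g. a hand or wrist) in camera space."""
    name: str   # joint label such as "right_hand" (hypothetical naming)
    x: float    # 3D position in meters, camera coordinate system
    y: float
    z: float


@dataclass
class DepthFrame:
    """One frame of data obtained from the 3D depth sensor."""
    depth_raw: np.ndarray        # (h, w) array of raw depth readings
    joints: list[SkeletonJoint]  # skeleton nodes tracked in this frame
```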

[0063] 2. Gesture segmentation

[0064] After the depth device collects the depth stream information and human skeleton nodes, the hand region is segmented to obtain 3D point cloud coordinates that contain only the gesture area. The specific steps are as follows:

[0065] Figure 2 is the w*h plane image acquired by the depth device, which takes the center of the image as the origin. Because the depth value collected directly from the depth device is not the actual distance, it needs to be converted into an actual depth distance value:

[0066] d = K * tan(d_raw / 2842.5 + 1.1863) - 0.037

[0067] where K = 0.1236 m and d_raw represents the raw depth value read from the depth device.
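A minimal sketch of this raw-to-metric conversion in Python, using the constants given in [0066]-[0067]; applying it element-wise over the whole w*h frame with NumPy is an assumption about usage, not something the patent specifies.

```python
import numpy as np

K = 0.1236  # meters, per paragraph [0067]


def raw_depth_to_meters(d_raw: np.ndarray) -> np.ndarray:
    """Convert raw depth readings to actual distance in meters using
    d = K * tan(d_raw / 2842.5 + 1.1863) - 0.037 from paragraph [0066]."""
    return K * np.tan(d_raw / 2842.5 + 1.1863) - 0.037


# Example: convert an entire depth frame at once (synthetic values).
depth_raw = np.random.randint(400, 1000, size=(480, 640)).astype(np.float64)
depth_m = raw_depth_to_meters(depth_raw)
```

A plausible next step, consistent with [0064] but not spelled out in this excerpt, is to keep only the pixels whose converted depth falls within a small window around the tracked hand joint, which yields the gesture-only 3D point cloud.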


Abstract

The invention discloses a gesture recognition method based on a depth sensor. The steps are, in order: acquiring depth stream information and human skeleton node data; segmenting the gesture area; frontalizing and binarizing the 3D point cloud coordinates of the gesture area; extracting the gesture features and applying normalization and dimensionality reduction; screening the gesture features to obtain an optimal gesture feature subset; training a gesture classifier with a support vector machine to obtain classification results; and filtering the gesture classification results. The invention remedies the defects of the prior art and improves the accuracy, stability and efficiency of gesture recognition.
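For orientation only, the classification stages listed in the abstract (normalization, feature screening, SVM training, prediction) can be sketched with scikit-learn as below; the feature extraction is a placeholder, and the kernel, screening method and parameter values are assumptions rather than the patent's actual choices.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def extract_gesture_features(gesture_image: np.ndarray) -> np.ndarray:
    """Placeholder for the patent's feature extraction on the frontalized,
    binarized gesture region; here the binary image is simply flattened."""
    return gesture_image.reshape(-1)


# Synthetic training data: one feature vector per gesture sample, plus labels.
X = np.vstack([extract_gesture_features(np.random.rand(32, 32)) for _ in range(100)])
y = np.random.randint(0, 5, size=100)

# Normalization -> feature screening (optimal subset) -> SVM classifier,
# mirroring the order of steps in the abstract.
clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=64),  # assumed screening method and subset size
    SVC(kernel="rbf"),             # assumed kernel
)
clf.fit(X, y)
predictions = clf.predict(X)
```

The abstract's final step, filtering the gesture classification results, could then be approximated by something like a majority vote over a short sliding window of per-frame predictions; that particular filter is an assumption, since the excerpt does not describe it.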

Description

Technical field

[0001] The invention relates to the field of depth measurement and gesture classification, and in particular to a gesture recognition method based on a depth sensor.

Background technique

[0002] Gesture recognition has always been a very important technology in human-computer interaction applications. Gesture recognition based on depth information has inherent advantages over traditional computer-vision-based gesture recognition, in which the features extracted from the image affect the final recognition rate.

[0003] The existing gesture recognition technologies based on depth information generally extract gesture contour features for classification. The shape-feature-based methods mainly include: (1) shape context analysis; (2) template matching; (3) Hausdorff distance; (4) orientation histogram; (5) Hu invariant moments. The existing gesture recognition methods based on depth information have defects in algorithm efficiency, recognition accuracy and stability, and model data packet size, which restrict the application of gesture recognition technology.


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00; G06K9/46; G06K9/62
CPC: G06V40/28; G06V10/40; G06F18/2411
Inventors: 王行, 盛赞, 李骊, 杨高峰, 周晓军
Owner: BEIJING HUAJIE IMI TECH CO LTD