
Gesture recognition method based on depth sensor

A depth-sensor and gesture-recognition technology, applied in character and pattern recognition, instruments, computer components, etc., that addresses the defects of existing methods in algorithm efficiency, recognition accuracy and stability, and model data package size, which restrict the application of gesture recognition technology.

Active Publication Date: 2018-02-27
BEIJING HUAJIE IMI TECH CO LTD

AI Technical Summary

Problems solved by technology

Existing gesture recognition methods based on depth information have defects in algorithm efficiency, recognition accuracy and stability, and model data packet size, which restrict the application of gesture recognition technology.




Embodiment Construction

[0060] The technical solutions of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0061] A gesture recognition method based on a depth sensor, as shown in Figure 1, comprises the following specific steps.

[0062] 1. Obtain depth stream information and human skeleton nodes through a 3D depth sensor

[0063] 2. Gesture segmentation

[0064] After the depth device collects the depth stream information and human skeleton nodes, the hand area is segmented to obtain 3D point cloud coordinates that contain only the gesture region. The specific steps are as follows:

[0065] Figure 2 shows the w*h plane image acquired by the depth device, with the center of the image taken as the origin. Because the depth value collected directly from the depth device is not the actual distance, it needs to be converted to an actual depth distance value:

[0066] d = K * tan(d_raw / 2842.5 + 1.1863) - 0.037

[0067] where K = 0.1236 m and d_raw represents the raw depth value read from the depth device.
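The two operations described above, converting raw depth readings to metric distance and cropping the point cloud to the hand region around the tracked skeleton node, can be put together as in the following minimal Python sketch. The window size, depth margin, and the simple pixel-to-point mapping are illustrative assumptions and are not specified in the patent text.

```python
import numpy as np

K = 0.1236  # meters, constant from paragraph [0067]

def raw_to_meters(d_raw):
    """Convert raw sensor depth readings to metric distance (paragraph [0066])."""
    d_raw = np.asarray(d_raw, dtype=np.float64)
    return K * np.tan(d_raw / 2842.5 + 1.1863) - 0.037

def crop_hand_point_cloud(depth_raw, hand_px, hand_depth_m,
                          half_size_px=60, depth_margin_m=0.12):
    """Keep only pixels near the tracked hand joint, both in the image plane
    and in depth, and return them as a rough 3D point cloud.

    depth_raw    : (h, w) array of raw depth values
    hand_px      : (u, v) pixel coordinates of the hand skeleton node
    hand_depth_m : metric depth of the hand joint
    The window size and depth margin are assumed values for illustration.
    """
    h, w = depth_raw.shape
    u, v = hand_px
    u0, u1 = max(0, u - half_size_px), min(w, u + half_size_px)
    v0, v1 = max(0, v - half_size_px), min(h, v + half_size_px)

    # Convert the cropped window to metric depth and keep pixels close to the hand joint.
    window = raw_to_meters(depth_raw[v0:v1, u0:u1])
    vs, us = np.nonzero(np.abs(window - hand_depth_m) < depth_margin_m)
    zs = window[vs, us]

    # Shift pixel coordinates so the image center is the origin, as in [0065].
    xs = (us + u0) - w / 2.0
    ys = (vs + v0) - h / 2.0
    return np.stack([xs, ys, zs], axis=1)  # (N, 3) point cloud of the gesture region
```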



Abstract

The invention discloses a gesture recognition method based on a depth sensor. The method comprises the steps of: obtaining depth stream information and human skeleton node data; segmenting the gesture region; projecting the 3D point cloud coordinates of the gesture region to a frontal view and binarizing them; extracting, normalizing and dimension-reducing the gesture features; screening the gesture features to obtain an optimal gesture feature subset; training a support vector machine gesture classifier and obtaining a classification result; and filtering the gesture classification result. The defects of the prior art are overcome, and gesture recognition precision, stability and efficiency are improved.
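The abstract outlines a pipeline of feature normalization, dimensionality reduction, feature screening, SVM classification, and filtering of the per-frame results. The sketch below wires these stages together with scikit-learn; the concrete transforms (StandardScaler, PCA, SelectKBest, an RBF SVM) and the majority-vote filter are assumptions chosen for illustration, since the patent text shown here does not specify them.

```python
import numpy as np
from collections import deque, Counter
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC

def build_classifier(n_components=20, k_best=10):
    """Normalization -> dimensionality reduction -> feature screening -> SVM."""
    return Pipeline([
        ("normalize", StandardScaler()),
        ("reduce", PCA(n_components=n_components)),
        ("screen", SelectKBest(f_classif, k=k_best)),
        ("svm", SVC(kernel="rbf")),
    ])

def filter_predictions(labels, window=5):
    """Majority-vote filter over a sliding window of per-frame predictions,
    a simple stand-in for the filtering step applied to classification results."""
    history, smoothed = deque(maxlen=window), []
    for label in labels:
        history.append(label)
        smoothed.append(Counter(history).most_common(1)[0][0])
    return smoothed

# Usage: clf = build_classifier(); clf.fit(X_train, y_train)
# raw = clf.predict(X_frames); stable = filter_predictions(raw)
```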

Description

technical field

[0001] The invention relates to the field of depth measurement and gesture classification, in particular to a gesture recognition method based on a depth sensor.

Background technique

[0002] Gesture recognition has always been a very important technology in human-computer interaction applications. Gesture recognition based on depth information has inherent advantages compared with traditional computer-vision-based gesture recognition. The features extracted from the depth map affect the final recognition rate.

[0003] Existing gesture recognition technologies based on depth information generally extract gesture contour features for classification. The methods based on shape features mainly include: (1) shape context analysis; (2) template matching; (3) Hausdorff distance; (4) direction histogram; (5) Hu invariant moments. Existing gesture recognition methods based on depth information have defects in algorithm efficiency, recognition accuracy and stability, and model data packet size, which restrict the application of gesture recognition technology.
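As one concrete instance of the contour-based shape features listed above, the snippet below computes Hu moment invariants from a binarized hand mask with OpenCV. This is a generic illustration of that class of features, not the feature set the patent itself uses.

```python
import cv2
import numpy as np

def hu_moment_features(hand_mask):
    """Extract the 7 Hu moment invariants from a binary hand mask.

    hand_mask: uint8 array where nonzero pixels belong to the hand region.
    A log transform is commonly applied so the values share a similar scale.
    """
    moments = cv2.moments(hand_mask, binaryImage=True)
    hu = cv2.HuMoments(moments).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)
```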


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46; G06K9/62
CPC: G06V40/28; G06V10/40; G06F18/2411
Inventors: 王行, 盛赞, 李骊, 杨高峰, 周晓军
Owner: BEIJING HUAJIE IMI TECH CO LTD