Hand posture estimation method based on depth information and calibration method

A hand posture estimation and correction method based on depth information, applied in the field of robot vision, addressing problems such as occlusion, the difficulty of completely segmenting the hand region, and the inability of existing methods to express the wide variety of gestures.

Active Publication Date: 2016-10-26
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

However, this method relies on a point cloud model built from the hand; since three-dimensional gestures are highly varied and often occluded, it is difficult to build a complete point cloud model, which makes the method rather complicated.
[0006] (2) A method and system for estimating three-dimensional hand gestures and postures based on depth data, application number 201510670919.6. The invention uses a Kinect sensor to acquire hand data, extracts features from it, and adds a discriminative model to perform palm posture regression and finger posture regression separately, but the method is overly complex.
[0007] (3) A gesture recognition method and device, application number 201410036739.8. The invention acquires color and depth data of the hand, uses the color image to assist analysis of the hand depth data, extracts adaptively weighted features in the hand contour region, and recognizes the type of the current gesture with a classifier...




Detailed Description of Embodiments

[0076] The technical solution of the present invention will be further described in detail below in conjunction with the accompanying drawings, but the protection scope of the present invention is not limited to the following description.

[0077] As shown in Figure 1 and Figure 2, a hand pose estimation and correction method based on depth information includes the following steps:

[0078] S1. Obtain hand depth data, and segment hand regions from the hand depth data.

[0079] This embodiment is based mainly on depth data, and its purpose is to estimate the posture of the hand appearing in that data. Depth data is used as the input: compared with a traditional color camera, a depth sensor obtains the distance of the photographed object, which makes it easy to separate the target from the background. In this embodiment, the Kinect2 sensor is taken as an example.

[0080] The step S1 includes the following sub-steps:

[0081] S11. Ob...
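The remaining sub-steps of S1 are truncated in this summary. As a rough illustration of the kind of segmentation S1 describes, the sketch below (Python, with hypothetical names, an assumed threshold band, and a Kinect2-style frame size; not the patent's actual procedure) masks the hand as the surface nearest the depth sensor:

import numpy as np

def segment_hand(depth_frame, band_mm=150.0):
    """Return a boolean mask of the hand region in a depth frame given in millimetres.

    Assumption: the hand is the closest valid surface to the sensor, and every
    pixel within band_mm behind the nearest reading belongs to it.
    """
    valid = depth_frame > 0                      # Kinect-style sensors report 0 where depth is unknown
    if not valid.any():
        return np.zeros(depth_frame.shape, dtype=bool)
    nearest = depth_frame[valid].min()           # closest surface, assumed to be the hand
    return valid & (depth_frame <= nearest + band_mm)

# Example on a synthetic 424x512 frame (the Kinect2 depth resolution):
frame = np.full((424, 512), 1200.0)              # background at 1.2 m
frame[200:260, 240:300] = 600.0                  # hand-sized blob at 0.6 m
print(segment_hand(frame).sum(), "pixels labelled as hand")

In practice the patent's sub-steps would refine such a mask (for example by removing the forearm and filtering noise); the depth band used here is only an assumed value.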



Abstract

The invention discloses a hand posture estimation method based on depth information and a calibration method. The method comprises the following steps: S1, hand depth data is acquired and a hand region is segmented from it; S2, the palm posture is detected from the hand region; S3, the position of each hand joint is computed by combining the palm posture with a standard hand bone model; S4, projection features of each hand joint are computed; and S5, finger posture calibration is carried out according to the projection features of each hand joint. The method takes depth data as its basis: the hand region is segmented, the palm posture is computed, and the finger posture is estimated from the depth images through posture calibration. The method is simple and practical.
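To make the five steps concrete, the following schematic Python sketch chains them together. Every function is a placeholder chosen only so the stages connect and run (S2 uses a pixel centroid and an identity orientation, S5 returns the joints unchanged); it reuses the segment_hand sketch from the embodiment section above and does not reproduce the patent's actual computations.

import numpy as np

def detect_palm_pose(depth, mask):
    # S2 placeholder: palm centre = centroid of hand pixels (x, y, mean depth);
    # orientation = identity. The patent computes a real palm posture here.
    ys, xs = np.nonzero(mask)
    center = np.array([xs.mean(), ys.mean(), depth[mask].mean()])
    return np.eye(3), center

def place_skeleton(skeleton, rotation, center):
    # S3 placeholder: rigidly attach a standard bone model (N x 3 joint offsets) to the palm pose.
    return skeleton @ rotation.T + center

def project_joints(joints, depth):
    # S4 placeholder: sample the depth value under each joint's (x, y) image location.
    h, w = depth.shape
    xs = np.clip(joints[:, 0].astype(int), 0, w - 1)
    ys = np.clip(joints[:, 1].astype(int), 0, h - 1)
    return depth[ys, xs]

def calibrate_fingers(joints, features, depth):
    # S5 placeholder: the patent corrects the finger posture from the projection
    # features; here the joints are returned unchanged.
    return joints

def estimate_hand_pose(depth, skeleton):
    mask = segment_hand(depth)                            # S1: segment the hand region
    rotation, center = detect_palm_pose(depth, mask)      # S2: palm posture
    joints = place_skeleton(skeleton, rotation, center)   # S3: joints from the standard bone model
    features = project_joints(joints, depth)              # S4: projection features of each joint
    return calibrate_fingers(joints, features, depth)     # S5: finger posture calibration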

Description

Technical Field

[0001] The invention relates to the technical field of robot vision, and in particular to a hand posture estimation method based on depth information and a correction method.

Background

[0002] In recent years, human-computer interaction technology has played an increasingly important role in everyday life, and convenient, comfortable interaction methods can greatly enhance the user experience. Although traditional interaction methods such as keyboard and mouse can meet daily input needs, they are limited in convenience and operating distance. Gesture-based interaction has therefore become a research hotspot in recent years, and new ways of interacting with the hands keep emerging. As the most flexible organ of the human body, the hand has high degrees of freedom and dexterity, so it can not only perform the tasks of daily life but also satisfy the gesture-based interactive input of the future. Against this background, gesture technology has also been developed ...


Application Information

IPC(8): G06F3/01; G06K9/00
CPC: G06F3/017; G06V40/20
Inventor: 姬艳丽, 程洪, 李昊鑫
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA