Three-dimensional gesture estimation method and three-dimensional gesture estimation system based on depth data

A technology of depth data and pose estimation, applied in the field of robot vision, which addresses problems of prior methods such as false detection and the absence of hand pose estimation.

Active Publication Date: 2016-03-09
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0011] That invention proposes a hand detection method based on a depth map. The method first performs foreground segmentation with a depth threshold, then detects the forearm through straight-line detection, and finally detects the hand along the direction of the forearm. It places specific requirements on the position of the forearm, and the straight-line detection is prone to false detection. Moreover, the method only detects the position of the hand and does not estimate hand posture, so both its approach and its purpose differ from the present invention.
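For concreteness, a minimal Python/OpenCV sketch of that prior-art pipeline is given below. The depth thresholds, Hough parameters, and the rule of taking the nearer end of the longest line as the hand are illustrative assumptions, not details taken from the cited patent.

```python
import cv2
import numpy as np

def detect_hand_prior_art(depth_map, near_mm=400, far_mm=1200):
    """Sketch of the prior-art pipeline: depth-threshold foreground
    segmentation, straight-line (Hough) detection of the forearm, then
    a hand search along the forearm direction. Parameters are illustrative."""
    # 1. Foreground segmentation by depth threshold.
    fg = ((depth_map > near_mm) & (depth_map < far_mm)).astype(np.uint8) * 255

    # 2. Forearm detection via straight-line detection on the mask edges.
    edges = cv2.Canny(fg, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=80, maxLineGap=10)
    if lines is None:
        return None  # line detection failed -- a weakness noted above

    # 3. Take the longest line as the forearm axis and pick its nearer
    #    (smaller-depth) end as the hand position.
    x1, y1, x2, y2 = max(lines[:, 0, :],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    hand_end = (x1, y1) if depth_map[y1, x1] < depth_map[y2, x2] else (x2, y2)
    return hand_end  # only a 2D hand position; no posture is estimated
```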



Examples


Embodiment Construction

[0100] The technical scheme of the present invention is described in further detail below in conjunction with the accompanying drawings:

[0101] As shown in figure 1, a 3D gesture estimation method based on depth data includes the following steps:

[0102] In this embodiment, the image acquisition device is a Kinect2. The Kinect2 sensor can estimate the joint points of the human body, but it does not provide the joint points of the hand: only two points are given to represent the hand joints. Since Kinect2 can obtain human skeleton information fairly accurately even in complex environments, hand ROI data acquisition based on a single skeleton point in the palm is proposed. In addition, Kinect2 may fail to obtain skeleton information because of the person's distance, posture, and so on; for this situation, hand ROI acquisition based on skin color detection is proposed. The calculation process is shown in figure 2.
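A minimal sketch of the two ROI strategies described in [0102], assuming a millimeter-unit depth map and a registered color image. The window size, depth band, and YCrCb skin-color range are illustrative values, not ones given in the patent.

```python
import cv2
import numpy as np

def hand_roi(depth_map, color_img, palm_xy=None, window=100, band_mm=150):
    """Hand ROI extraction: use the single palm skeleton point when
    skeleton tracking succeeds, otherwise fall back to skin-color
    detection. All numeric parameters are illustrative assumptions."""
    if palm_xy is not None:
        # Strategy 1: crop a window around the palm skeleton point and
        # keep only pixels whose depth lies close to the palm's depth.
        x, y = palm_xy
        palm_d = int(depth_map[y, x])
        h = window // 2
        y0, x0 = max(y - h, 0), max(x - h, 0)
        crop = depth_map[y0:y + h, x0:x + h].astype(np.int32)
        return (np.abs(crop - palm_d) < band_mm).astype(np.uint8) * 255

    # Strategy 2: skeleton unavailable (person too far, unusual posture).
    # Detect skin in YCrCb space, a common heuristic range.
    ycrcb = cv2.cvtColor(color_img, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    return cv2.morphologyEx(skin, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
```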

[0103] ...



Abstract

The invention discloses a three-dimensional gesture estimation method and a three-dimensional gesture estimation system based on depth data. The three-dimensional gesture estimation method comprises the following steps: S1, performing hand region-of-interest (ROI) detection on captured data to acquire hand depth data, wherein S1 comprises (1) performing hand ROI detection through a single skeleton point of the palm when skeleton point information can be obtained, and (2) performing hand ROI detection based on skin color when the skeleton point information cannot be obtained; S2, performing preliminary estimation of the hand's three-dimensional global direction, wherein S2 comprises S21, extracting features, and S22, realizing regression of the hand global direction with classifier R1; and S3, performing joint posture estimation of the three-dimensional gesture, wherein S3 comprises S31, realizing posture estimation with classifier R2, and S32, performing posture correction. According to the method and system, the two approaches first cooperate to segment the hand ROI data; the hand global direction is then estimated by a regression algorithm on the segmented hand ROI data; finally, with this direction as an aid, the three-dimensional posture is estimated by the regression algorithm. The three-dimensional gesture estimation method and system have the advantages of a simple algorithm and high practical value.
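To make the S1-S3 structure concrete, the sketch below wires the steps together under stated assumptions: the histogram feature descriptor, the scikit-learn-style predict interface for the models the abstract calls classifiers R1 and R2, and the joint-limit clamp standing in for the posture correction S32 are all hypothetical choices, not details from the patent.

```python
import numpy as np

def extract_features(hand_depth):
    # S21: placeholder feature extraction. The abstract does not specify
    # the descriptor; a normalized depth histogram is a stand-in.
    hist, _ = np.histogram(hand_depth[hand_depth > 0], bins=32)
    return hist / max(hist.sum(), 1)

def estimate_gesture(hand_depth, r1, r2):
    """Structural sketch of steps S2-S3. `r1` and `r2` stand for the
    trained models called classifiers R1 and R2; their .predict
    interface is a hypothetical (scikit-learn style) assumption."""
    feats = extract_features(hand_depth)                 # S21
    global_dir = np.atleast_1d(r1.predict([feats])[0])   # S22: 3D direction

    # S31: joint posture regression aided by the global direction,
    # S32: correction step -- here a toy joint-limit clamp.
    joint_pose = r2.predict([np.concatenate([feats, global_dir])])[0]
    joint_pose = np.clip(joint_pose, -np.pi, np.pi)
    return global_dir, joint_pose
```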

Description

Technical field

[0001] The invention relates to the field of robot vision, and in particular to a method and system for estimating a three-dimensional gesture posture based on depth data.

Background technique

[0002] With the continuous growth of people's demand for human-computer interaction, new interactive technologies are gradually emerging, and gestures have become one of the most promising of these owing to their natural and convenient character. Compared with other articulated parts of the body, the hand offers rich gestures and flexible manipulation and plays a large role in people's daily lives. Interaction technology based on hand gestures therefore has a wide range of applications: manipulating virtual objects with gestures in virtual reality, using gestures to control a robot to grab objects, gesture-based sign language recognition, inputting operation instructions on an interactive interface through gesture technology, and so on. These demands from rea...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V40/117; G06V40/113
Inventor: 程洪, 李昊鑫, 姬艳丽, 况逸群
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA