
Hand gesture identification method based on Leap Motion and Kinect

A gesture recognition technology applied in the field of human-computer interaction. It addresses the problem of insufficient gesture recognition accuracy and achieves the effects of low cost, improved accuracy, and high precision.

Inactive Publication Date: 2017-04-26
UNIV OF ELECTRONIC SCI & TECH OF CHINA
Cites: 4 · Cited by: 26

AI Technical Summary

Problems solved by technology

This method overcomes the insufficient accuracy of gesture recognition that results from using only one of the two sensors, making gesture recognition more accurate.



Examples


Embodiment

[0045] For the convenience of description, the relevant technical terms appearing in the specific implementation are explained first:

[0046] Figure 1 is a flow chart of the gesture recognition method based on Leap Motion and Kinect according to the present invention.

[0047] In this example, the hardware is first connected as shown in Figure 3: the two front-end devices, the Leap Motion and the Kinect, are connected directly to the PC via their data cables.

[0048] After the hardware connection is complete, the gesture recognition method based on Leap Motion and Kinect of the present invention is described in detail with reference to Figure 1. It specifically comprises the following steps:

[0049] (1) Use the Leap Motion sensor to obtain the relevant point coordinates of the hand and the hand's gesture feature information;

[0050] (1.1) The Leap Motion sensor establishes a spatial coordinate system whose origin is the center of the sensor, ...
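The page truncates the embodiment at this point. As a minimal illustrative sketch only (not the formulas claimed in the patent), the three Leap Motion-derived feature types named in the abstract, namely finger angles, fingertip-to-palm-center distances, and fingertip heights, could be computed from the sensor's point data roughly as follows; the function name, the array layout, and the assumption that fingertip height is measured along the y axis are all hypothetical:

```python
import numpy as np

def leap_motion_features(fingertips, palm_center, hand_direction):
    """Illustrative sketch (not the patented formulas): derive per-finger
    angle, distance, and height features from Leap Motion point data.

    fingertips     : (N, 3) array of 3D fingertip positions
    palm_center    : (3,) array, palm-center position
    hand_direction : (3,) vector pointing from the palm toward the fingers
    """
    hand_direction = hand_direction / np.linalg.norm(hand_direction)

    angles, distances, heights = [], [], []
    for tip in fingertips:
        v = tip - palm_center                       # palm-center -> fingertip vector
        d = np.linalg.norm(v)                       # fingertip-to-palm-center distance
        cos_a = np.dot(v, hand_direction) / max(d, 1e-9)
        angles.append(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
        distances.append(d)
        heights.append(tip[1] - palm_center[1])     # height relative to the palm (assumed y-up)

    # Concatenate the three feature groups into a single Leap Motion feature vector.
    return np.concatenate([angles, distances, heights])
```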



Abstract

The invention discloses a hand gesture identification method based on Leap Motion and Kinect. The method obtains fingertip 3D positions, the palm-center position, and hand direction information via a Leap Motion sensor, and from this information calculates three types of feature information: finger angles, distances from the fingertips to the palm center, and heights of the fingertips. At the same time, a Kinect sensor obtains depth and color information of the scene, the hand region is extracted, and three further types of gesture feature information are obtained: circularity, fill rate, and perimeter ratio. The gesture feature information obtained by the Leap Motion and Kinect sensors is fused, multiple samples are collected for each type of gesture to be identified to form training sample sets, and SVM classifiers are trained with these sample sets. Finally, the gestures to be identified are input to the trained SVM classifiers, thereby identifying the hand gestures.
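As a rough, non-authoritative illustration of the pipeline the abstract describes, the sketch below computes the three Kinect-side shape features using common textbook definitions (the patent's exact formulas are not given on this page, so circularity, fill rate, and perimeter ratio are assumed here to be the isoperimetric ratio, the contour-to-convex-hull area ratio, and the hull-to-contour perimeter ratio, respectively), fuses them with the Leap Motion feature vector by concatenation, and trains an SVM. OpenCV and scikit-learn stand in for whatever implementations the authors actually used, and the RBF kernel is an assumption.

```python
import numpy as np
import cv2
from sklearn.svm import SVC

def kinect_shape_features(hand_mask):
    """Illustrative shape descriptors from a binary hand mask segmented out of
    Kinect depth/color data. The exact definitions used in the patent are not
    reproduced here; these are common formulations."""
    contours, _ = cv2.findContours(hand_mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea)        # largest blob = hand region

    area = cv2.contourArea(contour)
    perimeter = cv2.arcLength(contour, True)

    # Circularity: 1.0 for a perfect circle, smaller for elongated/spread shapes.
    circularity = 4.0 * np.pi * area / max(perimeter ** 2, 1e-9)

    hull = cv2.convexHull(contour)
    fill_rate = area / max(cv2.contourArea(hull), 1e-9)               # contour area / hull area (assumed)
    perimeter_ratio = cv2.arcLength(hull, True) / max(perimeter, 1e-9)  # hull perimeter / contour perimeter (assumed)

    return np.array([circularity, fill_rate, perimeter_ratio])

def fuse_features(leap_vec, kinect_vec):
    """Simple feature-level fusion: concatenate the two descriptors."""
    return np.concatenate([leap_vec, kinect_vec])

def train_gesture_classifier(fused_samples, labels):
    """Fit an SVM on the fused training samples (several per gesture class)."""
    clf = SVC(kernel="rbf")          # kernel choice is an assumption
    clf.fit(np.vstack(fused_samples), labels)
    return clf

def recognise(clf, leap_vec, kinect_vec):
    """Fuse the features of a new gesture and predict its class label."""
    return clf.predict(fuse_features(leap_vec, kinect_vec).reshape(1, -1))[0]
```

In use, one would collect several fused samples per gesture class to build the training set, call train_gesture_classifier once, and then pass newly fused feature vectors to recognise.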

Description

Technical field

[0001] The invention belongs to the technical field of human-computer interaction, and more specifically relates to a gesture recognition method based on Leap Motion and Kinect.

Background technique

[0002] The mouse and keyboard, the classic human-computer interaction devices, have been used for decades. The computer interaction interface has developed from a black-and-white screen into a variety of more user-friendly color interfaces, and the functions of the computer have grown from simple early calculations into today's wide range of applications. With the continuous improvement of computer performance and the continuous updating of applications, the mouse-and-keyboard interaction mode has begun to limit the user experience, and people need a freer and more convenient way of interacting. Gesture-based human-computer interaction can meet this requirement well, so research on gesture recognition beco...

Claims


Application Information

Patent Timeline
IPC (8): G06F3/01, G06K9/00, G06K9/62
CPC: G06F3/017, G06V40/113, G06F18/2411
Inventor: 刘珊, 郑文锋, 曾庆川, 杨波, 李晓璐, 曹婷婷
Owner: UNIV OF ELECTRONIC SCI & TECH OF CHINA