
Gesture Recognition Method Based on Leap Motion and Kinect

A gesture recognition technology applied in the field of human-computer interaction, which solves the problem of insufficient gesture recognition accuracy and achieves low cost, small size, and improved accuracy.

Publication Date: 2019-03-01 (Inactive)
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

This method overcomes the insufficient accuracy of gesture recognition that relies on only one of the two sensors, making gesture recognition more accurate.



Examples


Embodiment

[0045] For convenience of description, the relevant technical terms appearing in the detailed implementation are explained first:

[0046] Figure 1 is a flow chart of the gesture recognition method based on Leap Motion and Kinect of the present invention.

[0047] In this example, the hardware is first connected as shown in Figure 3: the two front-end devices, the Leap Motion and the Kinect, are each connected directly to the PC by a data cable.

[0048] After the hardware connection is complete, the gesture recognition method based on Leap Motion and Kinect of the present invention is described in detail with reference to Figure 1. It specifically comprises the following steps:

[0049] (1) Use the Leap Motion sensor to obtain the relevant point coordinates of the hand and the gesture feature information of the hand;

[0050] (1.1) The Leap Motion sensor establishes a spatial coordinate system whose origin is the center of the sensor, ...
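The step listing is truncated here, but the three Leap Motion features named in the abstract (finger angle, distance from fingertip to palm center, fingertip height) can be illustrated with a minimal Python sketch. Their exact definitions are not visible in this excerpt, so the geometric choices below (angle measured in the palm plane relative to the hand direction, signed height above the palm plane) are assumptions, and leap_features is a hypothetical helper rather than the patent's own formulation.

import numpy as np

def leap_features(fingertips, palm_center, hand_direction, palm_normal):
    # fingertips     : (5, 3) array of fingertip 3D positions
    # palm_center    : (3,)   palm center position
    # hand_direction : (3,)   unit vector pointing from the palm toward the fingers
    # palm_normal    : (3,)   unit vector pointing out of the palm
    v = fingertips - palm_center                       # palm-to-fingertip vectors
    distance = np.linalg.norm(v, axis=1)               # fingertip-to-palm-center distances

    # Signed height of each fingertip above the palm plane (projection on the normal).
    height = v @ palm_normal

    # Project the palm-to-fingertip vectors onto the palm plane and measure the
    # angle between each projection and the hand direction.
    v_proj = v - np.outer(height, palm_normal)
    v_proj /= np.linalg.norm(v_proj, axis=1, keepdims=True) + 1e-9
    cos_a = np.clip(v_proj @ hand_direction, -1.0, 1.0)
    angle = np.degrees(np.arccos(cos_a))

    return np.concatenate([angle, distance, height])   # 15-dimensional feature vector

# Example with synthetic values (millimetres, Leap Motion-style coordinates).
tips = np.array([[-40., 210., -10.],
                 [-15., 250., -20.],
                 [  0., 260., -20.],
                 [ 15., 250., -20.],
                 [ 35., 230., -15.]])
features = leap_features(tips,
                         palm_center=np.array([0., 180., 0.]),
                         hand_direction=np.array([0., 1., 0.]),
                         palm_normal=np.array([0., 0., -1.]))
print(features)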



Abstract

The invention discloses a gesture recognition method based on Leap Motion and Kinect. The 3D positions of the fingertips, the position of the palm center, and the direction of the hand are obtained through the Leap Motion sensor, from which three kinds of feature information are computed: the finger angles, the distances from the fingertips to the palm center, and the heights of the fingertips. At the same time, the Kinect sensor is used to obtain the depth information and color information of the scene, the hand region is extracted, and three further gesture features are computed: circularity, filling rate, and perimeter ratio. The gesture feature information obtained by the Leap Motion and Kinect sensors is fused; several samples are collected for each gesture to be recognized to form a training sample set, and these samples are used to train an SVM classifier. Finally, a gesture to be recognized is input into the trained SVM classifier, which then recognizes the gesture.
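The Kinect-side shape features and the feature-level fusion with the SVM classifier can likewise be sketched. The patent names circularity, filling rate, and perimeter ratio, but their precise definitions are not given in this excerpt; the formulas below (circularity from area and perimeter, filling rate as region area over convex-hull area, perimeter ratio as contour perimeter over hull perimeter) are plausible assumptions, and the hand segmentation from the Kinect depth and color streams is taken as already done. kinect_region_features and train_classifier are hypothetical helpers built on OpenCV and scikit-learn.

import cv2
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def kinect_region_features(hand_mask):
    # hand_mask: binary uint8 image in which the segmented hand region is non-zero,
    # e.g. obtained by thresholding the Kinect depth map around the hand and
    # refining the result with the color image.
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea)          # largest blob is taken as the hand
    area = cv2.contourArea(contour)
    perimeter = cv2.arcLength(contour, True)
    hull = cv2.convexHull(contour)

    circularity = 4.0 * np.pi * area / (perimeter ** 2)        # 1.0 for a perfect circle
    fill_rate = area / cv2.contourArea(hull)                   # region area / hull area
    perimeter_ratio = perimeter / cv2.arcLength(hull, True)    # contour perimeter / hull perimeter
    return np.array([circularity, fill_rate, perimeter_ratio])

def train_classifier(X_leap, X_kinect, y):
    # Feature-level fusion: concatenate the Leap Motion and Kinect feature vectors
    # of each training sample, then train an SVM on the fused vectors.
    X = np.hstack([X_leap, X_kinect])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, y)
    return clf

A gesture to be recognized would then be classified by passing its fused feature vector (here 15 Leap Motion values plus 3 Kinect values) to clf.predict. The RBF kernel and the standardization step are implementation choices for this sketch, not details stated in the patent.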

Description

Technical field

[0001] The invention belongs to the technical field of human-computer interaction, and more specifically relates to a gesture recognition method based on Leap Motion and Kinect.

Background technique

[0002] The mouse and keyboard, the classic human-computer interaction devices, have been in use for decades. The computer interaction interface has developed from a black-and-white screen into a variety of more user-friendly color interfaces, and the functions of the computer have grown from early simple calculation into today's wide range of applications. With the continuous improvement of computer performance and the continuous updating of applications, the mouse-and-keyboard interaction mode has begun to limit the user experience, and people need a freer and more convenient way to interact. Gesture-based human-computer interaction meets this requirement well, so research on gesture recognition beco...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/01, G06K9/00, G06K9/62
CPC: G06F3/017, G06V40/113, G06F18/2411
Inventors: 刘珊, 郑文锋, 曾庆川, 杨波, 李晓璐, 曹婷婷
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA