Gesture action recognition method based on Kinect

A gesture action recognition technology applied in the field of virtual reality and human-computer interaction, addressing the problem that existing methods do not account for differences in gesture speed and distance.

Active Publication Date: 2020-03-20
XIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0007] Existing 3D gesture recognition methods recognize gestures based on features such as appearance contours, morphological topology, and internal skeletons. At the same...

Examples

Embodiment Construction

[0083] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0084] The technical scheme adopted by the present invention is a Kinect-based gesture action recognition method, implemented according to the following steps:

[0085] Step 1. Use the gesture main trend to characterize the gesture orientation and gesture posture, so as to measure the difference in gesture orientation and posture between adjacent frames; use the distance between the gesture center points of adjacent frames to measure the motion speed of the gesture, and thereby extract the key frames of an independent gesture sequence (a minimal sketch of this step appears after the sub-steps below); specifically:

[0086] Step 1.1: Take the wrist joint point as the initial seed coordinate and recursively traverse its neighboring pixels to extract the gesture region, then convert it into gesture point cloud data; specifically:

[0087] Step 1.1.1, obtain the human w...
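The remaining sub-steps are truncated in this extract. Purely as an illustrative sketch of the kind of processing Steps 1 and 1.1 describe, and not the patent's exact procedure, the following Python code grows the hand region from the wrist-joint seed over a Kinect depth frame, back-projects it to a gesture point cloud, and keeps as key frames those frames whose gesture-center displacement between adjacent frames is large enough. The function names, camera intrinsics and thresholds are hypothetical assumptions; the gesture main trend criterion for orientation and posture is not covered here.

```python
# Illustrative sketch only: seed-based gesture region growing, point cloud
# conversion, and center-distance key frame selection. Assumes `depth` is a
# Kinect depth frame in meters and `wrist_px = (row, col)` comes from the
# Kinect skeleton; intrinsics fx, fy, cx, cy and thresholds are hypothetical.
from collections import deque
import numpy as np

def extract_gesture_region(depth, wrist_px, depth_tol=0.05):
    """Grow the hand region from the wrist seed (iterative flood fill,
    equivalent to the recursive neighborhood traversal in Step 1.1)."""
    h, w = depth.shape
    seen = np.zeros((h, w), dtype=bool)
    queue = deque([wrist_px])
    seen[wrist_px] = True
    region = []
    while queue:
        r, c = queue.popleft()
        region.append((r, c))
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and not seen[nr, nc]
                    and depth[nr, nc] > 0
                    and abs(depth[nr, nc] - depth[r, c]) < depth_tol):
                seen[nr, nc] = True
                queue.append((nr, nc))
    return region

def to_point_cloud(depth, region, fx, fy, cx, cy):
    """Back-project region pixels to 3D camera coordinates (pinhole model)."""
    return np.array([((c - cx) * depth[r, c] / fx,
                      (r - cy) * depth[r, c] / fy,
                      depth[r, c]) for r, c in region])

def select_key_frames(center_points, speed_thresh=0.02):
    """Treat the center displacement between adjacent frames as the motion
    speed and keep only the frames where the gesture is actually moving."""
    keys = [0]
    for i in range(1, len(center_points)):
        if np.linalg.norm(center_points[i] - center_points[i - 1]) > speed_thresh:
            keys.append(i)
    return keys

# Per frame: region = extract_gesture_region(depth, wrist_px)
#            cloud  = to_point_cloud(depth, region, fx, fy, cx, cy)
#            center = cloud.mean(axis=0)
# select_key_frames() is then applied to the sequence of per-frame centers.
```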

Abstract

The invention discloses a gesture action recognition method based on Kinect. The method comprises the steps of: extracting the key frames of independent gesture sequences; extracting the key frames of interactive gesture sequences; measuring the similarity between gesture motion path sequences with the DTW algorithm, so as to recognize gesture motion paths in different spatial directions; and, according to the gesture motion path, the independent gesture sequence key frames and the interactive gesture sequence key frames, identifying the gesture starting posture, the gesture sequence key frames and the gesture ending posture, thereby completing gesture action recognition based on the motion path. The method recognizes gesture actions on the basis of gesture motion path tracking, and provides a strategy for recognizing a gesture action from its starting posture, motion path and ending posture.
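As a point of reference only, the following is a minimal, generic DTW sketch of the kind of path-sequence similarity the abstract refers to, not the patent's specific formulation: each motion path is treated as a sequence of 3D gesture center points, and the DTW cost aligns the two sequences under the Euclidean frame-to-frame distance. The template set in the usage comment is hypothetical.

```python
# Generic DTW distance between two gesture motion paths, each an (N, 3)
# array of gesture center points; smaller values mean more similar paths.
import numpy as np

def dtw_distance(path_a: np.ndarray, path_b: np.ndarray) -> float:
    n, m = len(path_a), len(path_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(path_a[i - 1] - path_b[j - 1])  # frame distance
            cost[i, j] = d + min(cost[i - 1, j],        # skip a frame in path_a
                                 cost[i, j - 1],        # skip a frame in path_b
                                 cost[i - 1, j - 1])    # match the two frames
    return float(cost[n, m])

# Usage sketch: classify an observed path by its nearest template path.
# templates = {"swipe_left": ..., "push_forward": ...}   # hypothetical templates
# label = min(templates, key=lambda k: dtw_distance(observed, templates[k]))
```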

Description

Technical field

[0001] The invention belongs to the technical field of virtual reality and human-computer interaction, and relates to a gesture recognition method based on Kinect.

Background technique

[0002] Gesture recognition technology is one of the key research topics in natural human-computer interaction. Gestures, as a natural means of human-computer interaction, can improve interoperability in virtual scenes and bring a more realistic and natural immersive experience, thus making the completion of complex interactive tasks possible. Gesture recognition technology is widely applied, for example in assisted driving for road safety and in sign language recognition for communication by deaf-mute people. In short, gesture recognition technology has a wide range of applications in education, medical care, drones and other fields.

[0003] At present, gesture recognition technology is mainly divided into two-dimensional and three-dimensional approaches. Two-dimensional gesture rec...

Application Information

IPC(8): G06K9/00
CPC: G06V40/28; Y02D10/00
Inventors: 王映辉 (Wang Yinghui), 赵艳妮 (Zhao Yanni), 宁小娟 (Ning Xiaojuan), 王东 (Wang Dong)
Owner: XIAN UNIV OF TECH