Single-viewpoint gesture and posture recognition method based on Kinect

A gesture and posture recognition, single-viewpoint technology, applied in the fields of character and pattern recognition, instruments, computer parts, etc., which addresses the problems of over-smoothing that harms feature extraction and of denoising that does not take the characteristics of three-dimensional data into account

Active Publication Date: 2020-03-20
XIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0008] At present, most existing 3D gesture recognition methods simply apply existing denoising algorithms to the 3D data without considering the characteristics of the 3D data, so either excessive denoising leads to smoothing of the gesture surface features...

Examples

Embodiment Construction

[0094] The present invention will be described in detail below with reference to the drawings and specific embodiments.

[0095] The Kinect-based single-viewpoint gesture and posture recognition method of the present invention is specifically implemented according to the following steps:

[0096] Step 1. Taking the wrist joint point as the initial seed coordinates, recursively traverse the neighborhood pixels of the wrist joint point to extract the gesture area;

[0097] Step 1 is specifically as follows:

[0098] Step 1.1, obtain the human wrist joint point coordinate P from the human skeleton information tracked in real time by Microsoft Kinect, as shown in Figure 1;

[0099] Step 1.2, take the wrist joint point P as the initial seed pixel, and compute the depth difference dif_i, i∈[0,7], between P and each pixel P_i, i∈[0,7], in the eight-neighborhood of P; when dif_i is less than the depth threshold T_depth, the pixel P_i belongs to the gesture area and is added to the gesture region G, ...
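As a rough illustration of Step 1, the Python sketch below grows the gesture region G from the wrist seed pixel by comparing eight-neighborhood depth values against a threshold. It assumes the depth frame is already available as a 2-D NumPy array and that the wrist joint has been projected to pixel coordinates; names such as depth_map, wrist_px and T_depth are illustrative, not taken from the patent, and an iterative queue is used in place of the recursion described above.

```python
import numpy as np
from collections import deque

def extract_gesture_region(depth_map: np.ndarray, wrist_px: tuple, T_depth: float) -> set:
    """Grow the gesture region G from the wrist seed pixel (illustrative sketch).

    A pixel P_i in the eight-neighborhood of the current pixel joins G when the
    depth difference dif_i = |depth(P) - depth(P_i)| is below the threshold T_depth.
    """
    h, w = depth_map.shape
    region = {wrist_px}                   # gesture region G
    queue = deque([wrist_px])             # pixels whose neighbors still need checking
    # Offsets of the eight neighbors P_i, i in [0, 7]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    while queue:
        r, c = queue.popleft()
        for dr, dc in offsets:
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in region:
                dif_i = abs(float(depth_map[r, c]) - float(depth_map[nr, nc]))
                if dif_i < T_depth:       # dif_i < T_depth => pixel belongs to G
                    region.add((nr, nc))
                    queue.append((nr, nc))
    return region
```

The breadth-first queue visits the same pixels a recursive traversal would, while avoiding deep recursion on large connected regions.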

Abstract

The invention discloses a single-viewpoint gesture and posture recognition method based on Kinect. The method specifically comprises the following steps: taking a wrist joint point as an initial seed coordinate; performing recursive traversal on the neighborhood pixels of the wrist joint point to extract a gesture area; performing denoising processing by means of median filtering and mask image methods; performing arm area elimination; performing curvature- and flexibility-based 3D-SIFT feature point extraction to represent the concave-convex features of the gesture posture and measure the degree of gesture posture distortion; constructing a point feature descriptor from the main trend direction of the three-dimensional gesture point cloud data, the center point of the gesture point cloud, and the 3D-SIFT feature points; and finally calculating the point feature descriptors of the source gesture point cloud and the target gesture point cloud for matching recognition and optimization. The single-viewpoint gesture and posture recognition method based on Kinect solves the problem of single-viewpoint gesture and posture recognition in the gesture and action recognition process.
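As a hedged illustration of two ingredients named in the abstract, the sketch below computes a point cloud's center point and a candidate "main trend direction". Interpreting the main trend direction as the dominant PCA/SVD axis of the cloud is an assumption of this sketch, not a detail stated in the patent, and the full 3D-SIFT-based descriptor construction and matching are not reproduced here.

```python
import numpy as np

def main_trend_direction(points: np.ndarray) -> np.ndarray:
    """Unit vector of the dominant direction of an (N, 3) point cloud (assumed PCA reading)."""
    centered = points - points.mean(axis=0)
    # First right-singular vector = eigenvector of the covariance with the largest eigenvalue
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def cloud_center(points: np.ndarray) -> np.ndarray:
    """Center point of the point cloud (centroid)."""
    return points.mean(axis=0)

# Toy usage: compare a source and a target cloud by the agreement of their main
# trend directions and the distance between their centers (synthetic data only).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    source = rng.normal(size=(500, 3)) * np.array([5.0, 1.0, 1.0])  # elongated along x
    target = source + np.array([0.1, 0.0, 0.0])                     # slightly shifted copy
    cos_angle = abs(np.dot(main_trend_direction(source), main_trend_direction(target)))
    center_dist = np.linalg.norm(cloud_center(source) - cloud_center(target))
    print(f"direction agreement: {cos_angle:.3f}, center distance: {center_dist:.3f}")
```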

Description

Technical field

[0001] The invention belongs to the technical field of gesture and posture recognition methods, at the intersection of computer graphics and virtual reality, and relates to a Kinect-based single-viewpoint gesture and posture recognition method.

Background technique

[0002] Gesture recognition technology is one of the key research topics in natural human-computer interaction. Gestures, as a natural means of human-computer interaction, can improve interoperability in virtual scenes and bring a more realistic and natural immersive experience, thereby making it possible to complete complex interactive tasks. Gesture recognition technology is widely applied, for example in assisted driving for safe driving and in sign language recognition for communication with the deaf and mute; in short, it has a wide range of applications in education, medicine, drones and other fields.

[0003] At present, gesture recognition technology is mainly...

Application Information

IPC(8): G06K9/00; G06K9/46; G06K9/62
CPC: G06V40/28; G06V10/462; G06F18/22; G06F18/2135
Inventor: 王映辉 (Wang Yinghui), 赵艳妮 (Zhao Yanni), 宁小娟 (Ning Xiaojuan), 王东 (Wang Dong)
Owner: XIAN UNIV OF TECH