
RGBD depth sensor-based real-time gesture analysis and evaluation method and system

A technology based on a depth sensor and an evaluation method, applied in the field of image processing

Active Publication Date: 2017-12-12
CHANGAN UNIV

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a real-time gesture analysis and evaluation system based on an RGBD depth sensor. The system solves the problem of detecting palms and arms under complex background and lighting conditions, can analyze and evaluate the various gestures of train drivers, traffic police and other staff, and has broad application prospects.

Examples


Embodiment 1

[0077] A real-time palm gesture analysis and evaluation method based on an RGBD depth sensor, characterized in that it comprises the following steps:

[0078] Step 1: use the RGBD sensor to acquire T frames of initial images within a period of time, where each of the T frames of initial images includes a palm node, a wrist node and an elbow node, and determine the coordinates of the palm node in each of the T frames of initial images;

[0079] This step includes:

[0080] Step 11: choose one initial image from the T frames of initial images as the current-frame initial image, and obtain the coordinates of the palm node P in the current-frame initial image from the initial palm node

[0081] where M represents the number of white pixels in the circular search area, M is a natural number greater than or equal to 1, x_i represents the abscissa of the i-th pixel, y_i represents the ordinate of the i-th pixel, and z_i indicates the distance between the i-th pixel point and the R...
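
The defining formula is not reproduced in this excerpt, but the symbols above (M white pixels with abscissa x_i, ordinate y_i and sensor distance z_i) suggest a centroid-style estimate of the palm node. The sketch below is a minimal illustration under that assumption; the plain centroid, the function name and the array-based inputs are not taken from the patent.

```python
import numpy as np

def palm_node_centroid(mask, depth):
    """Estimate the palm node from the M white (palm) pixels inside the
    circular search region.

    mask  : 2-D binary array, non-zero (white) where a pixel belongs to the palm
    depth : 2-D array of the same shape, distance z_i of each pixel to the RGBD sensor

    Returns (x, y, z): mean abscissa, mean ordinate and mean sensor distance of
    the M white pixels. The plain centroid is an assumption, not the patent's
    exact formula (which is not shown in the excerpt above).
    """
    ys, xs = np.nonzero(mask)          # coordinates of the M white pixels
    if xs.size == 0:                   # M >= 1 is required
        raise ValueError("no palm pixels found in the search region")
    zs = depth[ys, xs]                 # z_i for each white pixel
    return float(xs.mean()), float(ys.mean()), float(zs.mean())
```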

Embodiment 2

[0099] A real-time arm gesture analysis and evaluation method based on an RGBD depth sensor, as shown in Figure 3, includes the following steps:

[0100] Step 1: use the RGBD sensor to acquire T frames of initial images within a period of time, select one initial image from the T frames of initial images as the t-th frame initial image, and extract the arm skeleton node motion sequence of the t-th frame initial image;

[0101] This step includes:

[0102] The t-th frame initial image includes the initial palm node P_1^t, wrist node P_2^t, elbow node P_3^t, shoulder node P_4^t and shoulder-center node P_s^t. The distance D_sn^t from node P_n^t to the shoulder-center node P_s^t is obtained through formula (3):

[0103] D_sn^t = sqrt((x_n^t − x_s^t)^2 + (y_n^t − y_s^t)^2 + (z_n^t − z_s^t)^2)    (3)

[0104] In formula (3), n = 1, 2, 3, 4 and t = 1, 2, ..., T; D_sn^t indicates the distance from node P_n^t to the shoulder-center node P_s^t in the t-th frame initial image, T is the total number of frames of initial images, and x_n^t, y_n^t, z_n^t respectively represent the coordinat...
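
As a concrete illustration of formula (3), the sketch below computes D_sn^t for the four tracked nodes of a single frame. The (4, 3) array layout and the function name are assumptions for illustration; only the Euclidean-distance computation itself follows the formula as reconstructed above.

```python
import numpy as np

def distances_to_shoulder_center(nodes_t, shoulder_center_t):
    """Formula (3): Euclidean distance D_sn^t from each node P_n^t to the
    shoulder-center node P_s^t in frame t.

    nodes_t           : (4, 3) array of (x, y, z) for P_1^t..P_4^t
                        (palm, wrist, elbow, shoulder)
    shoulder_center_t : (3,) array of (x, y, z) for P_s^t

    Returns a length-4 array [D_s1^t, D_s2^t, D_s3^t, D_s4^t].
    """
    diff = np.asarray(nodes_t, dtype=float) - np.asarray(shoulder_center_t, dtype=float)
    return np.sqrt((diff ** 2).sum(axis=1))   # per-node Euclidean norm
```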

Embodiment 3

[0113] On the basis of Embodiments 1 and 2, this embodiment provides a real-time gesture analysis and evaluation method based on an RGBD depth sensor, which includes both the real-time palm gesture analysis and evaluation method of Embodiment 1 and the real-time arm gesture analysis and evaluation method of Embodiment 2. This embodiment can recognize the driver's palm gesture and dynamic arm gesture at the same time, and can evaluate how standard the driver's gestures are from the output of the recognition algorithm, giving scores for the palm gesture and the dynamic arm gesture. It can not only monitor the driver's gestures in real time to ensure the safety of the train, but also avoid manual monitoring of the train driver's gestures and reduce the consumption of human resources.
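
This excerpt does not state how the palm gesture score and the arm gesture score are combined into the final evaluation, so the fragment below is only a hypothetical illustration of such a normative scoring step; the equal weighting and the 0.8 pass threshold are arbitrary placeholder values.

```python
def evaluate_driver_gesture(palm_score, arm_score, palm_weight=0.5, pass_threshold=0.8):
    """Combine the static palm gesture score and the dynamic arm gesture score
    into one overall evaluation. The weighting and threshold are placeholder
    values, not taken from the patent."""
    overall = palm_weight * palm_score + (1.0 - palm_weight) * arm_score
    return {
        "palm_score": palm_score,
        "arm_score": arm_score,
        "overall": overall,
        "standard": overall >= pass_threshold,  # flag non-standard gestures for real-time alerting
    }
```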


Abstract

The invention discloses an RGBD depth sensor-based real-time gesture analysis and evaluation method and system. The real-time gesture analysis and evaluation system includes a static gesture recognition and evaluation system for the palm of a train driver and a dynamic gesture recognition and evaluation system for the arm of the train driver. The static palm gesture recognition and evaluation system includes a palm-centre position determination module, a palm-region image extraction module, a denoising module and a gesture recognition and evaluation module. The dynamic arm gesture recognition and evaluation system includes an arm skeleton node motion sequence extraction module, a dynamic gesture optimal-matching module and a dynamic arm gesture evaluation module. The real-time gesture analysis and evaluation system is highly robust to environmental backgrounds and illumination; when a palm gesture is detected it adopts palm-centre-node-based gesture pixel searching, which improves palm gesture detection. It supervises the driver's gestures in real time to ensure the safety of train driving, and can also avoid manual monitoring of the train driver's gestures, reducing human resource consumption.
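
The abstract enumerates the modules of the two sub-systems; the skeleton below simply mirrors that module breakdown in code for orientation. All class and method names are illustrative placeholders, not the patent's own API.

```python
class PalmGestureBranch:
    """Static palm gesture recognition and evaluation: palm-centre position
    determination, palm-region image extraction, denoising, then recognition
    and evaluation (per the abstract's module list)."""
    def evaluate(self, rgb_frame, depth_frame):
        raise NotImplementedError

class ArmGestureBranch:
    """Dynamic arm gesture recognition and evaluation: arm skeleton-node motion
    sequence extraction, optimal matching of the dynamic gesture, then
    evaluation (per the abstract's module list)."""
    def evaluate(self, frame_sequence):
        raise NotImplementedError
```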

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a real-time gesture analysis and evaluation method and system based on an RGBD depth sensor.

Background technique

[0002] As one of the key technologies of future human-computer interaction systems, gesture recognition technology not only has important research value but also has broad application prospects. At present, traditional gesture recognition methods usually detect and recognize gestures on an input two-dimensional image. However, this type of method is sensitive to the input image: gesture detection and recognition work well when the background is simple and the influence of ambient light is small, but when the background is complex and the illumination changes greatly, performance drops sharply and the range of application is limited. In recent years, in order to overcome the short...

Claims


Application Information

IPC(8): G06K9/00
CPC: G06V40/20
Inventor: 梁华刚, 易生, 孙凯, 李怀德
Owner: CHANGAN UNIV