
Method for processing space hand signal gesture command based on depth camera

A depth-camera and spatial-gesture technology, applied in image data processing, instruments, and computing, addressing the problems of heavy computational load, complex device structure, and the inability to acquire data accurately and quickly.

Inactive Publication Date: 2013-03-20
GUILIN UNIV OF ELECTRONIC TECH
View PDF | 3 Cites | 113 Cited by

AI Technical Summary

Problems solved by technology

In the static gesture detection part of the above literature, the initial dynamic tracking area is determined by detecting a specific region in the frame difference between adjacent two-dimensional images, and static hand-shape detection is performed in combination with a hand skin-color model, which is unreliable in complex environments. In the extraction of dynamic gesture features, the features are extracted by applying Freeman eight-direction chain coding to the gesture trajectory on a two-dimensional plane; this requires a large amount of computation, and the data cannot be acquired accurately and quickly.
At the same time, the above-mentioned literature captures two-dimensional information around the display space through a horizontally placed two-dimensional camera; the device structure is complex and the efficiency is low.

Method used



Examples


Embodiment Construction

[0031] The present invention, as shown in Figures 1-2, includes the following steps:

[0032] The first step is to obtain real-time images through the depth camera, and the images include depth images and RGB color images;

[0033] The depth camera is a camera based on the structured-light encoding principle and capable of collecting RGB images and depth images. The depth image contains the two-dimensional XY coordinate information of the scene, together with a per-pixel depth value reflecting each point's distance from the camera. The depth value is obtained by the IR camera from the reflected infrared ranging light and is expressed as a gray value in the depth image; the greater the depth value, the farther the corresponding point in the actual scene is from the camera plane, and conversely, the closer a point is to the camera, the smaller its depth value.
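The depth-to-gray convention described above can be sketched as a simple linear mapping. This is an illustrative assumption only: the patent does not give the mapping formula or the working range, so the near/far limits below are hypothetical values typical of structured-light sensors.

```python
# Hypothetical sketch: mapping raw depth readings (millimetres) to the
# 8-bit gray values of a depth image, following the convention above
# (closer point -> smaller depth value -> darker pixel). The range
# limits are illustrative assumptions, not taken from the patent.

DEPTH_MIN_MM = 500    # assumed nearest reliable structured-light range
DEPTH_MAX_MM = 4000   # assumed farthest reliable range

def depth_to_gray(depth_mm: int) -> int:
    """Linearly map a depth reading to an 8-bit gray value (0..255)."""
    clamped = max(DEPTH_MIN_MM, min(DEPTH_MAX_MM, depth_mm))
    scale = (clamped - DEPTH_MIN_MM) / (DEPTH_MAX_MM - DEPTH_MIN_MM)
    return round(scale * 255)

# A nearer point gets a smaller gray value than a farther one:
near, far = depth_to_gray(600), depth_to_gray(3500)
```

Under this mapping, the nearest in-range point renders black (0) and the farthest renders white (255), matching the "greater depth value, farther distance" relation in the text.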

[0034] When using the depth camera to capture images, the frame rate is set to 30FPS, and the size of the images ca...
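The 30 FPS setting above implies a fixed frame period of about 33.3 ms. A minimal sketch of pacing such a capture loop follows; the camera read is stubbed out, since the patent does not name a specific SDK, and a real implementation would call the depth-camera API in place of `read_frame_stub()`.

```python
# Illustrative sketch of pacing a capture loop at the 30 FPS mentioned
# above. read_frame_stub() is a hypothetical stand-in for the actual
# depth-camera API call, which the patent does not specify.
import time

TARGET_FPS = 30
FRAME_PERIOD = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def read_frame_stub():
    """Stand-in for grabbing one (depth, rgb) frame pair."""
    return ("depth-frame", "rgb-frame")

def capture(n_frames):
    frames = []
    next_due = time.monotonic()
    for _ in range(n_frames):
        frames.append(read_frame_stub())
        next_due += FRAME_PERIOD
        delay = next_due - time.monotonic()
        if delay > 0:
            time.sleep(delay)  # keep the loop at the target frame rate
    return frames
```

Scheduling against an absolute deadline (`next_due`) rather than sleeping a fixed interval prevents per-frame processing time from accumulating into drift.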


Abstract

The invention discloses a method for processing spatial hand-signal gesture commands based on a depth camera. The method can recognize spatial hand-signal gesture command information quickly and accurately, greatly improves working efficiency and precision, and offers high robustness, strong practical applicability, and good anti-interference capability in complex and changeable environments. The method comprises the steps of acquiring a real-time image with the depth camera; obtaining hand-signal point-cloud data through three-dimensional point-cloud computation; performing plane registration of the point-cloud information and extracting contour feature-point information; reconstructing the hand-signal gesture; recognizing the gesture, recognizing its corresponding movement track, and defining the operation content of the track; and finally smoothing the dynamic-gesture mouse output points according to the protocol for Table-Top Tangible User Interfaces (TUIO). The method has the advantages of acquiring target information quickly, comprehensively, and accurately, establishing a spatial motion-detection area, extracting information at different depths, achieving multi-touch, and improving overall operating performance.
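The final step in the abstract smooths the dynamic-gesture mouse output points before TUIO output. The patent does not specify the filter at this point, so the sketch below uses a simple exponential moving average as an illustrative stand-in for whatever smoothing the method actually employs.

```python
# Hedged sketch of the output-point smoothing step. The exponential
# moving average here is an assumption for illustration; the patent
# does not name the smoothing filter in this summary.

def smooth_points(points, alpha=0.4):
    """Exponentially smooth a sequence of (x, y) cursor points.

    alpha is the weight given to the newest sample; smaller values
    give heavier smoothing (an illustrative parameter, not from the
    patent).
    """
    if not points:
        return []
    sx, sy = points[0]
    out = [(sx, sy)]
    for x, y in points[1:]:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out
```

Smoothing of this kind trades a small amount of cursor latency for the jitter suppression needed to make a mid-air gesture usable as a mouse pointer.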

Description

Technical Field

[0001] The invention relates to the field of non-contact three-dimensional virtual space based on a depth camera, and in particular to a method for processing spatial hand-signal gesture commands based on a depth camera.

Background Technique

[0002] In recent years, with the rapid development and wide application of human-computer interaction, robotics, and virtual reality, the new technology of 3D interactive input has become a hot spot for many researchers in the field of human-computer virtual interaction. As this technology develops and deepens, public demand for it keeps rising, and non-contact, high-speed, real-time positioning and three-dimensional operation have become its development direction. However, dynamic gestures are usually used to implement a 3D mouse simulation device, assisted by a 3D spatial position sensor or a virtual 3D scene, and its operation methods are limit...

Claims


Application Information

IPC(8): G06T7/20
Inventor: 莫建文 (Mo Jianwen)
Owner: GUILIN UNIV OF ELECTRONIC TECH