A Fingertip Detection Method Based on Kinect Depth Information

A fingertip detection technology based on Kinect depth information, applied in the field of human-computer interaction, which addresses problems such as cumbersome steps, complex algorithms, and poor real-time performance, and achieves accurate fingertip acquisition.

Active Publication Date: 2020-11-13
CHANGCHUN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

Many scholars have begun to use Kinect for gesture recognition. However, most of the existing methods have cumbersome steps, complex algorithms, and poor real-time performance.

Method used



Examples


Embodiment Construction

[0028] The implementation of the present invention will be described in detail below in conjunction with the drawings and specific examples.

[0029] Step S11: the Kinect is placed on the desktop, with the palm perpendicular to the desktop, facing the Kinect, at a distance of about one meter, as shown in Figure 1. Pushing the palm forward and then retracting it triggers the gesture tracking function of the NITE function library.

[0030] Step S12: use the NITE function library to obtain the coordinates of the palm, and then estimate the approximate depth range of the hand from the depth of the palm point.

[0031] Step S13: use the depth range obtained in S12 to set the search area and the depth threshold, then apply a binary depth mask (an n×n matrix), multiplying its n rows and n columns element-wise with the palm area to separate the hand image from the background.
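As a rough illustration of steps S12 and S13, the following Python/NumPy sketch derives a depth window around the palm depth and applies it as a binary mask to the depth frame. The window half-width DEPTH_MARGIN_MM and the function name are illustrative assumptions, not values or code from the patent.

```python
import numpy as np

# Assumed half-width of the depth window around the palm (not specified in the patent).
DEPTH_MARGIN_MM = 120

def segment_hand(depth_mm: np.ndarray, palm_depth_mm: float) -> np.ndarray:
    """Return a depth frame in which only pixels near the palm depth survive.

    depth_mm      -- Kinect depth frame in millimetres, shape (H, W)
    palm_depth_mm -- depth of the palm point reported by the tracker (step S12)
    """
    # Step S12: approximate depth range of the hand from the palm depth.
    near = palm_depth_mm - DEPTH_MARGIN_MM
    far = palm_depth_mm + DEPTH_MARGIN_MM

    # Step S13: binary depth mask, multiplied element-wise with the frame
    # so that everything outside the hand's depth range becomes zero.
    mask = ((depth_mm >= near) & (depth_mm <= far)).astype(depth_mm.dtype)
    return depth_mm * mask
```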

[0032] Step S21, first collect the depth image and the color image of t...



Abstract

The invention relates to a fingertip detection method based on Kinect depth information. The Kinect device is connected to a computer through a cable. The method is characterized by the following specific steps. Step 1: extract the hand and obtain the palm coordinates. Step 2: fingertip positioning, including image preprocessing and hand contour extraction; apply joint bilateral filtering to the extracted hand area; use the Douglas-Peucker algorithm to approximate the specified point set, find the polygonal fitting curve of the contour, and draw the fitting curve of the hand. Step 3: apply the convexHull() function to the result of the above steps and analyze it to obtain the convex hull points of the hand. Step 4: calculate the curvature of the obtained convex hull points and, based on the difference between the curvature at the wrist and the curvature at the fingertips, set an appropriate threshold to remove the convex hull points at the wrist. The method can complete gesture recognition tasks accurately and in real time, improves the real-time performance and accuracy of Kinect gesture recognition, and improves the natural gesture interaction experience.
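The following OpenCV sketch illustrates steps 2 to 4 of the abstract on a binary hand mask: Douglas-Peucker polygonal approximation with approxPolyDP(), convex hull extraction with convexHull(), and removal of wrist points by a simple angle-based curvature test. The tolerance ratio, angle threshold, and neighbour offset are illustrative assumptions rather than values from the patent.

```python
import cv2
import numpy as np

def detect_fingertips(hand_mask: np.ndarray,
                      epsilon_ratio: float = 0.01,   # assumed Douglas-Peucker tolerance
                      max_angle_deg: float = 60.0,   # assumed fingertip curvature threshold
                      k: int = 2):                   # assumed neighbour offset on the polygon
    """Return candidate fingertip points from a binary hand mask (uint8, 0/255)."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return []
    contour = max(contours, key=cv2.contourArea)

    # Step 2: Douglas-Peucker polygonal approximation of the hand contour.
    epsilon = epsilon_ratio * cv2.arcLength(contour, True)
    poly = cv2.approxPolyDP(contour, epsilon, True)

    # Step 3: convex hull points of the approximated contour (as indices into poly).
    hull_idx = cv2.convexHull(poly, returnPoints=False).flatten()

    # Step 4: keep hull points with a sharp angle (fingertips) and discard
    # flatter hull points such as those at the wrist.
    pts = poly.reshape(-1, 2).astype(np.float32)
    n = len(pts)
    fingertips = []
    for i in hull_idx:
        p, a, b = pts[i], pts[(i - k) % n], pts[(i + k) % n]
        v1, v2 = a - p, b - p
        cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-6)
        angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
        if angle < max_angle_deg:
            fingertips.append(tuple(pts[i].astype(int)))
    return fingertips
```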

Description

Technical Field

[0001] The invention relates to a fingertip detection method based on depth information, belonging to the technical field of human-computer interaction.

Background Technique

[0002] With the development of human-computer interaction technology, natural interaction has become the development direction of human-computer interaction technology. In recent years, using human hands for natural and intuitive interaction without wearing auxiliary equipment has gradually become a research hotspot in this field.

[0003] The key to realizing natural human-hand interaction lies in the accurate recognition of gestures. At present, gesture recognition methods without wearing assistive devices mainly include methods based on color cameras and methods based on depth sensors. Gesture recognition based on color cameras is easily affected by complex lighting and background conditions; especially when the background is very complex, the recognition accuracy is extremely low....

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06F3/01G06K9/00
CPCG06F3/017G06F2203/012G06V40/107
Inventor 权巍张超韩成薛耀红李华胡汉平陈纯毅蒋振刚杨华民冯欣王蒙蒙
Owner CHANGCHUN UNIV OF SCI & TECH