Video fingertip positioning method based on Kinect

A fingertip-positioning method, applied to input/output for user/computer interaction, computer parts, graphic reading, etc. It addresses the problem that existing methods cannot handle a fingertip pointed directly at the camera.

Active Publication Date: 2013-07-31
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

In addition, existing methods cannot handle a fingertip pointed directly at the camera.
3D modeling methods can obtain accurate positionin...



Examples


Embodiment

[0064] Figure 1 shows a block diagram of the system structure of the present invention. After the user's handwriting video is acquired by the Kinect, the hand is segmented to separate the hand region of interest from the background. After segmentation, ellipse fitting is performed on the palm to obtain the palm point. The arm point is then obtained by a two-pass depth-thresholding segmentation method. The system then divides into two modules: a "multi-finger positioning" module and a "single-finger positioning" module. In the "multi-finger positioning" module, polygon fitting is performed on the hand contour to remove contour noise. The contour's convex hull is then computed, and the fitted polygon and convex hull together yield the convexity defects. Candidate fingertip points are obtained by computing the curvature at the convexity-defect vertices. The hand region is then partitioned, and fingertips misjudged in the previous step are screened out. In the "sing...
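The curvature test on the defect/polygon vertices described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the function names, the toy polygon, and the 60° sharpness threshold are all assumptions. A vertex whose interior angle is sharp is kept as a fingertip candidate; blunt vertices (valleys between fingers, wrist corners) are rejected.

```python
import numpy as np

def vertex_angle(p_prev, p, p_next):
    """Interior angle (degrees) at vertex p between the edges to its neighbours."""
    v1 = np.asarray(p_prev, float) - np.asarray(p, float)
    v2 = np.asarray(p_next, float) - np.asarray(p, float)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def fingertip_candidates(polygon, max_angle=60.0):
    """Indices of polygon vertices sharp enough to look like fingertips.

    `max_angle` is a hypothetical threshold for illustration; the patent
    screens candidates further by partitioning the hand region.
    """
    n = len(polygon)
    return [i for i in range(n)
            if vertex_angle(polygon[i - 1], polygon[i],
                            polygon[(i + 1) % n]) < max_angle]

# Toy polygon: a square base with one spike at index 3 playing the fingertip.
print(fingertip_candidates([(0, 0), (4, 0), (4, 4), (2, 8), (0, 4)]))  # [3]
```

Only the spike vertex passes the angle test; the square's corners (90° and wider) are rejected, mirroring how curvature filtering discards non-fingertip convexity-defect vertices.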



Abstract

The invention provides a video fingertip positioning method based on Kinect, which comprises the following steps: (1) collecting video information; (2) analyzing, processing and segmenting the video information to obtain the shape of the user's hand; (3) carrying out ellipse fitting of the user's palm shape to obtain an ellipse, and taking the center of the ellipse as the central point of the palm; (4) positioning the user's arm point; and (5) positioning the user's fingertips. The positioning method comprises multi-finger positioning and single-finger positioning, and offers excellent robustness, among other advantages.
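Steps (2) and (3) above can be sketched with a minimal depth-based segmentation. This is an assumption-laden simplification: the depth band `[near, far]` is hypothetical, and the centroid below stands in for the patent's actual ellipse fit (whose center serves as the palm point).

```python
import numpy as np

def segment_hand(depth_mm, near=500, far=800):
    """Step (2), simplified: binary mask of pixels whose Kinect depth (mm)
    falls in an assumed near band where the hand is expected."""
    return (depth_mm >= near) & (depth_mm <= far)

def palm_center(mask):
    """Step (3), simplified: centroid of the mask as a cheap stand-in for
    the center of the fitted ellipse used in the patent."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# Synthetic 10x10 depth frame: background at 2 m, a hand-sized blob at 60 cm.
depth = np.full((10, 10), 2000)
depth[2:5, 3:6] = 600
mask = segment_hand(depth)
print(palm_center(mask))  # (4.0, 3.0)
```

In a real pipeline the mask would come from the Kinect depth stream and the palm point from a proper ellipse fit (e.g. least-squares fitting on the segmented palm contour); the depth thresholds would be calibrated per scene rather than fixed.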

Description

Technical field

[0001] The invention relates to computer image processing and pattern recognition technology, and in particular to a Kinect-based method for positioning human fingertips in video. The method uses a Kinect sensor as the imaging tool.

Background technique

[0002] In recent years, intelligent human-computer interaction technology based on the human hand has occupied an increasingly important position owing to its flexible and natural character. Compared with traditional techniques that require wearing or touching hardware devices, it is closer to the core concept of human-computer interaction and provides a better user experience. Among the parts of the hand, the fingertips' high degree of freedom and flexibility give them richer expressive content than other parts: changes in the number and positions of fingertips can be mapped to rich semantics. On this basis, a series of applications can be developed, such as virtual writing and drawing, remote gesture control, virtual keyboards, somatosensory games and...


Application Information

IPC(8): G06F3/01
Inventors: 金连文, 叶植超, 张鑫
Owner: SOUTH CHINA UNIV OF TECH