
Finger tip point extraction method based on pixel classifier and ellipse fitting

An ellipse-fitting-based fingertip extraction method, applied in instruments, character and pattern recognition, computer components, and related fields; it addresses problems such as the difficulty of choosing an appropriate window size for the morphological opening operation, and achieves the effect of low cost.

Inactive Publication Date: 2015-11-11
JILIN JIYUAN SPACE TIME CARTOON GAME SCI & TECH GRP CO LTD

AI Technical Summary

Problems solved by technology

A disadvantage of this method is that it is not easy to determine an appropriate window size for the morphological opening operation.

Method used



Examples


Embodiment 1

[0068] Segmentation of the hand region:

[0069] As shown in Figure 1 to Figure 5, segmentation in the two-dimensional direction: first use OpenNI to track the position of the palm point, then segment the hand-and-arm area in the two-dimensional direction: with the palm point as the center, crop a rectangle containing the hand-and-arm area.

[0070] The rectangle is centered at the two-dimensional coordinates of the palm point, and its side length is fixed in advance.

[0071] Segmentation in the depth direction: extract the depth value of the hand point from the depth map provided by Kinect to determine the hand-and-arm region. A pixel (x, y) belongs to the hand-and-arm region if its depth value satisfies |D(x, y) − D_h| ≤ k, namely:

[0072] M(x, y) = 1 if |D(x, y) − D_h| ≤ k, and M(x, y) = 0 otherwise, where M(x, y) marks whether the pixel belongs to the hand-and-arm area, D_h denotes the depth value of the tracked hand point, D(x, y) denotes the depth value at pixel (x, y), and k represents the maximum depth range of th…
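The two-stage segmentation described in [0069]–[0072] can be sketched as follows. This is an illustrative reconstruction, not the patent's own code: the function name, the square crop, and the toy depth values are assumptions; only the palm-centered rectangle and the depth condition |D(x, y) − D_h| ≤ k come from the text.

```python
import numpy as np

def segment_hand_region(depth_map, palm_xy, side, k):
    """Crop a square centered on the tracked palm point, then keep only
    pixels whose depth lies within k of the palm point's depth value."""
    x, y = palm_xy
    half = side // 2
    h, w = depth_map.shape
    # Segmentation in the two-dimensional direction: a rectangle centered
    # on the palm point, clamped to the image bounds.
    x0, x1 = max(0, x - half), min(w, x + half)
    y0, y1 = max(0, y - half), min(h, y + half)
    roi = depth_map[y0:y1, x0:x1]
    d_hand = int(depth_map[y, x])
    # Segmentation in the depth direction: |D(x, y) - D_h| <= k.
    mask = np.abs(roi.astype(np.int32) - d_hand) <= k
    return roi, mask

# Toy depth map: a 100x100 frame at 2000 mm with a "hand" patch at 800 mm.
depth = np.full((100, 100), 2000, dtype=np.uint16)
depth[40:60, 40:60] = 800
roi, mask = segment_hand_region(depth, palm_xy=(50, 50), side=40, k=100)
print(mask.sum())  # number of pixels classified as hand/arm
```

The depth test is what makes the crop robust: the background inside the rectangle sits well outside the k-band around the palm depth and is discarded.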

Embodiment 2

[0074] Extract the skin color area:

[0075] As shown in Figure 6 to Figure 9, due to the characteristics of Kinect data, the edges of the hand-and-arm area are relatively jagged, which affects the accuracy of subsequent contour extraction. Therefore, after obtaining the hand-and-arm area, a Bayesian skin color model is applied to remove the areas that are not skin color, yielding an accurate hand-and-arm area.

[0076] 1. Creation of skin color model

[0077] In this paper, the YCbCr color space is used to construct the skin color model, because it separates brightness from chrominance, agrees well with human visual perception, and skin colors cluster compactly within it. In the YCbCr color space, luminance is represented by the single component Y, while color information is carried by the two chrominance components Cb and Cr: Cb represents the blue chroma and Cr represents the red chroma.

[0078]

[0079] First, co...
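A minimal sketch of the YCbCr idea from [0077], under stated assumptions: the patent trains a Bayesian model on labelled skin pixels, whereas the fixed Cb/Cr bounds below are purely illustrative, and the conversion is the full-range BT.601 formula.

```python
import numpy as np

def bgr_to_ycbcr(bgr):
    """Full-range BT.601 conversion: Y carries luminance, while Cb and Cr
    carry the blue and red chroma used by the skin color model."""
    b = bgr[..., 0].astype(np.float64)
    g = bgr[..., 1].astype(np.float64)
    r = bgr[..., 2].astype(np.float64)
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 + 0.564 * (b - y)  # blue chroma
    cr = 128.0 + 0.713 * (r - y)  # red chroma
    return y, cb, cr

# Illustrative fixed Cb/Cr bounds (assumed; the patent's Bayesian model
# replaces hard thresholds with a learned posterior over skin color).
CB_LO, CB_HI = 77.0, 127.0
CR_LO, CR_HI = 133.0, 173.0

def skin_mask(bgr):
    """True where the chrominance falls inside the assumed skin region."""
    _, cb, cr = bgr_to_ycbcr(bgr)
    return (cb >= CB_LO) & (cb <= CB_HI) & (cr >= CR_LO) & (cr <= CR_HI)

# A flat skin-toned BGR patch should be accepted; a pure blue one rejected.
skin_patch = np.full((4, 4, 3), (120, 140, 200), dtype=np.uint8)
blue_patch = np.full((4, 4, 3), (255, 0, 0), dtype=np.uint8)
```

Because Y is ignored by the classifier, the rule is largely insensitive to illumination changes, which is exactly the property that motivates the YCbCr choice in [0077].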

Embodiment 3

[0087] As shown in Figure 10 to Figure 21, the detection of the hand area is based on the depth values provided by Kinect, but those depth values contain errors, so the contour of the hand area becomes inaccurate during motion, and motion blur makes the effect even worse. Fingertip detection algorithms based on templates and contours have low accuracy for slightly curved fingertips, while fingertip detection algorithms based on morphological operations are slow, time-consuming, and frequently misdetect fingertips. Therefore, the present invention uses a pixel-classifier-based method to extract the finger and fingertip area, and then uses ellipse fitting to obtain the position of the fingertip point.

[0088] (1) Extract finger area based on pixel classifier method

[0089] According to the geometric characteristics of fingertips, this paper defines a square as a pixel classifier, and classifies pixels a...
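The final ellipse-fitting step described in the abstract (take the long-axis endpoint farther from the palm as the fingertip) can be sketched as below. The function name and the ellipse parameterization (center, major-axis length, angle in degrees from the x-axis, e.g. as returned by OpenCV's `fitEllipse`) are assumptions; the endpoint-selection rule is the patent's.

```python
import numpy as np

def fingertip_from_ellipse(center, major_len, angle_deg, palm_xy):
    """Given a fitted ellipse, return the endpoint of the major axis that
    lies farther from the palm point -- the fingertip point."""
    cx, cy = center
    theta = np.deg2rad(angle_deg)
    # Half-vector along the major axis.
    dx = 0.5 * major_len * np.cos(theta)
    dy = 0.5 * major_len * np.sin(theta)
    p1 = np.array([cx + dx, cy + dy])
    p2 = np.array([cx - dx, cy - dy])
    palm = np.asarray(palm_xy, dtype=np.float64)
    # The fingertip is the long-axis endpoint farthest from the palm.
    if np.linalg.norm(p1 - palm) >= np.linalg.norm(p2 - palm):
        return p1
    return p2

# Vertical finger: ellipse centered above the palm, major axis along y.
tip = fingertip_from_ellipse(center=(50, 30), major_len=40, angle_deg=90,
                             palm_xy=(50, 80))
print(tip)  # upper endpoint of the major axis
```

In the full pipeline, the ellipse would first be fitted by least squares to the finger-region contour extracted by the pixel classifier; only the endpoint selection is shown here.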



Abstract

The invention relates to a fingertip point extraction method based on a pixel classifier and ellipse fitting. The palm point coordinates provided by OpenNI (open natural interaction) are used to segment the hand area, a Bayesian skin color model removes non-skin-color areas so that the hand area is accurately extracted, a pixel classifier is defined to cluster the finger areas, the contour of the finger areas is extracted, and the least squares method is used for ellipse fitting; of the two endpoints of the ellipse's long axis, the one farther from the palm point is taken as the fingertip point. The method detects fingertip points accurately and in real time using a vision-only approach, and compared with data-glove and color-glove methods it is more natural and comfortable.

Description

technical field

[0001] The invention relates to a fingertip point extraction method based on a pixel classifier and ellipse fitting. Based on the Kinect depth camera, the hand area is first segmented using the three-dimensional coordinates of the palm point tracked by OpenNI, a skin color model then extracts the accurate hand area, and finally the fingertips are extracted precisely.

background technique

[0002] After Kinect and other depth cameras came out, many researchers used Kinect for research on gesture recognition. Raheja et al. used Kinect to segment the hand area and then searched the depth map for the position of the fingertip point, relying on the fact that the fingertip has the minimum depth value; this method is not effective when the hand moves fast and motion blur occurs. Ren et al. used the Kinect depth map and a black belt tied on the wrist to obtain the precise hand area, then generated time-series images of the hand area, and per…

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Applications (China)
IPC(8): G06K9/00
CPC: G06V40/113
Inventor: 郭双双, 潘志庚, 张明敏, 罗江林
Owner: JILIN JIYUAN SPACE TIME CARTOON GAME SCI & TECH GRP CO LTD