Depth map based hand feature point detection method

A depth-map-based feature point detection technology, applied in image enhancement, image analysis, image data processing and related fields. It addresses problems such as the limited scope of application of existing methods and overcomes those limitations.

Active Publication Date: 2015-09-09
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

Existing research on depth-camera applications focuses mainly on whole-body control: somatosensory control and behavior recognition are realized by extracting the human skeleton. These detection algorithms require most of the human body to appear in the scene and cannot be applied to other targets, which restricts their scope of application.




Embodiment Construction

[0017] This depth map-based hand feature point detection method includes the following steps:

[0018] (1) Hand segmentation: Kinect is used to collect a human motion video sequence for hand extraction, and OpenNI is used to obtain the position of the human hand from the depth map; a palm point is obtained preliminarily by setting a search region and a depth threshold. The hand contour is then obtained with the OpenCV find_contours function. The palm point of the hand is determined accurately by finding the center of the largest inscribed circle of the hand contour: for every interior point of the hand, the shortest distance m to the contour points is computed, and the maximum M among these shortest distances is found; the interior point corresponding to M is the palm point, and the inscribed-circle radius is R = M;
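The palm-point part of this step can be illustrated with a short OpenCV/NumPy sketch. This is a minimal sketch under stated assumptions, not the patented implementation: the function name find_palm_point, the OpenCV 4 findContours return signature, and the availability of a binary hand mask from the preceding search-region/depth-threshold step are all assumptions introduced for illustration.

```python
import cv2
import numpy as np

def find_palm_point(hand_mask):
    """Locate the palm point as the centre of the maximum inscribed circle.

    hand_mask: uint8 binary image, 255 inside the hand, 0 elsewhere
    (assumed to come from the search-region / depth-threshold step).
    Returns ((cx, cy), R) where R is the inscribed-circle radius.
    """
    # Hand contour, as in the patent's use of OpenCV find_contours.
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    hand = max(contours, key=cv2.contourArea)

    # For every interior point, the shortest distance m to the contour;
    # the maximum M of these distances marks the palm point, and R = M.
    dist = np.zeros(hand_mask.shape, np.float32)
    ys, xs = np.nonzero(hand_mask)
    for x, y in zip(xs, ys):
        dist[y, x] = cv2.pointPolygonTest(hand, (float(x), float(y)), True)
    cy, cx = np.unravel_index(np.argmax(dist), dist.shape)
    return (int(cx), int(cy)), float(dist[cy, cx])
```

Scanning every interior point with pointPolygonTest mirrors the step's wording literally; cv2.distanceTransform over the mask would yield the same maximum more efficiently.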

[0019] (2) Feature point extraction: an improved hand feature point (fingertip point and finger valley point) detection method based on CSS curvature is designed and implemented: the hand contour is repeatedly smoothed with a Gaussian filter and, combined with a curvature threshold, a CSS curvature map is obtained; the extrema of the CSS contours in the map give the coordinates of the fingertip and finger valley points, and finger valley points that cannot be obtained from the CSS curvature map are completed.
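A hedged sketch of a CSS-style curvature analysis of this kind follows, assuming the contour arrives from the previous step as an (N, 2) array of points. The function name, the sigma and threshold values, and the sign convention for tips versus valleys (which depends on contour orientation) are assumptions for illustration, not the patent's exact procedure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def css_fingertips_valleys(contour, sigma=9.0, kappa_thresh=0.05):
    """Curvature extrema of a Gaussian-smoothed closed hand contour.

    contour: (N, 2) array of (x, y) contour points (e.g. from findContours).
    Returns index arrays of curvature maxima (candidate fingertips)
    and minima (candidate finger valleys) exceeding the threshold.
    """
    x = contour[:, 0].astype(float)
    y = contour[:, 1].astype(float)

    # Gaussian smoothing of the contour coordinates, as in CSS analysis.
    xs = gaussian_filter1d(x, sigma, mode='wrap')
    ys = gaussian_filter1d(y, sigma, mode='wrap')

    # First and second derivatives along the (closed) contour.
    dx, dy = np.gradient(xs), np.gradient(ys)
    ddx, ddy = np.gradient(dx), np.gradient(dy)

    # Signed curvature: kappa = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2).
    kappa = (dx * ddy - dy * ddx) / (np.power(dx**2 + dy**2, 1.5) + 1e-12)

    # Local extrema above / below the curvature threshold.
    prev, nxt = np.roll(kappa, 1), np.roll(kappa, -1)
    tips    = np.where((kappa > prev) & (kappa > nxt) & (kappa >  kappa_thresh))[0]
    valleys = np.where((kappa < prev) & (kappa < nxt) & (kappa < -kappa_thresh))[0]
    return tips, valleys
```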



Abstract

The present invention discloses a depth map based hand feature point detection method. The method comprises the steps of: (1) acquiring a human body motion video sequence with Kinect for hand extraction, obtaining the hand position of the human body from the depth map with OpenNI, and preliminarily obtaining a palm point by setting a search region and a depth threshold; obtaining the hand contour with the find_contours function of OpenCV; accurately determining the palm point by finding the center of the maximum inscribed circle of the hand contour, i.e. computing the shortest distance m from every interior point of the hand to the contour points and finding the maximum M among these shortest distances, where the interior point represented by M is the palm point and the inscribed-circle radius R equals M; (2) obtaining a CSS curvature map by continuously applying Gaussian smoothing to the hand contour in combination with a curvature threshold, analyzing the extreme values of the CSS contours in the map to obtain the coordinates of the fingertip and finger valley points of the hand, and completing the finger valley points that cannot be obtained from the CSS curvature map; and (3) completing a missing finger.
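For the segmentation part of step (1), a search-region and depth-threshold mask could look roughly like the sketch below. The function name segment_hand, the window and depth-band parameters, and the assumption that the hand's pixel position and depth are supplied by the OpenNI tracker are all hypothetical; the sketch only illustrates the thresholding idea described in the abstract.

```python
import numpy as np

def segment_hand(depth_map, hand_xy, depth_mm, search_half=100, band_mm=80):
    """Illustrative search-region / depth-threshold hand segmentation.

    depth_map: HxW array of depth values in millimetres (e.g. Kinect via OpenNI).
    hand_xy:   (x, y) pixel position of the hand reported by the tracker.
    depth_mm:  depth of the hand at that position.
    Returns a uint8 mask, 255 where a pixel lies inside the search region
    and within the depth band around the tracked hand.
    """
    h, w = depth_map.shape
    x, y = hand_xy
    x0, x1 = max(0, x - search_half), min(w, x + search_half)
    y0, y1 = max(0, y - search_half), min(h, y + search_half)

    mask = np.zeros((h, w), np.uint8)
    roi = depth_map[y0:y1, x0:x1].astype(float)
    # Keep pixels whose depth lies within a band around the hand depth.
    mask[y0:y1, x0:x1][np.abs(roi - depth_mm) < band_mm] = 255
    return mask
```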

Description

Technical field

[0001] The invention belongs to the technical field of computer pattern recognition and computer vision, and in particular relates to a hand feature point detection method based on a depth map.

Background technique

[0002] Gesture interaction is an important interaction method in research on new forms of human-computer interaction. This kind of interaction is non-contact and natural, and is more in line with natural human behavior; gesture-based interaction will therefore be a development trend of human-computer interaction in the future. Gesture recognition technology involves many disciplines such as artificial intelligence, pattern recognition, machine learning, and computer graphics. In addition, the design and study of gestures covers many disciplines such as mathematics, computer graphics, robot kinematics, and medicine. Therefore, research on gesture recognition has very important research value and research significance...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/62G06T7/20
CPCG06T7/215G06T7/246G06T2207/10016G06T2207/10028G06T2207/10024G06F18/40
Inventor 孔德慧李淳王少帆尹宝才
Owner BEIJING UNIV OF TECH