
Free-scene egocentric-vision finger key point detection method based on a deep convolutional neural network

A deep convolutional neural network technology, applied in the research fields of computer vision and machine learning, achieving the effects of increasing training variability and enriching image features

Status: Inactive | Publication Date: 2016-06-29
SOUTH CHINA UNIV OF TECH
Cites: 2 | Cited by: 57

AI Technical Summary

Problems solved by technology

[0004] The main purpose of the present invention is to overcome the shortcomings and deficiencies of the prior art by providing a free-scene egocentric-vision finger key point detection method based on a deep convolutional neural network, which solves the problem of finger key point detection in static images and can then be applied to finger key point recognition and tracking in video streams.
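
Once the static-image detector is in place, applying it to a video stream reduces to running it on the localized hand region of every frame. A minimal Python sketch of that per-frame loop, assuming hypothetical `locate_hand` and `detect_keypoints` callables standing in for the positioning step and the trained network:

```python
import cv2  # OpenCV, used here only for video decoding

def track_keypoints(video_path, locate_hand, detect_keypoints):
    """Apply a static-image keypoint detector frame by frame.

    `locate_hand` and `detect_keypoints` are hypothetical callables:
    the first is whatever hand-localization technique supplies the
    foreground rectangle, the second wraps the trained network.
    """
    trajectories = []
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        x, y, w, h = locate_hand(frame)          # foreground rectangle
        crop = frame[y:y + h, x:x + w]
        points = detect_keypoints(crop)          # [(px, py), ...] in crop coords
        # map the detections back to full-frame coordinates
        trajectories.append([(px + x, py + y) for px, py in points])
    cap.release()
    return trajectories
```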

Embodiment

[0046] As shown in Figure 4, the free-scene egocentric-vision finger key point detection method based on a deep convolutional neural network includes the following steps:

[0047] S1. Obtain the training data: assuming that the region containing the hand (the foreground region) has already been obtained by a suitable positioning technique, manually mark the coordinates of the finger key points, including fingertips and finger joints;
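
For concreteness, the manual annotations of step S1 could be stored as one record per cropped image; the field names and coordinate values below are illustrative, not taken from the patent:

```python
# Hypothetical annotation record for one cropped hand image (step S1).
# The patent only specifies that fingertip and finger-joint coordinates
# are marked by hand; everything else here is an illustrative assumption.
annotation = {
    "image": "frames/scene03_frame0142_hand.png",  # cropped foreground image
    "keypoints": {
        "fingertip": (96, 31),   # (x, y) pixel coordinates within the crop
        "joint_1": (88, 74),     # first knuckle
        "joint_2": (81, 112),    # second knuckle
    },
}
```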

[0048] S1.1 Collect a large number of real-scene samples, using a camera mounted at the glasses to simulate the first-person view (as shown in Figure 1(a)-Figure 1(b)). Record a large number of videos in which gestures are performed in every frame; the data samples must cover different scenes, lighting conditions, and postures. Then crop out a rectangular foreground image containing the hand region;
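
Each annotated frame then yields one fixed-size training sample: crop the hand rectangle and rescale the image together with its keypoint coordinates. A minimal sketch, assuming NumPy/OpenCV and an illustrative 64x64 network input (the patent does not state its input resolution):

```python
import cv2
import numpy as np

INPUT_SIZE = 64  # illustrative input resolution, not specified by the patent

def make_sample(frame, box, keypoints):
    """Crop the hand rectangle and rescale image and keypoints together.

    frame:     full video frame (H x W x 3, uint8)
    box:       (x, y, w, h) foreground rectangle containing the hand
    keypoints: list of (px, py) in full-frame pixel coordinates
    """
    x, y, w, h = box
    crop = frame[y:y + h, x:x + w]
    image = cv2.resize(crop, (INPUT_SIZE, INPUT_SIZE))
    # Express the keypoints in the crop's coordinate system, normalized
    # to [0, 1] so the regression target is independent of crop size.
    targets = np.array(
        [((px - x) / w, (py - y) / h) for px, py in keypoints],
        dtype=np.float32,
    )
    return image, targets
```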

[0049] In step S1.1, the gestures are single-finger gestures, the coordinates are manually marked, and the finger...

Abstract

The invention discloses a free-scene egocentric-vision finger key point detection method based on a deep convolutional neural network. The method comprises the following steps. S1: obtain training data by locating the region containing the hand with a suitable positioning technique and manually marking the coordinates of the finger key points, which include fingertips and finger joints. S2: design a deep convolutional neural network and use it to solve the key point coordinate regression problem. S3: train the weight parameters of the network on a large number of labeled samples; after the parameters stabilize over a number of iterations, the convolution kernel parameters of the multiple layers are obtained. S4: take any foreground picture as input and, after computation through the network parameters, accurately obtain the finger key point coordinates. The detection method has high precision and good robustness.
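
Steps S2-S4 amount to training a convolutional network to regress the normalized keypoint coordinates directly from a foreground crop. The PyTorch sketch below illustrates that coordinate-regression setup; the layer sizes, 64x64 input resolution, and keypoint count are illustrative assumptions, not the patent's architecture:

```python
import torch
import torch.nn as nn

NUM_KEYPOINTS = 3  # illustrative: one fingertip plus two knuckles

class KeypointRegressor(nn.Module):
    """Small CNN mapping a 64x64 RGB crop to 2*K normalized coordinates."""

    def __init__(self, k=NUM_KEYPOINTS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, 2 * k),  # (x, y) per keypoint
            nn.Sigmoid(),           # coordinates normalized to [0, 1]
        )

    def forward(self, x):
        return self.head(self.features(x))

# Step S3 as a single training step: regress coordinates with an L2 loss.
model = KeypointRegressor()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
loss_fn = nn.MSELoss()

images = torch.rand(8, 3, 64, 64)           # stand-in batch of foreground crops
targets = torch.rand(8, 2 * NUM_KEYPOINTS)  # stand-in normalized labels
optimizer.zero_grad()
loss = loss_fn(model(images), targets)
loss.backward()
optimizer.step()

# Step S4: at inference, any foreground crop goes straight through the network.
with torch.no_grad():
    coords = model(images[:1]).view(-1, 2)  # K rows of (x, y) in [0, 1]
```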

Description

technical field

[0001] The present invention relates to the research fields of computer vision and machine learning, and in particular to a free-scene egocentric-vision finger key point detection method based on a deep convolutional neural network.

Background technique

[0002] In recent years, with the rise of smart glasses, egocentric-vision gesture interaction technology has attracted widespread attention from academia and industry, especially with the emergence of smart wearable devices such as Google Glass and Microsoft HoloLens and of virtual reality devices such as Oculus. Traditional human-computer interaction methods are difficult to apply to such devices, so an algorithm is urgently needed to help them understand human interaction needs, such as gesture operations. Gesture interaction technology mainly involves two aspects: gesture recognition and key point positioning. The present invention focuses on key point positioning, namely fingertip detection and positioning and knuckle de...

Application Information

IPC(8): G06K9/00, G06N3/08
CPC: G06N3/084, G06V40/107
Inventors: 金连文, 黄毅超, 刘孝睿, 张鑫
Owner: SOUTH CHINA UNIV OF TECH