
Deep learning hand portion detection method based on hand portion area prediction

A hand detection method applied in computer components, instruments, biological neural network models, etc. It addresses the lack of temporal context information in single-image detection, copes with complex and variable hand shapes and changing light intensity, avoids tracking failure, and improves detection sensitivity.

Active Publication Date: 2018-05-22
UNIV OF SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

The present invention overcomes the lack of temporal context information in single-image hand detection, alleviates the difficulty of detecting human hands caused by motion blur, occlusion, and newly appearing hands, and enhances the accuracy and robustness of the hand detection system.




Embodiment Construction

[0043] The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0044] 1. As shown in Figure 1, train a deep convolutional network and use the trained network to detect the hands (left hand, right hand, and overlapping hands) in the first frame of the video stream against a complex background, including:

[0045] Obtain a hand video stream dataset containing a variety of scenes and complex backgrounds, and manually annotate the labels in the dataset. The labels include the coordinate...
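
As an illustration of this first step, a minimal Python/OpenCV sketch is given below. The HandDetector class, its weights file, and the video path are hypothetical placeholders rather than the patent's actual network or data; the sketch only shows how a trained deep convolutional network would be applied to the first frame of a video stream.

    # Minimal sketch of step 1 (hypothetical names, not the patent's implementation):
    # apply a trained deep convolutional network to the first frame of a video stream.
    import cv2


    class HandDetector:
        """Placeholder for the trained deep convolutional network.

        detect() is assumed to return a list of (class_name, (x, y, w, h), score)
        tuples, where class_name is 'left', 'right', or 'overlapping'.
        """

        def __init__(self, weights_path):
            self.weights_path = weights_path  # the trained weights would be loaded here

        def detect(self, frame_bgr):
            # Stub: replace with the forward pass of the trained network.
            return []


    cap = cv2.VideoCapture("hand_video.mp4")         # hypothetical input video
    ok, first_frame = cap.read()
    if ok:
        detector = HandDetector("hand_cnn.weights")  # hypothetical weights file
        first_detections = detector.detect(first_frame)
    cap.release()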


Abstract

The invention discloses a deep learning hand detection method based on hand area prediction. Hands are divided into three classes: left hand, right hand, and overlapping hands. The method comprises the following steps: train a deep convolutional network and use it to detect the hand class and region in the first frame of a video stream against a complex background; based on the spatio-temporal correlation produced by the motion inertia of the hand, use a tracking algorithm to predict the hand region in the second frame, and combine it with the adjacent frame difference method to obtain regions occluded by the hand and newly appearing hand regions; construct a mask from the regions obtained by the tracking algorithm and the adjacent frame difference method, emphasize the region of interest in the image, and form a frame image with added attention; input this image into the trained deep convolutional network to obtain the accurate hand class and region; repeat the same detection procedure as for the second frame until the last frame, thereby realizing video-stream hand detection against a complex background.
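
The per-frame procedure described above can be summarized in a short Python/OpenCV sketch. The helper names, thresholds, and the tracker and detector objects referenced in the usage comment are illustrative assumptions, not the patent's exact implementation; the sketch only shows how tracker-predicted regions and adjacent-frame differences could be combined into an attention mask before re-detection.

    # Hedged sketch of the per-frame pipeline in the abstract (assumed names/thresholds):
    # tracker-predicted hand regions + adjacent-frame differencing -> attention mask -> re-detect.
    import cv2
    import numpy as np


    def frame_difference_regions(prev_gray, curr_gray, thresh=25):
        # Binary map of regions that changed between adjacent frames
        # (used to catch occluded and newly appearing hands).
        diff = cv2.absdiff(prev_gray, curr_gray)
        _, moving = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        return moving


    def build_attention_frame(frame, predicted_boxes, motion_mask, background_weight=0.3):
        # Emphasize tracker-predicted hand regions and moving regions, dim everything else.
        mask = np.zeros(frame.shape[:2], dtype=np.uint8)
        for (x, y, w, h) in predicted_boxes:          # regions predicted by the tracker
            mask[y:y + h, x:x + w] = 255
        mask = cv2.bitwise_or(mask, motion_mask)      # add frame-difference regions
        attention = frame.astype(np.float32) * background_weight
        attention[mask > 0] = frame[mask > 0]         # keep full intensity inside the mask
        return attention.astype(np.uint8)


    # Usage sketch (assumes `cap`, `detector`, `tracker`, and `first_frame` are already set up):
    # prev_gray = cv2.cvtColor(first_frame, cv2.COLOR_BGR2GRAY)
    # while True:
    #     ok, frame = cap.read()
    #     if not ok:
    #         break
    #     gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    #     boxes = tracker.predict()                       # hand regions from motion inertia
    #     motion = frame_difference_regions(prev_gray, gray)
    #     attended = build_attention_frame(frame, boxes, motion)
    #     detections = detector.detect(attended)          # refined hand class and region
    #     tracker.update(detections)
    #     prev_gray = gray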

Description

Technical field

[0001] The invention relates to a method for detecting hands in a video sequence against a complex background. Hands are divided into three classes: left hand, right hand, and overlapping hands. The method belongs to the field of video object detection.

Background technique

[0002] In the existing field of vision-based hand detection, the main approaches are feature detection, template matching, and image differencing. Most hand detection methods use skin color [1,2,3,4], palm texture [5,6], and hand shape [2,4,5,6] as detection features. Because of complex backgrounds (images containing many skin-like regions), changing illumination, the complex and variable shape of human hands, and frequent occlusions, no particularly stable and mature hand detection method has emerged. With the development of depth cameras (Kinect sensors, the Xtion sensors provided by ASUS, etc.), depth information is widely used in hand detection [7,8]. The application...


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04
CPC: G06V40/107, G06V20/41, G06N3/045, G06F18/214
Inventor: 叶中付, 王瑾薇, 黄世亮
Owner: UNIV OF SCI & TECH OF CHINA