BP neural network-based gesture recognition method

A gesture recognition method based on a BP neural network, applied in the field of BP-neural-network-based gesture recognition. It addresses problems such as segmentation results being degraded when the arm and palm have similar skin color, and achieves the effects of simplifying operation, increasing user engagement, and reducing accident rates.

Active Publication Date: 2017-03-15
NANJING UNIV OF SCI & TECH
Cites: 3 · Cited by: 9

AI Technical Summary

Problems solved by technology

However, in hot weather users tend to wear short sleeves, and because the color characteristics of the arm and the palm are very similar, the segmentation result is affected to some extent.



Examples


Embodiment

[0062] The BP-neural-network-based gesture recognition method of this embodiment comprises three parts: video acquisition, gesture segmentation, and gesture recognition.

[0063] (1) Video acquisition

[0064] Obtain a video containing human hands through the camera.

[0065] (2) Gesture segmentation

[0066] The invention realizes gesture segmentation by combining a color-space method with geometric characteristics. The RGB image captured directly by the camera is converted to the YCrCb color space, adaptive thresholding is applied to the Cr channel to convert the image into a binary image, the gesture contour is extracted as the input for recognition, and finally the noise is removed. The specific steps are as follows:
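The segmentation step can be sketched in plain NumPy. This is a minimal illustration under stated assumptions, not the patent's exact procedure: the conversion uses the standard BT.601 YCrCb constants, Otsu's method stands in for the unspecified adaptive threshold, and the contour-extraction and denoising steps are omitted. Function names are illustrative.

```python
import numpy as np

def rgb_to_ycrcb(img):
    """Convert an RGB image (H, W, 3, uint8) to YCrCb using BT.601 constants."""
    r = img[..., 0].astype(float)
    g = img[..., 1].astype(float)
    b = img[..., 2].astype(float)
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128.0
    cb = (b - y) * 0.564 + 128.0
    return np.clip(np.round(np.stack([y, cr, cb], axis=-1)), 0, 255).astype(np.uint8)

def otsu_threshold(channel):
    """Pick the threshold that maximizes between-class variance (Otsu)."""
    hist, _ = np.histogram(channel.ravel(), bins=256, range=(0, 256))
    total = channel.size
    sum_all = float(np.dot(np.arange(256), hist))
    w0, sum0 = 0, 0.0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += hist[t]
        sum0 += t * hist[t]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0 = sum0 / w0                       # class-0 mean
        m1 = (sum_all - sum0) / w1           # class-1 mean
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def segment_hand(img):
    """Binarize the Cr channel: skin pixels -> 1, background -> 0.
    Contour extraction and noise removal from the patent are omitted here."""
    cr = rgb_to_ycrcb(img)[..., 1]
    return (cr > otsu_threshold(cr)).astype(np.uint8)
```

In practice a library routine such as OpenCV's `cvtColor` plus `threshold` would replace these hand-rolled versions; the point here is only to make the Cr-channel thresholding concrete.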

[0067] 2.1 Color space conversion

[0068] The YCrCb color space separates chroma from luminance, exhibits good clustering of skin color, is relatively insensitive to brightness changes, and can therefore distinguish human skin-color regions well...



Abstract

The present invention discloses a BP neural network-based gesture recognition method comprising the following steps. Video acquisition: a video containing a hand is acquired through a video acquisition device. Gesture segmentation: the RGB image acquired by the device is converted into the YCrCb color space, and adaptive thresholding is applied to the Cr channel so that the image is converted into a binary image; the gesture contour is then obtained and noise is removed. Gesture recognition: the Hu invariant moments of the denoised binary image are computed and used as the gesture features; a three-layer BP neural network is constructed and trained with the Hu invariant moment values as its input, so that the trained gestures in the video can be recognized. With this method, the palm region can be effectively segmented and specific gestures can be accurately recognized.
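The feature-extraction and classification stages summarized above can be sketched as follows. The seven Hu invariant moments are computed from their standard definition over normalized central moments, and the BP network is a generic three-layer sigmoid perceptron trained by backpropagation on mean-squared error. Layer sizes, learning rate, and training details are illustrative assumptions; the excerpt does not specify them.

```python
import numpy as np

def hu_moments(mask):
    """Seven Hu invariant moments of a binary image (standard definition)."""
    ys, xs = np.nonzero(mask)
    m00 = float(len(xs))            # zeroth moment = pixel count
    x = xs - xs.mean()              # coordinates relative to centroid
    y = ys - ys.mean()

    def eta(p, q):                  # normalized central moment
        return np.sum(x**p * y**q) / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    return np.array([
        n20 + n02,
        (n20 - n02)**2 + 4 * n11**2,
        (n30 - 3*n12)**2 + (3*n21 - n03)**2,
        (n30 + n12)**2 + (n21 + n03)**2,
        (n30 - 3*n12)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
            + (3*n21 - n03)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2),
        (n20 - n02)*((n30 + n12)**2 - (n21 + n03)**2)
            + 4*n11*(n30 + n12)*(n21 + n03),
        (3*n21 - n03)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
            - (n30 - 3*n12)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2),
    ])

class BPNet:
    """Three-layer (input-hidden-output) sigmoid network trained by
    backpropagation on mean-squared error. Sizes and learning rate are
    illustrative assumptions, not taken from the patent."""
    def __init__(self, n_in=7, n_hidden=8, n_out=1, lr=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    @staticmethod
    def _sig(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, X):
        self.h = self._sig(X @ self.W1 + self.b1)    # hidden activations
        self.o = self._sig(self.h @ self.W2 + self.b2)
        return self.o

    def train_step(self, X, T):
        """One full-batch gradient-descent step; returns the MSE loss."""
        n = len(X)
        o = self.forward(X)
        d_o = (o - T) * o * (1 - o)                  # output-layer delta
        d_h = (d_o @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * (self.h.T @ d_o) / n    # mean-gradient updates
        self.b2 -= self.lr * d_o.mean(axis=0)
        self.W1 -= self.lr * (X.T @ d_h) / n
        self.b1 -= self.lr * d_h.mean(axis=0)
        return float(np.mean((o - T) ** 2))
```

In use, each segmented binary frame would be reduced to its 7-element Hu vector and fed to the trained network; Hu moments are invariant to translation, scale, and rotation, which is what makes them a reasonable gesture feature.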

Description

Technical field

[0001] The invention belongs to the field of computer vision technology, and in particular relates to a gesture recognition method based on a BP neural network.

Background technique

[0002] With the widespread use of computers, human-computer interaction has become an important part of daily life. Human-computer interaction is the process of information exchange between humans and computers, in which a certain dialogue language is used in a certain interactive manner to complete certain tasks. Its ultimate goal is natural communication between human and machine: getting rid of any fixed form of interactive interface so that the way of inputting information becomes simpler and more casual. With the help of artificial intelligence and big data, a computer can intuitively and fully capture people's needs and assist them in handling affairs. The human-computer interaction that we know now mai...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V40/107
Inventors: 汪琦, 秦凯伦, 付潇聪, 李胜, 许鸣吉
Owner: NANJING UNIV OF SCI & TECH