
Gesture recognition method based on BP neural network

A gesture recognition technology based on a BP neural network, applied in the field of BP-neural-network-based gesture recognition. It can solve problems such as the impact on segmentation results, and achieves the effects of simplifying operation, reducing the incidence of accidents, and adding enjoyment.

Active Publication Date: 2020-06-19
NANJING UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, when the weather is hot, users tend to wear short sleeves, and because the color characteristics of the arm and the palm are quite similar, this has a certain impact on the segmentation results.



Examples


Embodiment

[0062] The gesture recognition method based on a BP neural network of the present embodiment comprises three parts, namely video acquisition, gesture segmentation, and gesture recognition:

[0063] (1) Video acquisition

[0064] A video containing human hands is acquired through the camera.
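
As a rough, non-authoritative sketch of this acquisition step (not taken from the patent text), frames could be grabbed with OpenCV as below; the device index 0 and the clip length are assumptions for illustration only.

```python
import cv2

# Minimal sketch of the video acquisition in [0064]
# (assumption: a webcam at device index 0 supplies the video containing the hand).
cap = cv2.VideoCapture(0)
frames = []
while len(frames) < 100:           # clip length is arbitrary for illustration
    ok, frame_bgr = cap.read()     # OpenCV returns frames in BGR channel order
    if not ok:
        break
    frames.append(frame_bgr)
cap.release()
```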

[0065] (2) Gesture segmentation

[0066] The present invention adopts a method that combines color space analysis with geometric features to realize gesture segmentation: the RGB image obtained directly from the camera is converted to the YCrCb color space, adaptive thresholding is applied to the Cr channel to convert the image into a binary image, the gesture contour is extracted as the input for recognition, and finally noise is removed. The specific steps include:

[0067] 2.1 Color space conversion

[0068] The YCrCb color space has the property of separating chroma from luminance; skin color exhibits good clustering in this space and is less affected by changes in brightness. It can distinguish...
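
To make the segmentation described in [0066]-[0068] concrete, the following is a minimal sketch rather than the patent's exact implementation: the frame is converted to YCrCb, the Cr channel is thresholded (Otsu's method is used here as a stand-in for the patent's adaptive thresholding), small noise blobs are removed morphologically, and the largest contour is taken as the gesture contour. The helper name segment_hand and all parameter values are assumptions.

```python
import cv2

def segment_hand(frame_bgr):
    """Sketch of the segmentation step: YCrCb conversion, Cr thresholding,
    denoising, and contour extraction. Otsu's method stands in for the
    patent's 'adaptive threshold processing'; the exact scheme may differ."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    cr = ycrcb[:, :, 1]                                   # Cr channel
    cr = cv2.GaussianBlur(cr, (5, 5), 0)                  # smooth before thresholding
    _, binary = cv2.threshold(cr, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological opening removes small skin-colored noise blobs.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return binary, None
    hand = max(contours, key=cv2.contourArea)             # assume largest blob is the hand
    return binary, hand
```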



Abstract

The invention discloses a gesture recognition method based on a BP neural network, comprising the following steps: video acquisition, in which a video containing human hands is acquired through a video acquisition device; gesture segmentation, in which the RGB image acquired by the video acquisition device is converted to the YCrCb color space, adaptive thresholding is applied to the Cr channel, the image is converted into a binary image, gesture contours are obtained, and noise is removed; and gesture recognition, in which the Hu invariant moments of the denoised binary image are computed as gesture features, a three-layer BP neural network is constructed, the Hu invariant moment values are used as the input of the neural network, and the network is trained to recognize the gestures present in the video. The invention can effectively segment the palm region and accurately identify specific gestures.
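
The recognition stage pairs Hu invariant moments with a three-layer BP (back-propagation) network. The sketch below is an illustration, not the patent's implementation: it computes the seven Hu moments of the denoised binary image with OpenCV and trains a one-hidden-layer MLP (scikit-learn's MLPClassifier, trained by back-propagation) on them. The log scaling, hidden-layer size, and training hyperparameters are assumptions.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def hu_features(binary_mask):
    """Log-scaled Hu invariant moments of a binarized hand image (7 values).
    The log transform is a common normalization; the patent may use raw values."""
    hu = cv2.HuMoments(cv2.moments(binary_mask)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def train_gesture_classifier(X, y):
    """Three-layer (input / one hidden layer / output) back-propagation network
    stand-in. X: (n_samples, 7) Hu-moment features, y: gesture labels."""
    clf = MLPClassifier(hidden_layer_sizes=(16,), activation="logistic",
                        solver="sgd", learning_rate_init=0.1, max_iter=2000)
    clf.fit(X, y)
    return clf
```

A trained classifier would then be applied per segmented frame, e.g. clf.predict(hu_features(mask).reshape(1, -1)).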

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a gesture recognition method based on a BP neural network.

Background Technique

[0002] With the wide application of computers, human-computer interaction has become an important part of people's daily lives. Human-computer interaction refers to the process of information exchange between humans and computers, in which a dialogue language is used in a certain interactive way to accomplish specific tasks. The ultimate goal of human-computer interaction is to realize natural communication between human and machine, free from any fixed form of interactive interface, so that the way of inputting information becomes ever simpler and more flexible. With the help of the fusion of artificial intelligence and big data, such interaction can intuitively and comprehensively capture people's needs and assist them in handling affairs. The human-computer interaction we are familiar with now mainly relies on...


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06K9/00
CPCG06V40/107
Inventor 汪琦秦凯伦付潇聪李胜许鸣吉
Owner NANJING UNIV OF SCI & TECH