
Real-time gesture recognition method and system based on machine vision

A gesture recognition and machine vision technology, applied to neural learning methods, character and pattern recognition, and user/computer interaction input/output. It addresses problems such as the difficulty of gesture recognition, the difficulty of implementing gesture recognition algorithms, and the complexity of hand shapes.

Active Publication Date: 2020-11-17
NANJING UNIV OF POSTS & TELECOMM +1

AI Technical Summary

Problems solved by technology

[0007] Purpose of the invention: the present invention provides a real-time gesture recognition method and gesture recognition system based on machine vision. It uses a machine-learning-based joint point recognition and extraction algorithm to overcome the difficulty of implementing gesture recognition algorithms on two-dimensional images. After the position of the hand is obtained, the hand's joint points are detected, and the hand joint point information is then used to solve the gesture recognition difficulties caused by the complex shape of the hand and the hand's low resolution in the image.



Examples


Embodiment 1

[0073] The difficulty of hand localization is solved through human body joint point recognition. After the hand position is obtained, the hand joint points are detected, and the hand joint point information is then used to solve the gesture recognition difficulties caused by complex hand shapes and low hand resolution. At the same time, the difficulty of the joint point extraction algorithm lies in the following points:

[0074] (1) Local distinctiveness is weak; for example, it is difficult to define the precise location of the "neck" or "the last joint of the thumb";

[0075] (2) Some joints are occluded or invisible, for example hidden by clothing, by the human body itself, or by the background, and some parts of the human body look similar to one another;

[0076] (3) The surrounding environment, the lighting, and the size and resolution of the target relative to the image all affect the quality of key point extraction.

[0077] In order to overcome these problems, it ...
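As an illustration of the final step described above (using hand joint point information for gesture recognition), here is a minimal sketch, not the patent's actual method: it assumes the common 21-point hand layout (wrist plus four joints per finger) and classifies a few coarse gestures from which fingers are extended. All names here are hypothetical.

```python
from typing import List, Tuple

# Assumed 21-point hand layout (wrist + 4 joints per finger),
# following the convention used by many hand-keypoint models.
WRIST = 0
FINGER_TIPS = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky
FINGER_PIPS = [3, 6, 10, 14, 18]   # the joint just below each tip

def dist(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def extended_fingers(pts: List[Tuple[float, float]]) -> List[bool]:
    """A finger counts as extended if its tip is farther from the
    wrist than the joint below the tip."""
    wrist = pts[WRIST]
    return [dist(pts[t], wrist) > dist(pts[p], wrist)
            for t, p in zip(FINGER_TIPS, FINGER_PIPS)]

def classify(pts: List[Tuple[float, float]]) -> str:
    """Map the extended-finger pattern to a coarse gesture label."""
    ext = extended_fingers(pts)
    if not any(ext):
        return "fist"
    if all(ext):
        return "open_palm"
    if ext == [False, True, False, False, False]:
        return "point"
    return "unknown"

# Synthetic example: every tip is farther from the wrist than its
# lower joint, so all five fingers read as extended.
open_hand = [(0.0, 0.0)] + [(i, j) for i in range(1, 6) for j in range(1, 5)]
print(classify(open_hand))  # open_palm
```

In practice the keypoint coordinates would come from the hand joint detection stage; the rule-based classifier here only stands in for the learned gesture recognizer.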

Embodiment 2

[0134] On the basis of Embodiment 1, this embodiment proposes a real-time gesture recognition system based on machine vision, comprising a video capture module, a streaming media server, a GPU server and a result display terminal. The video capture module collects video streams containing hands and uploads them frame by frame to the streaming media server; the streaming media server encodes and compresses the video stream and transmits it over the network to the GPU server; the GPU server decodes the video stream and processes each frame according to the recognition method of Embodiment 1 to obtain a result tuple and visualization information; the result display terminal obtains and displays the visualized joint point information and the result tuple.
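The data flow above can be sketched as four chained stages; the function names, the Frame type, and the stubbed bodies are illustrative assumptions, not the patent's actual interfaces:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    index: int
    data: bytes

def capture(n: int):
    """Video capture module: yields raw frames."""
    for i in range(n):
        yield Frame(index=i, data=b"\x00" * 8)

def encode(frames):
    """Streaming media server: encode/compress each frame (stubbed)."""
    for f in frames:
        yield Frame(index=f.index, data=f.data[:4])  # stand-in for compression

def recognize(frames):
    """GPU server: decode and run per-frame recognition, emitting a
    result tuple of (frame index, gesture label)."""
    for f in frames:
        yield (f.index, "open_palm")  # stand-in for the Embodiment 1 method

def display(results):
    """Result display terminal: collect results for presentation."""
    return list(results)

results = display(recognize(encode(capture(3))))
print(results)  # [(0, 'open_palm'), (1, 'open_palm'), (2, 'open_palm')]
```

The generator chain mirrors the frame-by-frame transmission between modules; in the real system each stage would run on a separate machine connected by the network.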

[0135] In order to solve the mismatch between the rate of video capture and upload (60 Hz) and the per-frame processing rate in the GPU server (22 Hz to 28 Hz), the video capture module fix...
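Since the paragraph above is truncated, the patent's actual fix is not shown; one common way to reconcile a 60 Hz producer with a 22–28 Hz consumer is a small bounded buffer that drops the oldest frame on overflow, so the recognizer always works on recent frames. A hedged sketch, with all names hypothetical:

```python
from collections import deque

class LatestFrameBuffer:
    """Bounded buffer that discards the oldest frame on overflow,
    keeping the consumer close to real time."""
    def __init__(self, maxlen: int = 2):
        # deque with maxlen silently drops from the left when full
        self.frames = deque(maxlen=maxlen)
        self.dropped = 0

    def push(self, frame):
        if len(self.frames) == self.frames.maxlen:
            self.dropped += 1  # the oldest frame is about to be discarded
        self.frames.append(frame)

    def pop(self):
        return self.frames.popleft() if self.frames else None

# One second of 60 Hz capture against a crude ~20 Hz consumption pattern.
buf = LatestFrameBuffer(maxlen=2)
consumed = []
for i in range(60):
    buf.push(i)
    if i % 2 == 0 and i % 3 != 0:  # consume on 20 of the 60 ticks
        f = buf.pop()
        if f is not None:
            consumed.append(f)
print(len(consumed), buf.dropped)
```

Because capacity is small and old frames are dropped rather than queued, end-to-end latency stays bounded instead of growing as the producer outpaces the consumer.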



Abstract

The invention discloses a real-time gesture recognition method and system based on machine vision. A four-step gesture recognition approach is provided: human body joint point recognition, hand positioning based on human body joint points, hand joint point recognition based on hand positioning, and gesture recognition based on hand joint points. A neural network structure is designed according to the task characteristics of human body joint recognition, hand joint recognition and gesture recognition; meanwhile, in the step that realizes gesture recognition from hand joint points, a gesture training data generation scheme is designed and gesture recognition is realized through transfer learning. Finally, through a lightweight design of the network and the design of the system's data transmission structure, the system achieves real-time gesture and hand joint recognition.

Description

Technical field [0001] The invention relates to the technical fields of artificial intelligence and computer graphics, and in particular to a machine-vision-based real-time gesture recognition method and gesture recognition system. Background technique [0002] Gestures are a fairly natural way of expression. Compared with touch, which requires physical contact to achieve interaction, gestures allow non-contact interaction and can express rich semantics and contextual information such as emotions and attitudes. Research has shown that language and gesture share the same system in the brain, and that gesture production is directly linked to memory. Since gestures are a natural medium of communication, they are well suited to human-computer interaction. Compared with traditional keyboard, mouse, touch screen and other interaction methods that require users to adapt to the machine's input and output modes, an interaction method in which the computer adapts to the user is not only more natural and...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06N3/04, G06N3/08, G06F3/01
CPC: G06N3/084, G06F3/017, G06V40/28, G06V40/107, G06N3/045
Inventor: 戴建邦, 徐小龙, 肖甫, 孙力娟, 董健, 王林
Owner NANJING UNIV OF POSTS & TELECOMM