
Gesture recognition method based on deep learning

A deep-learning-based gesture recognition technology, applied in the field of human-computer interaction, solves problems such as the large number of command operations, buttons that are inconvenient for drivers to use, and complex control-screen and button layouts, and achieves a simple recognition process, high reliability, and high adaptability.

Pending Publication Date: 2020-05-05
CSR ZHUZHOU ELECTRIC LOCOMOTIVE RES INST
Cites: 7 · Cited by: 14

AI Technical Summary

Problems solved by technology

Complex on-board operating equipment improves the driving safety and comfort of the vehicle, but it also increases the amount and complexity of the driver's operations.
[0003] At present, human-computer interaction technology is gradually shifting from computer-centered to human-centered interaction. However, the driver control platforms of current public transportation vehicles often require many command operations; the control-screen and button layout is complicated, and some buttons are placed under the steering wheel or out of easy reach, which is inconvenient for drivers.


Image

Three drawings, each captioned "Gesture recognition method based on deep learning".


Embodiment Construction

[0030] The detailed features and advantages of the present invention are described below in the specific embodiments. The content is sufficient to enable any person skilled in the art to understand and implement the technical content of the present invention; from the specification, claims, and drawings disclosed herein, those skilled in the art can readily understand the related objects and advantages of the present invention.

[0031] Referring to Figure 1, as a first aspect of the present invention, there is provided a gesture recognition method based on deep learning, comprising:

[0032] An image collection step: use a camera to collect video images, decode the video, and obtain input video data; the camera may be a network camera, an infrared camera, or a 3D camera;
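The patent does not detail how decoded frames are fed to the recognizer. As a minimal sketch, assuming decoded frames arrive as an iterable, the collection step might down-sample frames before recognition to bound the processing load (the `sample_frames` helper and its interval are hypothetical, not from the patent):

```python
from typing import Iterable, Iterator, List


def sample_frames(frames: Iterable[object], every: int = 3) -> Iterator[object]:
    """Yield every `every`-th decoded frame to limit recognition load."""
    if every < 1:
        raise ValueError("sampling interval must be >= 1")
    for index, frame in enumerate(frames):
        if index % every == 0:
            yield frame


# Stand-in for decoded video data: a list of frame identifiers.
decoded: List[str] = [f"frame{i}" for i in range(10)]
sampled = list(sample_frames(decoded, every=3))
# keeps frames 0, 3, 6, 9
```

In a real deployment the frame source would be a camera/decoder API rather than a list; the sampling logic is unchanged.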

[0033] A neural network training step: train neural network models for hand position detection, gesture key point pos...
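The excerpt is truncated before the key point details. A common preprocessing step before classifying gestures from key points — shown here only as a hedged sketch of standard practice, not as the patented method — is to normalize the key points by translating them to the wrist and scaling by hand size, so the classifier is invariant to hand position and distance from the camera:

```python
from math import hypot
from typing import List, Tuple

Point = Tuple[float, float]


def normalize_keypoints(points: List[Point]) -> List[Point]:
    """Translate key points so the wrist (point 0) is the origin,
    then scale so the farthest point lies at distance 1.0."""
    if not points:
        return []
    wx, wy = points[0]
    shifted = [(x - wx, y - wy) for x, y in points]
    scale = max(hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]


# Toy hand: wrist at (2, 2), two other key points.
norm = normalize_keypoints([(2.0, 2.0), (4.0, 2.0), (6.0, 5.0)])
# wrist maps to (0, 0); the farthest point maps to unit length
```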



Abstract

The invention provides a gesture recognition method based on deep learning. The method comprises an image collection step: collecting a video image, decoding it, and obtaining input video data; a neural network training step: training neural network models to perform hand position detection, gesture key point position extraction, and gesture classification based on gesture key points; a gesture recognition step: carrying out hand position detection and gesture key point position extraction on the input video data using the neural network models, and classifying gestures from the key points; and an instruction response step: obtaining a gesture operation instruction and having the system respond to it in real time. In the neural network training step, at least three neural network models are adopted, and a data set from an online open database and/or a self-sampled and calibrated data set is used as input samples to train the neural network models.
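The abstract describes a staged pipeline: detect the hand, extract key points, classify the gesture, then respond to the mapped instruction. The flow can be sketched as below, with hypothetical stub functions standing in for the trained networks and an illustrative gesture-to-command mapping (none of these names come from the patent):

```python
from typing import Callable, Dict, List, Optional, Tuple

Frame = List[List[int]]          # stand-in for a decoded image
Box = Tuple[int, int, int, int]  # hand bounding box (x, y, w, h)


def detect_hand(frame: Frame) -> Optional[Box]:
    """Stub for the hand-position detection network."""
    return (0, 0, 2, 2) if frame else None


def extract_keypoints(frame: Frame, box: Box) -> List[Tuple[int, int]]:
    """Stub for the key point extraction network."""
    x, y, w, h = box
    return [(x, y), (x + w, y + h)]


def classify_gesture(keypoints: List[Tuple[int, int]]) -> str:
    """Stub for the key point based gesture classifier."""
    return "open_palm" if len(keypoints) >= 2 else "unknown"


def respond(frame: Frame, commands: Dict[str, Callable[[], str]]) -> str:
    """Run the staged pipeline and dispatch the mapped instruction."""
    box = detect_hand(frame)
    if box is None:
        return "no hand"
    gesture = classify_gesture(extract_keypoints(frame, box))
    action = commands.get(gesture)
    return action() if action else f"unmapped gesture: {gesture}"


# Illustrative mapping from recognized gesture to operation instruction.
commands = {"open_palm": lambda: "volume up"}
result = respond([[0, 1], [1, 0]], commands)
```

Keeping the three models behind separate function boundaries mirrors the abstract's "at least three neural network models": each stage can be retrained or swapped independently.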

Description

Technical field

[0001] The invention relates to the field of deep learning image recognition, and in particular to realizing human-computer interaction through gesture recognition using target detection and target tracking methods.

Background technique

[0002] With the increasing intelligence of cars and roads, the way information is processed is constantly changing; automotive electronic equipment is booming, and a large number of electronic devices continue to enter the vehicle driving space. Complicated on-board operating equipment not only improves the driving safety and comfort of the vehicle, but also increases the amount and complexity of the driver's operations.

[0003] At present, human-computer interaction technology is gradually changing from a computer-centered to a human-centered interaction method, but the current driver control platform for public transportation vehicles often requires many command operations to be completed, and the control scr...


Application Information

IPC(8): G06K 9/00; G06K 9/62
CPC: G06V 40/28; G06V 40/107; G06F 18/2413
Inventors: 冯江华, 胡云卿, 林军, 丁驰, 刘悦, 袁浩, 游俊, 熊群芳, 岳伟
Owner: CSR ZHUZHOU ELECTRIC LOCOMOTIVE RES INST