Artificial hand control method based on deep learning

A control method using deep learning technology, applied to neural learning methods, prosthetics, medical science, etc. It addresses problems such as low classification accuracy and the difficulty of extracting motor imagery EEG signal features, achieving high accuracy, avoiding incomplete feature extraction, and saving labor.

Status: Inactive
Publication Date: 2019-05-10
Applicant: SOUTHEAST UNIV
Cites: 9 · Cited by: 7

AI Technical Summary

Problems solved by technology

[0005] The technical problem to be solved by the present invention is to overcome the deficiencies of the prior art, namely the difficulty of extracting motor imagery EEG signal features and the resulting low classification accuracy. The present invention provides a prosthetic hand control method based on deep learning, applying deep learning theory as a new means of signal feature extraction and classification in a BCI system, and thereby realizing natural prosthetic hand control.



Examples


Embodiment Construction

[0032] Embodiments of the present invention will be described below in conjunction with the accompanying drawings.

[0033] As shown in Figure 1, the present invention designs a prosthetic hand control method based on deep learning. The method specifically comprises the following steps:

[0034] Step 1: Select hand actions. The present invention specifically selects three types of hand actions: hand grip, finger pinch, and rotation, all of which are common actions in real-life scenarios.

[0035] Traditional prosthetic hand control relies on imagining movements of the left hand, right hand, feet, and tongue, but the imagined actions and the actions performed by the prosthetic hand are uncoordinated and unnatural. The present invention therefore selects three actions commonly used in daily life, namely grip, pinch, and rotate, as the classification targets, making control closer to real-life scenarios; a minimal encoding of these targets is sketched below.
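For concreteness, here is a minimal sketch of how these three classification targets could be encoded and mapped to prosthetic-hand commands. The enum and the `CMD_*` strings are hypothetical illustrations, not identifiers from the patent:

```python
from enum import Enum

class HandAction(Enum):
    """The three motor imagery classes selected in Step 1."""
    GRIP = 0    # whole-hand grip
    PINCH = 1   # finger pinch
    ROTATE = 2  # rotation

# Hypothetical mapping from a classifier's output index to a
# prosthetic-hand control command (command names are illustrative).
def to_command(class_index: int) -> str:
    return {
        HandAction.GRIP.value: "CMD_GRIP",
        HandAction.PINCH.value: "CMD_PINCH",
        HandAction.ROTATE.value: "CMD_ROTATE",
    }[class_index]
```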

[0036...



Abstract

The invention discloses an artificial hand control method based on deep learning. The method comprises the steps of: selecting the action types of the hand; collecting motor imagery electroencephalograms and dividing them into training samples and samples to be tested; preprocessing the motor imagery electroencephalograms with low-pass filtering and Laplace spatial filtering; extracting features from the training samples by wavelet transform to generate time-frequency two-dimensional images; constructing a convolutional neural network model with the time-frequency two-dimensional images as input and the motor imagery action type as output, training and tuning its parameters, and obtaining a trained model through multi-fold cross validation; and extracting features from a sample to be tested by wavelet transform, generating its time-frequency two-dimensional image, inputting it into the model, and obtaining the corresponding motor imagery action type, which is output as a control instruction that drives the artificial hand to complete the corresponding action. The method selects common hand actions from daily life as classification targets, which is closer to the natural state, uses the available information more fully, and offers higher stability and accuracy.
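The pipeline in the abstract can be sketched end to end. The following is a minimal sketch under stated assumptions: the 250 Hz sampling rate, 40 Hz low-pass cutoff, Morlet wavelet with 64 scales, neighbor montage, and CNN layout are illustrative defaults, not parameters confirmed by the patent; scipy, PyWavelets, and PyTorch stand in for whatever tools the inventors used.

```python
# Sketch of the abstract's pipeline for one EEG trial of shape
# (n_channels, n_samples).  All numeric parameters are assumptions.
import numpy as np
import pywt
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate (Hz)

def preprocess(trial: np.ndarray, neighbors: dict) -> np.ndarray:
    """Low-pass filter, then a surface-Laplacian spatial filter."""
    b, a = butter(4, 40 / (FS / 2), btype="low")   # 40 Hz cutoff (assumed)
    x = filtfilt(b, a, trial, axis=1)
    lap = x.copy()
    for ch, nbrs in neighbors.items():             # subtract neighbor mean
        lap[ch] = x[ch] - x[nbrs].mean(axis=0)
    return lap

def to_time_frequency(signal_1d: np.ndarray) -> np.ndarray:
    """Continuous wavelet transform -> 2-D time-frequency image."""
    scales = np.arange(1, 65)                      # 64 scales (assumed)
    coeffs, _ = pywt.cwt(signal_1d, scales, "morl")
    return np.abs(coeffs)                          # (n_scales, n_samples)

class MotorImageryCNN(nn.Module):
    """Small CNN: time-frequency image in, 3 action classes out."""
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 5), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 5), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Usage on a stand-in trial (8 channels, 3 s); the neighbor montage
# below is a made-up example, not the patent's electrode layout.
trial = np.random.randn(8, FS * 3)
clean = preprocess(trial, neighbors={3: [1, 2, 4, 5]})
image = to_time_frequency(clean[3])
model = MotorImageryCNN()
logits = model(torch.tensor(image, dtype=torch.float32)[None, None])
action = int(logits.argmax(dim=1))                 # e.g. 0=grip, 1=pinch, 2=rotate
```

In a full implementation, training of `MotorImageryCNN` would be wrapped in the multi-fold cross validation the abstract mentions, and the predicted class index would be translated into a control instruction for the prosthetic hand.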

Description

Technical field

[0001] The invention relates to a method for controlling a prosthetic hand based on deep learning, and belongs to the technical field of brain-computer equipment control.

Background technique

[0002] Brain-computer interface (BCI) technology takes the brain as the control center: a computer receives brain signals and, after processing and analysis, controls external equipment to execute the corresponding control instructions, offering users a new way of life. A basic BCI system includes the following four parts: signal acquisition, feature extraction, feature classification, and execution of control instructions. BCI technology has great research and application value in fields such as life assistance for the disabled, rehabilitation training for patients with limb injuries, game entertainment, and smart homes.

[0003] The realization of BCI applications depends on good accuracy and robustness of electroencephalogram (EEG)...
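As a reading aid, the four-part structure named in [0002] (signal acquisition, feature extraction, feature classification, execution of control instructions) can be written as a minimal control loop. The stage names and signatures below are hypothetical scaffolding, not the patent's API:

```python
from typing import Callable
import numpy as np

# Hypothetical signatures for the four BCI stages listed in [0002].
Acquire = Callable[[], np.ndarray]            # raw EEG for one trial
Extract = Callable[[np.ndarray], np.ndarray]  # trial -> features
Classify = Callable[[np.ndarray], int]        # features -> action class
Execute = Callable[[int], None]               # class -> control instruction

def bci_step(acquire: Acquire, extract: Extract,
             classify: Classify, execute: Execute) -> None:
    """One pass through the four-stage BCI pipeline."""
    execute(classify(extract(acquire())))
```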


Application Information

Patent Type & Authority: Application (China)
IPC(8): A61F 2/72; A61F 2/58; A61F 2/68; G06N 3/04; G06N 3/08
Inventors: 徐宝国, 张琳琳, 宋爱国, 何小杭, 魏智唯, 李文龙, 张大林, 李会军, 曾洪
Owner: SOUTHEAST UNIV