Motor imagery identification method and system fusing CNN-BiLSTM model and probability cooperation

A technology relating to motor imagery recognition methods, applied in character and pattern recognition, neural learning methods, biological neural network models, etc.

Pending Publication Date: 2019-10-08
QILU UNIV OF TECH

AI Technical Summary

Problems solved by technology

LSTM improves on the traditional RNN model: by adding mechanisms such as the forget gate, memory gate and output gate, it can effectively avoid gradient vanishing and explosion and thus learn longer-range dependencies. However, inferring later information only from earlier information is sometimes not enough. How to make better use of the LSTM model to extract the temporal features of the EEG signal, so as to achieve better results, is an urgent technical problem to be solved in the current technology.
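To make the point above concrete, the minimal PyTorch sketch below (an illustration added here, not code from the patent; all shapes and sizes are assumptions) contrasts a unidirectional LSTM, which propagates information only from earlier to later time steps, with a bidirectional LSTM, whose output at each step concatenates a forward pass and a backward pass and therefore carries both past and future context.

```python
# Minimal illustration of unidirectional vs. bidirectional LSTM over an EEG-like sequence (PyTorch).
# Batch size, sequence length and feature sizes are arbitrary assumptions for illustration only.
import torch
import torch.nn as nn

x = torch.randn(8, 200, 32)   # (batch, time steps, features per step)

uni = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
bi  = nn.LSTM(input_size=32, hidden_size=64, batch_first=True, bidirectional=True)

out_uni, _ = uni(x)   # (8, 200, 64):  each step sees only earlier steps
out_bi, _  = bi(x)    # (8, 200, 128): each step concatenates forward and backward passes
```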



Examples


Embodiment 1

[0076] As shown in Figure 1, the motor imagery recognition method fusing a CNN-BiLSTM model and probabilistic collaboration of the present invention captures and extracts the spatio-temporal depth features in the EEG signal with a CNN model fused with a BiLSTM network, inputs the captured and extracted spatio-temporal depth features into a ProCRC classifier for classification, and uses the test set data to evaluate the performance of the built CNN-BiLSTM model, thereby realizing recognition of the user's intention. The specific steps are as follows:
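As a rough illustration of such a fused feature extractor, the following PyTorch sketch stacks a temporal and a spatial convolution in front of a bidirectional LSTM. It is an assumption of this page: the layer counts, kernel sizes, channel number (118, matching Data set IVa) and hidden size are illustrative choices, not the parameters disclosed in the patent.

```python
# Hypothetical sketch of a CNN-BiLSTM feature extractor (PyTorch);
# channel counts, kernel sizes and hidden units are assumed, not taken from the patent.
import torch
import torch.nn as nn

class CNNBiLSTM(nn.Module):
    def __init__(self, n_channels=118, n_hidden=64):
        super().__init__()
        # CNN part: convolve over time, then over EEG channels, to capture spatial features
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 25)),          # temporal convolution
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),  # spatial convolution over channels
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
        )
        # BiLSTM part: read the resulting feature sequence forwards and backwards
        self.bilstm = nn.LSTM(input_size=32, hidden_size=n_hidden,
                              batch_first=True, bidirectional=True)

    def forward(self, x):
        # x: (batch, 1, n_channels, n_samples)
        feats = self.cnn(x)                        # (batch, 32, 1, T')
        feats = feats.squeeze(2).permute(0, 2, 1)  # (batch, T', 32) as a time sequence
        out, _ = self.bilstm(feats)                # (batch, T', 2 * n_hidden)
        return out[:, -1, :]                       # last step as the spatio-temporal depth feature

# Example use (assumed shapes): torch.randn(4, 1, 118, 400) -> feature tensor of shape (4, 128)
```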

[0077] S1. Collect the EEG signal; the EEG signal is a non-invasive EEG signal with weak amplitude, low signal-to-noise ratio, non-stationarity and nonlinearity, and is therefore difficult to process;

[0078] Data set IVa from the international BCI Competition III database was used in the experiment. Data set IVa is a two-class motor imagery task involving the right hand and feet, and E...
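For the classification step named in the method (feeding the extracted depth features to a ProCRC classifier), a simplified numpy sketch of a collaborative-representation style classifier is given below: it codes a test feature vector over the training dictionary with an l2-regularized least-squares fit and assigns the class with the smallest class-wise reconstruction residual. This only follows the general collaborative-representation idea behind ProCRC; the full probabilistic formulation and the parameter values used in the patent are not reproduced, and the regularizer `lam` is an assumption.

```python
# Simplified collaborative-representation classifier sketch (numpy).
# X: training features as columns (d, n); labels: class label per column; lam is an assumed regularizer.
import numpy as np

def crc_classify(X, labels, y, lam=0.01):
    # Code the test sample y jointly over all training samples (collaborative representation)
    A = X.T @ X + lam * np.eye(X.shape[1])
    alpha = np.linalg.solve(A, X.T @ y)
    # Class-wise reconstruction residuals: the smaller the residual, the more probable the class
    best_class, best_res = None, np.inf
    for c in np.unique(labels):
        idx = (labels == c)
        res = np.linalg.norm(y - X[:, idx] @ alpha[idx])
        if res < best_res:
            best_class, best_res = c, res
    return best_class

# Usage (assumed shapes): feats_train (d, n), train_labels (n,), y_test (d,)
# pred = crc_classify(feats_train, np.asarray(train_labels), y_test)
```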

Embodiment 2

[0131] As shown in Figure 6, the motor imagery recognition system fusing a CNN-BiLSTM model and probabilistic collaboration of the present invention comprises:

[0132] The EEG signal acquisition unit is used to collect EEG signals; the EEG signal is a non-invasive EEG signal with weak amplitude, low signal-to-noise ratio, non-stationarity and nonlinearity, and is difficult to process;

[0133] The deep neural network construction unit is used to extract the spatio-temporal depth features of the EEG signal through a convolutional neural network (CNN) in deep learning combined with a bidirectional long short-term memory network (BiLSTM);

[0134] Among them, the deep neural network construction unit comprises:

[0135] The CNN model construction subunit is used to extract the spatial features in the EEG signal through a convolutional neural network;

[0136] The BiLSTM model construction subunit is used to construct a two-w...
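The unit/subunit organization described above can be pictured as a thin wrapper around the sketches given under Embodiment 1. The Python sketch below reuses the hypothetical `CNNBiLSTM` module and `crc_classify` function from those earlier blocks; all class and method names here are illustrative assumptions, not terminology from the patent.

```python
# Hypothetical wiring of the system units described above (Python).
# Reuses the CNNBiLSTM and crc_classify sketches from Embodiment 1; names are illustrative only.
import numpy as np

class EEGAcquisitionUnit:
    def acquire(self):
        # Placeholder: in practice this would read trials from an EEG amplifier or a dataset file
        return np.random.randn(10, 1, 118, 400).astype("float32")  # (trials, 1, channels, samples)

class DeepNetworkConstructionUnit:
    def build(self):
        return CNNBiLSTM()       # CNN subunit + BiLSTM subunit, as sketched in Embodiment 1

class ClassifierConstructionUnit:
    def build(self):
        return crc_classify      # collaborative-representation classifier sketch from above

class ModelTestEvaluationUnit:
    def evaluate(self, preds, labels):
        return float(np.mean(np.array(preds) == np.array(labels)))  # simple accuracy on the test set
```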



Abstract

The invention discloses a motor imagery recognition method and system fusing a CNN-BiLSTM model and probabilistic collaboration, belonging to the field of human-machine interfaces. The invention aims to solve the technical problems of how to extract features effectively and how to realize classification quickly. The method adopts the following technical scheme: it captures and extracts the spatio-temporal depth features in an EEG signal with a CNN model fused with a BiLSTM network, inputs the captured and extracted spatio-temporal depth features into a ProCRC classifier for classification, and uses the test set data to evaluate the performance of the constructed CNN-BiLSTM model, thereby achieving intent recognition. The method comprises the following specific steps: S1, acquiring an electroencephalogram (EEG) signal; S2, constructing a deep neural network; S3, establishing a classifier; and S4, testing and evaluating the model. The system comprises an EEG signal acquisition unit, a deep neural network construction unit, a classifier construction unit, and a model test and evaluation unit.

Description

technical field

[0001] The present invention relates to the field of brain-computer interfaces, and more particularly to the classification and recognition of motor-imagery-based electroencephalogram (EEG) signals, combining a convolutional neural network and a bidirectional long short-term memory model with a probabilistic collaborative representation classifier; specifically, a motor imagery recognition method and system fusing a CNN-BiLSTM model and probabilistic collaboration.

Background technique

[0002] A brain-computer interface (Brain-Computer Interface, BCI) is a special communication system that does not rely on the peripheral nerves and muscle tissue of the human body and can realize communication between the human brain and external devices. This kind of communication requires no language or motor control and belongs to the category of brain-control technology. In recent years, BCI technology has attracted the attention of many scholars at home a...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045; G06F2218/08; G06F2218/12; G06F18/241; G06F18/25
Inventor: 徐舫舟, 许晓燕, 郑文风, 苗芸菁, 荣芬奇
Owner: QILU UNIV OF TECH