Parallel convolutional neural network motor imagery electroencephalogram classification method based on spatial-temporal feature fusion

A convolutional-network and motor-imagery technology, applied to biological neural network models, image data processing, and graphics and image conversion, to achieve the effect of improving classification performance

Active Publication Date: 2020-04-17
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

However, because EEG acquisition equipment records only time-series channel data, previous algorithms led most researchers to focus mainly on how to extract EEG features from the time series




Embodiment Construction

[0056] The technical solutions in the embodiments of the present invention will be described clearly and in detail below in conjunction with the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the invention.

[0057] The technical solution by which the present invention solves the above technical problems is as follows:

[0058] As shown in the figures, the motor imagery EEG feature extraction and classification method based on spatio-temporal feature fusion provided by this embodiment includes the following steps:

[0059] Step 1: Preprocess the raw data. The original EEG channel data obtained from an experiment generally contain noise such as electromyographic (EMG) and electrooculographic (EOG) artifacts and are not suitable for direct network training. Therefore, BCI researchers perform a series of data processing steps to improve the signal-to-noise ratio before feature extraction, such as high-pass filter...
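The filtering step above can be sketched as follows. This is a minimal illustration, not the patent's exact preprocessing: the sampling rate (250 Hz), filter order, and the 4-36 Hz pass band (matching the theta-beta range used later) are assumptions, since the patent text here is truncated.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(data, fs=250.0, low=4.0, high=36.0, order=4):
    """Zero-phase Butterworth band-pass filter applied per channel.

    data : ndarray of shape (n_channels, n_samples)
    fs   : sampling rate in Hz (assumed; the patent does not specify it)
    """
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    # filtfilt runs the filter forward and backward, so no phase lag
    return filtfilt(b, a, data, axis=-1)

# Toy example: a 10 Hz rhythm (inside the band) plus a slow 0.2 Hz drift
fs = 250.0
t = np.arange(0, 4, 1 / fs)
raw = np.vstack([np.sin(2 * np.pi * 10 * t) + 2.0 * np.sin(2 * np.pi * 0.2 * t)
                 for _ in range(3)])          # 3 channels, 4 s each
clean = bandpass_eeg(raw, fs=fs)
print(clean.shape)                            # (3, 1000)
```

The zero-phase `filtfilt` call is a common choice in offline BCI pipelines because it removes the phase distortion a causal filter would introduce.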



Abstract

The invention provides a parallel convolutional neural network motor imagery electroencephalogram classification method based on spatial-temporal feature fusion. The invention takes motor imagery electroencephalogram (EEG) signals as the research object and provides a novel deep network model, a parallel convolutional neural network, for extracting their spatial-temporal features. Unlike traditional EEG classification algorithms, which often discard spatial feature information, theta waves (4-8 Hz), alpha waves (8-12 Hz), and beta waves (12-36 Hz) are extracted through the fast Fourier transform to generate a 2D EEG feature map; the feature map is trained with a multi-convolutional neural network to extract spatial features; in parallel, a temporal convolutional neural network is trained to extract time-order features; finally, the spatial and time-order features are fused and classified with Softmax. Experimental results show that the parallel convolutional neural network achieves good recognition accuracy and is superior to other state-of-the-art classification algorithms.
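The FFT band-extraction step described above can be sketched as follows. This is a minimal numpy illustration of computing per-channel theta/alpha/beta power from the FFT spectrum; the sampling rate (250 Hz) and window length are assumptions, and the patent's method additionally maps these band powers onto a 2D scalp layout before the spatial CNN, which is not shown here.

```python
import numpy as np

FS = 250.0                                    # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 36)}

def band_powers(trial, fs=FS):
    """Mean FFT power per frequency band for each channel.

    trial : ndarray of shape (n_channels, n_samples)
    returns ndarray (n_channels, 3), ordered theta, alpha, beta
    """
    freqs = np.fft.rfftfreq(trial.shape[-1], d=1.0 / fs)
    spec = np.abs(np.fft.rfft(trial, axis=-1)) ** 2   # power spectrum
    out = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        out.append(spec[:, mask].mean(axis=-1))
    return np.stack(out, axis=-1)

# Toy trial: channel 0 carries a 10 Hz (alpha) rhythm,
# channel 1 a 20 Hz (beta) rhythm
t = np.arange(0, 2, 1 / FS)
trial = np.vstack([np.sin(2 * np.pi * 10 * t),
                   np.sin(2 * np.pi * 20 * t)])
bp = band_powers(trial)
print(bp.argmax(axis=-1))                     # [1 2]: alpha wins on ch 0, beta on ch 1
```

Arranging the three band-power values of every electrode on a grid reflecting electrode positions is what turns this output into the 2D feature map the spatial CNN consumes.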

Description

technical field

[0001] The invention belongs to the field of motor imagery EEG classification, and in particular relates to a parallel convolutional neural network motor imagery EEG recognition method based on spatio-temporal feature fusion.

Background technique

[0002] As a comprehensive reflection of the physiological activity of scalp brain cells, EEG contains a large amount of physiological and disease information. A brain-computer interface (BCI) system based on EEG signal communication can replace brain nerves and muscle tissue as a signal transmission channel, thereby realizing interaction between the brain and bionic machinery. As an extension of human-computer interaction, BCI has attracted wide attention from scholars and researchers in the scientific community. EEG recognition based on motor imagery is a key node in a BCI system's interaction with the outside world. Motor imagery is imagined subjectively by the human brain, such as imagi...


Application Information

Patent Type & Authority: Application (China)
IPC(8): A61B5/0476; A61B5/00; G06K9/62; G06T3/40; G06N3/04
CPC: A61B5/7235; A61B5/7253; A61B5/7267; G06T3/4007; A61B5/7257; A61B5/7203; A61B5/369; G06N3/045; G06F18/214; G06F18/24; G06F18/253; Y02A90/10
Inventors: 唐贤伦, 孔德松, 邹密, 刘行谋, 马伟昌, 李伟, 王婷, 彭德光, 李锐
Owner: CHONGQING UNIV OF POSTS & TELECOMM