
Scalp EEG feature extraction and classification method based on end-to-end convolutional neural network

A convolutional-neural-network-based classification technology, applied in the field of scalp EEG feature extraction and classification using an end-to-end convolutional neural network. It addresses the problems of limited data volume, time- and labor-intensive data collection, and the large parameter count of deep neural networks, achieving high classification accuracy, a simple structure, and enhanced network robustness.

Active Publication Date: 2020-09-25
周军

AI Technical Summary

Problems solved by technology

[0004] However, the currently proposed methods based on deep neural networks still have the following problems. First, owing to the great success of convolutional neural networks (CNNs) in computer vision, many studies have applied CNNs to EEG motor imagery recognition; however, the CNNs used in existing work have a fixed convolution scale (that is, a fixed convolution kernel size). Because the convolution scale that yields the highest classification accuracy differs from person to person, a fixed convolution scale limits classification accuracy. Second, deep neural networks are complex and have a huge number of parameters, so training them requires a large amount of data; yet high-quality EEG data must be collected in an environment with low electromagnetic interference, which is very time-consuming and labor-intensive. The resulting limited data volume makes the network prone to overfitting.




Embodiment

[0036] As shown in figure 2, the scalp EEG feature extraction and classification method based on an end-to-end convolutional neural network collects raw scalp EEG signals as training data in the network training stage. Data augmentation is first applied to the collected training data; the augmented training data is then passed through three band-pass filters; finally, the filtered training data is used to train the convolutional neural network. Specifically, the convolutional neural network is trained with the BP (back-propagation) algorithm.
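The patent excerpt calls its augmentation a "brand-new data enhancement technology" without detailing it, so as a stand-in the sketch below uses sliding-window cropping, a common EEG augmentation; the trial count, channel count, crop length, and stride are all illustrative assumptions, not values from the patent.

```python
import numpy as np

def crop_augment(trials, crop_len, stride):
    """Sliding-window cropping: cut each trial (channels x time) into
    overlapping crops, multiplying the number of training examples.
    Stand-in for the patent's unspecified augmentation technique."""
    crops = []
    for trial in trials:
        n_ch, n_t = trial.shape
        for start in range(0, n_t - crop_len + 1, stride):
            crops.append(trial[:, start:start + crop_len])
    return np.stack(crops)

# Assumed example data: 10 trials, 22 EEG channels, 1000 time samples each
rng = np.random.default_rng(0)
trials = rng.standard_normal((10, 22, 1000))
augmented = crop_augment(trials, crop_len=500, stride=125)
print(augmented.shape)  # each trial yields 5 crops -> (50, 22, 500)
```

Each augmented crop inherits its trial's class label, so the downstream band-pass filtering and CNN training described above operate on the enlarged set unchanged.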

[0037] In the data detection stage, the collected data to be detected are input into the trained convolutional neural network for feature extraction and classification, including the following steps:

[0038] S1. Denote the original scalp EEG signal by x_raw, and filter it with three band-pass filters; the resulting signals are denoted x_θ, x_μ and x_β respectively;

[0039] S2. For the filtered scalp EEG signals x_θ, x_μ and ...
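Step S1's filter bank can be sketched as below. The excerpt does not state the band edges or sampling rate; conventional EEG bands (θ: 4–8 Hz, μ: 8–13 Hz, β: 13–30 Hz) and a 250 Hz sampling rate are assumed here for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass along the time axis."""
    b, a = butter(order, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, x, axis=-1)

fs = 250  # assumed sampling rate (Hz)
rng = np.random.default_rng(1)
x_raw = rng.standard_normal((22, 1000))  # channels x time (assumed sizes)

x_theta = bandpass(x_raw, 4, 8, fs)      # θ band
x_mu    = bandpass(x_raw, 8, 13, fs)     # μ band
x_beta  = bandpass(x_raw, 13, 30, fs)    # β band
print(x_theta.shape, x_mu.shape, x_beta.shape)
```

Zero-phase filtering (`filtfilt`) preserves the temporal alignment across the three bands, so the three filtered signals can be fed to parallel network branches that share the same time axis.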



Abstract

The invention discloses a scalp EEG feature extraction and classification method based on an end-to-end convolutional neural network. Data augmentation is applied to the training data, and the augmented training data is used to train the convolutional neural network; the data to be detected is then input into the trained network for feature extraction and classification in the following steps: S1, filter the original scalp EEG signal with a bank of band-pass filters to obtain the signals x_θ, x_μ and x_β; S2, apply multi-scale temporal convolution and spatial convolution to x_θ, x_μ and x_β to extract features; S3, apply a pooling operation to the feature maps output by the convolutional layers; S4, fuse the pooled features and send them to the fully connected layer, which integrates the abstract features; S5, send the output of the fully connected layer to the softmax layer for classification. The invention applies a new data augmentation technique in the training stage; in the test stage, the data first passes through the filter bank and is then fed into multiple convolutional neural network branches for multi-scale convolution, which reduces overfitting and improves classification accuracy.
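The dataflow of steps S2–S5 can be sketched end to end as below. This is a shape-level illustration only: the weights are random rather than trained by BP, and the kernel scales, filter counts, pooling width, and class count are assumptions, since the excerpt does not specify them.

```python
import numpy as np

def temporal_conv(x, kernel_len, n_filters, rng):
    """Convolve each channel along time with n_filters kernels.
    Weights are random here; the real network learns them via BP."""
    n_ch, n_t = x.shape
    w = rng.standard_normal((n_filters, kernel_len)) * 0.1
    out = np.empty((n_filters, n_ch, n_t - kernel_len + 1))
    for f in range(n_filters):
        for c in range(n_ch):
            out[f, c] = np.convolve(x[c], w[f], mode="valid")
    return out

def spatial_conv(fmap, rng):
    """Collapse the electrode axis with one spatial filter per feature map."""
    n_f, n_ch, _ = fmap.shape
    w = rng.standard_normal((n_f, n_ch)) * 0.1
    return np.einsum("fct,fc->ft", fmap, w)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(2)
bands = [rng.standard_normal((22, 500)) for _ in range(3)]  # x_θ, x_μ, x_β
scales = [25, 50, 100]   # assumed multi-scale temporal kernel lengths
pool = 25                # assumed average-pooling width
features = []
for x in bands:
    for k in scales:     # S2: multi-scale temporal + spatial convolution
        fmap = spatial_conv(temporal_conv(x, k, n_filters=4, rng=rng), rng)
        trimmed = fmap[:, : (fmap.shape[1] // pool) * pool]
        pooled = trimmed.reshape(fmap.shape[0], -1, pool).mean(axis=2)  # S3
        features.append(pooled.ravel())
fused = np.concatenate(features)            # S4: feature fusion
n_classes = 4
w_fc = rng.standard_normal((n_classes, fused.size)) * 0.01
probs = softmax(w_fc @ fused)               # S5: softmax classification
print(probs.shape)
```

Running several kernel lengths in parallel branches is what lets the network cover the per-subject variation in optimal convolution scale that paragraph [0004] identifies as the weakness of a single fixed kernel size.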

Description

Technical field

[0001] The invention relates to the technical field of brain-computer interfaces, and in particular to a scalp EEG feature extraction and classification method based on an end-to-end convolutional neural network.

Background technique

[0002] A brain-computer interface is a communication system that helps people use their brains to control external devices without the involvement of peripheral nerves and muscles. This brings hope to patients with normal cognition but severe neuromuscular damage: through a brain-computer interface they can regain the ability to move or to communicate with their environment, improving their quality of life. Although the original purpose of brain-computer interface systems was to provide a new way of communication for completely paralyzed patients, in recent years they have also shown great potential in entertainment and gaming. There have been many studies on the application of brain-computer interfaces to some popular...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06K9/00
CPC: G06F2218/12
Inventors: 周军, 代光海
Owner: 周军