Emotion electroencephalogram signal classification method based on cross-connection type convolutional neural network

A convolutional neural network and EEG signal technology, applied to the deep learning classification of emotional EEG signals, which addresses the problem that deep learning has so far been little used for EEG signal recognition.

Publication Date: 2020-02-11 (Inactive)
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

At present, there are few studies at home and abroad on applying deep learning to EEG signal recognition, leaving considerable room for research.



Examples


Embodiment Construction

[0052] As shown in Figure 1, this embodiment includes the following steps:

[0053] Step 1: Acquire the EEG signals recorded while humans display different emotions and apply low-pass filtering. The specific process is as follows:

[0054] (1) Record the EEG signals and peripheral physiological signals of 32 subjects after each watched 40 one-minute music video clips. The dataset is divided into 9 labels: Depressed, Calm, Relaxed, Sad, Peaceful, Joyful, Distressed, Excited, Satisfied. Each piece of data has dimension 40×40×8064: the first 40 corresponds to the 40 different music videos watched by each subject, the second 40 is the number of EEG channels, and 8064 is the number of recorded EEG data points. Each piece of data carries a corresponding emotion label, represented by the Arabic numerals 0-8.

[0055] (2) A low-pass filter with a 0-30 Hz pass band is applied to the EEG signal to remove high-frequency noise interference.
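
As an illustration of this step, below is a minimal SciPy sketch of the 0-30 Hz low-pass filtering applied to recordings shaped as described in step 1. The 128 Hz sampling rate (8064 samples would correspond to about 63 s at 128 Hz) and the order-4 Butterworth filter are assumptions; the patent only specifies the 0-30 Hz pass band.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_filter(eeg, cutoff_hz=30.0, fs=128.0, order=4):
    """Zero-phase Butterworth low-pass filter along the sample axis.

    fs and order are assumptions; the patent only states a 0-30 Hz band.
    """
    nyquist = fs / 2.0
    b, a = butter(order, cutoff_hz / nyquist, btype="low")
    return filtfilt(b, a, eeg, axis=-1)

# One subject's recordings: 40 videos x 40 channels x 8064 samples,
# with one emotion label (0-8) per video, as described in step 1.
data = np.random.randn(40, 40, 8064)        # placeholder EEG data
labels = np.random.randint(0, 9, size=40)   # placeholder emotion labels
filtered = lowpass_filter(data)             # same shape, 0-30 Hz retained
```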

[0056] Step 2: Input the EEG signal...



Abstract

The invention provides an emotion EEG signal classification method based on a cross-connection (bridged) convolutional neural network. First, the first convolutional layer of V3 extracts bottom-layer features from the EEG signal; these bottom-layer features serve as the input of V1 and, after being down-sampled by the second pooling layer, are fed to the third convolutional layer to extract middle-layer features. The middle-layer features serve as the input of V2 and, after being down-sampled by the fourth pooling layer of V3, are fed to the fifth convolutional layer of V3 to extract high-layer features. The three levels of features are then separately reduced in dimensionality and input to the eighth fully connected layer of V3 for fusion, and finally pass through a Softmax layer for classification. The classification result is compared with the actual label to compute a loss value, and the convolution kernels and connection weights are updated with the back-propagation algorithm. The method achieves high EEG classification accuracy, and its recognition results are superior to those of traditional machine learning methods and a conventional CNN model.
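
For illustration, here is a minimal PyTorch sketch of the bridged topology the abstract describes. Only the skeleton follows the text (bottom-, middle- and high-layer convolutional features fused in one fully connected layer before Softmax); the channel counts, kernel sizes, input layout and the pooling used for per-branch dimensionality reduction are assumptions.

```python
import torch
import torch.nn as nn

class CrossConnectionCNN(nn.Module):
    """Illustrative sketch of the cross-connection (bridged) structure.

    conv1-pool2-conv3-pool4-conv5 form the main path (V3 in the text); the skip
    paths carrying the bottom- and middle-layer features to the fusion layer
    play the roles of V1 and V2. Layer sizes are assumptions, not patent values.
    """

    def __init__(self, in_channels=1, num_classes=9):
        super().__init__()
        self.conv1 = nn.Sequential(nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU())  # bottom-layer features
        self.pool2 = nn.MaxPool2d(2)
        self.conv3 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())           # middle-layer features
        self.pool4 = nn.MaxPool2d(2)
        self.conv5 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU())           # high-layer features
        self.reduce = nn.AdaptiveAvgPool2d(1)             # per-branch dimensionality reduction (assumed)
        self.fc8 = nn.Linear(16 + 32 + 64, num_classes)   # fusion of the three feature levels

    def forward(self, x):
        low = self.conv1(x)
        mid = self.conv3(self.pool2(low))
        high = self.conv5(self.pool4(mid))
        # Cross connections: all three feature levels are reduced and fused.
        fused = torch.cat([self.reduce(f).flatten(1) for f in (low, mid, high)], dim=1)
        return self.fc8(fused)  # Softmax / cross-entropy are applied by the training loss

# Example: a batch of 8 EEG segments arranged as hypothetical 1 x 40 x 128
# "images" (channels x time samples).
logits = CrossConnectionCNN()(torch.randn(8, 1, 40, 128))
print(logits.shape)  # torch.Size([8, 9])
```

During training, the logits would be compared with the true labels by a cross-entropy loss, and the kernels and connection weights updated by backpropagation, as the abstract states.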

Description

Technical field

[0001] The invention belongs to the field of deep learning and relates to a deep-learning-based EEG classification method, in particular to a deep learning classification method for emotional EEG signals.

Background technique

[0002] Emotion recognition is a research direction receiving increasing attention in the field of artificial intelligence. It mainly covers recognition from facial expressions, voice, physiological patterns, text and physiological signals, among which the electroencephalogram (EEG) is the most informative physiological signal and has received growing research attention. When traditional feature-based machine learning classification methods are used, classification performance depends on the quality of the selected features. However, because EEG signals are non-stationary and differ greatly between individuals, it is difficult to find a unified, representative feature, which limits the accuracy of...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06N3/045, G06F2218/08, G06F2218/12, G06F18/253
Inventors: 孙紫阳, 席旭刚, 邱宇晗, 华仙, 刘晓云, 姜文俊
Owner: HANGZHOU DIANZI UNIV