A visual-auditory evoked emotion recognition method and system based on an EEG signal

An EEG signal and emotion recognition technology, applied to medical science, psychological devices, sensors, etc. It addresses the problem of low emotion recognition accuracy, and achieves the effects of improving classification accuracy and eliminating redundant features.

Active Publication Date: 2019-01-15
WUHAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0007] In view of this, the present invention provides a visual-auditory evoked emotion recognition method and system based on EEG signals t...



Examples


Embodiment 1

[0040] This embodiment provides a visual-auditory evoked emotion recognition method based on EEG signals; referring to Figure 1, the method includes:

[0041] First, step S1 is performed: collecting the EEG signals generated by visual-auditory evocation.

[0042] Specifically, the electroencephalogram (EEG) signal is the overall reflection, on the surface of the cerebral cortex or scalp, of the electrophysiological activity of brain nerve cells. In this embodiment, the generation of EEG signals is induced by auditory stimulation and visual stimulation. The EEG acquisition setup is shown in Figure 3, a schematic diagram of the signal acquisition instrument and the EEG cap.

[0043] Then step S2 is performed: performing preprocessing through a band-pass filter to obtain multi-channel EEG signals.

[0044] Specifically, the EEG signal can be filtered at 4-45 Hz through a band-pass filter, which eliminates clutter and interference in other frequency bands and improves the signal-...
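As an illustration of this preprocessing step (a sketch, not the patent's exact implementation), the following Python snippet applies a 4-45 Hz Butterworth band-pass filter to multi-channel EEG with SciPy; the sampling rate, filter order, and array layout are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(eeg, fs=256.0, low=4.0, high=45.0, order=4):
    """Band-pass filter multi-channel EEG in the 4-45 Hz band.

    eeg: ndarray of shape (n_channels, n_samples); fs is the sampling rate in Hz.
    The 4-45 Hz band follows the preprocessing described above; fs and the
    filter order are illustrative assumptions.
    """
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    # Zero-phase filtering avoids adding phase distortion to the EEG.
    return filtfilt(b, a, eeg, axis=-1)

# Example: 32 channels, 10 s of simulated EEG sampled at 256 Hz
raw = np.random.randn(32, 2560)
filtered = bandpass_eeg(raw)
print(filtered.shape)  # (32, 2560)
```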

Embodiment 2

[0122] This embodiment provides a visual-auditory evoked emotion recognition system based on EEG signals; referring to Figure 8, the system consists of:

[0123] A signal collection module 801, configured to collect electroencephalogram signals generated by visual-auditory evocation;

[0124] A preprocessing module 802, configured to perform preprocessing through a bandpass filter to obtain multi-channel EEG signals;

[0125] A feature extraction module 803, configured to perform non-uniformly sampled multivariate empirical mode decomposition on the multi-channel EEG signals and select effective intrinsic mode functions to extract the features of the emotional EEG signals (an illustrative sketch follows these module descriptions);

[0126] A feature screening module 804, configured to use the sequential floating forward selection method as the search strategy for adding and removing features, to combine filter and wrapper criteria as the evaluation standard for the optimal feature subset, and to screen the extracted emotional EEG features...
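The feature extraction module above could be sketched roughly as follows. Because the patent's non-uniformly sampled multivariate EMD is not available in common Python packages, this hedged example substitutes channel-by-channel EMD from PyEMD and computes two illustrative per-IMF features (energy and Gaussian differential entropy); the IMF count and feature choices are assumptions, not the patent's exact method.

```python
import numpy as np
from PyEMD import EMD  # pip install EMD-signal

def imf_features(eeg, n_imfs=4):
    """Extract simple features from the leading IMFs of each EEG channel.

    eeg: ndarray of shape (n_channels, n_samples). Per-channel EMD is used
    here as a stand-in for the multivariate EMD described in the patent.
    Returns a flat feature vector: (energy, differential entropy) per IMF.
    """
    emd = EMD()
    feats = []
    for channel in eeg:
        imfs = emd(channel)  # array of shape (n_found_imfs, n_samples)
        for k in range(min(n_imfs, imfs.shape[0])):
            imf = imfs[k]
            energy = float(np.sum(imf ** 2))
            # Differential entropy under a Gaussian assumption on the IMF.
            diff_entropy = 0.5 * np.log(2.0 * np.pi * np.e * np.var(imf) + 1e-12)
            feats.extend([energy, diff_entropy])
    return np.asarray(feats)
```

A per-trial feature vector built this way can then be handed to the screening and classification stages described in the abstract below.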

Embodiment 3

[0136] Based on the same inventive concept, the present application also provides a computer-readable storage medium 900; referring to Figure 9, a computer program 911 is stored thereon, and the method of Embodiment 1 is implemented when the program is executed.



Abstract

The invention provides a visual-auditory evoked emotion recognition method and system based on an EEG signal. The method comprises the following steps. First, the EEG signals generated by visual-auditory evocation are collected and preprocessed through a band-pass filter. The multi-channel EEG signals are then decomposed by multivariate empirical mode decomposition, and the effective intrinsic mode functions are selected to extract the features of the emotional EEG signals. Next, the sequential floating forward selection method is used as the search strategy for adding and removing features, and a combination of filter and wrapper criteria serves as the evaluation standard for the optimal feature subset, screening the extracted emotional EEG features. The selected feature subset is then input to a support vector machine for classification, and the classification results are obtained. Finally, the emotion recognition results are derived from the classification results, realizing emotion recognition. On the basis of exploring the laws of emotional EEG, the invention investigates emotion recognition methods under multiple induction modes and effectively improves recognition accuracy.
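A minimal sketch of the screening and classification stages described in this abstract, assuming scikit-learn and mlxtend: a mutual-information filter pre-selects candidate features, sequential floating forward selection (SFFS) with an SVM wrapper picks the final subset, and the SVM is scored by cross-validation. The subset sizes, kernel, and scoring choices are illustrative assumptions rather than the patent's exact settings.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from mlxtend.feature_selection import SequentialFeatureSelector as SFS

def select_and_classify(X, y, n_filter=40, n_wrapper=10):
    """Filter + wrapper feature screening followed by SVM classification.

    X: (n_trials, n_features) emotional EEG features; y: emotion labels.
    """
    # Filter stage: rank features by mutual information with the labels.
    filt = SelectKBest(mutual_info_classif, k=min(n_filter, X.shape[1])).fit(X, y)
    X_f = filt.transform(X)

    # Wrapper stage: sequential floating forward selection around an SVM.
    svm = SVC(kernel="rbf", C=1.0)
    sffs = SFS(svm, k_features=min(n_wrapper, X_f.shape[1]),
               forward=True, floating=True, scoring="accuracy", cv=5)
    sffs = sffs.fit(X_f, y)
    X_sel = sffs.transform(X_f)

    # Classification: cross-validated accuracy of the SVM on the selected subset.
    acc = cross_val_score(svm, X_sel, y, cv=5).mean()
    return sffs.k_feature_idx_, acc
```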

Description

Technical field

[0001] The invention relates to the technical field of EEG signal processing, and in particular to a visual-auditory evoked emotion recognition method and system based on EEG signals.

Background technique

[0002] Automatic emotion recognition through effective means is of great significance to advanced human-computer interaction systems. Fast and accurate emotion recognition can make the human-computer interaction process more friendly and intelligent. Therefore, emotion recognition has become a hot topic in the fields of computer science and artificial intelligence.

[0003] Human facial expressions, voice, body movements and various physiological signals can reflect psychological state and emotional changes to a certain extent, and can be used as signal sources for emotion recognition. However, external characteristics such as behavioral, language and expression characteristics can, to a large extent, be influenced and controlle...


Application Information

IPC(8): A61B5/16, A61B5/0484
CPC: A61B5/165, A61B5/7264, A61B5/378, A61B5/38
Inventor: 陈昆 (Chen Kun), 艾青松 (Ai Qingsong), 刘泉 (Liu Quan), 何悦 (He Yue)
Owner: WUHAN UNIV OF TECH