Emotion recognition method and system based on deep learning model and long short-term memory network

An emotion recognition and deep learning technology, applied to the recognition of patterns in signals, character and pattern recognition, and instruments. It addresses the problems that existing methods cannot generate visual representations of EEG signals and cannot learn EEG signals well, with the effect of improving accuracy and reducing subjective factors.

Active Publication Date: 2019-01-25
刘仕琪

AI Technical Summary

Benefits of technology

This patent describes an approach that uses deep neural network models to learn patterns from large amounts of raw brain electrical activity (EEG) without those patterns being explicitly specified. The EEG signals are converted into images; a generative (variational) encoder reconstructs these images, and a recurrent model learns their spatial and temporal structure to recognize the emotional state being experienced, such as positive or negative feelings. Overall, the approach identifies emotional states accurately while reducing the subjective factors involved in the analysis.

Problems solved by technology

The technical problem addressed by this patent is recognizing the emotions a person experiences while interacting with their environment. Existing approaches do not have the ability to generate visual representations of EEG signals, cannot learn the complex structure of EEG signals well, and are strongly affected by subjective factors during analysis.



Examples


Embodiment

[0076] First: EEG signal data preprocessing and division of data sets

[0077] DEAP is selected as the dataset for training the model. Note that the method is not limited to this specific EEG dataset, nor is it limited by the number of EEG channels, the number of emotion categories, or the way the data set is divided. DEAP is a public multimodal (e.g. EEG, video) dataset: EEG signals were recorded from 32 channels for 32 participants while each participant watched 40 videos of 63 s each. The EEG data were preprocessed, down-sampled to 128 Hz, and filtered to the 4-45 Hz frequency band. To convert the signals into images, a Fast Fourier Transform (FFT) is applied to each 1-second EEG segment. In this experiment, the alpha (8-13 Hz), beta (13-30 Hz) and gamma (30-45 Hz) bands are used as the frequency bands representing the relevant brain activity. The next step is a transformation using the Azimuthal Equidistant Projection (AEP) and the Clough-Tocher interpolation scheme, resulting in three 32x32 pixel images corresponding to the three frequency bands…
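A minimal sketch of this preprocessing pipeline is given below, assuming NumPy/SciPy. The electrode 3-D coordinates, the band-power definition, and the grid extent are placeholder assumptions; only the overall flow (1-second FFT, alpha/beta/gamma band power, azimuthal equidistant projection, Clough-Tocher interpolation onto a 32x32 grid) follows the description above.

```python
# Sketch of the EEG-to-image preprocessing described above (not the patent's exact code).
# Electrode 3-D coordinates would come from a real montage; random placeholders are used here.
import numpy as np
from scipy.interpolate import CloughTocher2DInterpolator

FS = 128                                              # sampling rate after down-sampling (Hz)
BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def azimuthal_equidistant(xyz):
    """Project 3-D electrode positions onto a 2-D plane (azimuthal equidistant projection)."""
    x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    r = np.linalg.norm(xyz, axis=1)
    elev = np.arcsin(z / r)                           # elevation angle
    az = np.arctan2(y, x)                             # azimuth angle
    rho = np.pi / 2 - elev                            # angular distance from the top of the head
    return np.stack([rho * np.cos(az), rho * np.sin(az)], axis=1)

def band_power(window, fs=FS):
    """FFT power of one 1-second window, summed inside each band; window: (channels, samples)."""
    freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / fs)
    spec = np.abs(np.fft.rfft(window, axis=-1)) ** 2
    return np.stack([spec[:, (freqs >= lo) & (freqs < hi)].sum(axis=-1)
                     for lo, hi in BANDS.values()], axis=0)   # (3, channels)

def eeg_window_to_image(window, pos2d, size=32):
    """Turn one 1-second, 32-channel window into three 32x32 band-power images."""
    powers = band_power(window)                       # (3, 32)
    gx, gy = np.meshgrid(np.linspace(pos2d[:, 0].min(), pos2d[:, 0].max(), size),
                         np.linspace(pos2d[:, 1].min(), pos2d[:, 1].max(), size))
    images = []
    for band in powers:                               # Clough-Tocher interpolation per band
        interp = CloughTocher2DInterpolator(pos2d, band, fill_value=0.0)
        images.append(interp(gx, gy))
    return np.stack(images)                           # (3, 32, 32)

# Usage with placeholder data: 32 channels, 1 s at 128 Hz, random electrode positions.
rng = np.random.default_rng(0)
pos2d = azimuthal_equidistant(rng.normal(size=(32, 3)))
imgs = eeg_window_to_image(rng.normal(size=(32, FS)), pos2d)
print(imgs.shape)  # (3, 32, 32): one image per frequency band, as in the description above
```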



Abstract

The invention discloses an emotion recognition method and system based on a deep learning model and a long short-term memory network. The method comprises the following steps: the EEG signal data are preprocessed and divided into data sets; a network model is constructed, comprising an image reconstruction model composed of a variational encoder and an emotion recognition model composed of a long short-term memory network; an objective function is constructed according to the network model; the network model is trained on the training set, the objective function is optimized with the Adam optimizer in the neural network, and a trained network model is obtained; the trained network model is validated on the cross-validation set to determine the hyperparameters of the network model and obtain the final network model; and the final network model is used to visualize the test data and perform emotion recognition. The invention relies on a data-driven artificial intelligence method to learn the complex spatial and temporal structure of the collected EEG signals, reduce subjective factors in the prediction, and improve the prediction accuracy.
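As a rough illustration of the architecture described in the abstract, the sketch below pairs a variational encoder/decoder that reconstructs the EEG-derived images with an LSTM that classifies emotion from the sequence of latent codes, trained jointly with the Adam optimizer. It assumes PyTorch; all layer sizes, the latent dimension, the sequence length, and the two-class output are illustrative assumptions, not the patent's actual configuration.

```python
# Minimal sketch: variational encoder for image reconstruction + LSTM for emotion recognition.
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, in_ch=3, latent=64):
        super().__init__()
        self.enc = nn.Sequential(                      # 3x32x32 image -> flat features
            nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(), nn.Flatten())
        self.mu = nn.Linear(32 * 8 * 8, latent)
        self.logvar = nn.Linear(32 * 8 * 8, latent)
        self.dec = nn.Sequential(                      # latent code -> reconstructed 3x32x32
            nn.Linear(latent, 32 * 8 * 8), nn.ReLU(), nn.Unflatten(1, (32, 8, 8)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, in_ch, 4, stride=2, padding=1))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization trick
        return self.dec(z), mu, logvar, z

class EmotionLSTM(nn.Module):
    def __init__(self, latent=64, hidden=128, classes=2):
        super().__init__()
        self.lstm = nn.LSTM(latent, hidden, batch_first=True)
        self.head = nn.Linear(hidden, classes)

    def forward(self, z_seq):                          # z_seq: (batch, time, latent)
        out, _ = self.lstm(z_seq)
        return self.head(out[:, -1])                   # classify from the last time step

# Joint objective: reconstruction + KL term (VAE) + cross-entropy (emotion), optimized with Adam.
vae, clf = VAE(), EmotionLSTM()
opt = torch.optim.Adam(list(vae.parameters()) + list(clf.parameters()), lr=1e-3)
x = torch.randn(4, 10, 3, 32, 32)                      # batch of 10-step image sequences
y = torch.randint(0, 2, (4,))                          # placeholder emotion labels
recon, mu, logvar, z = vae(x.flatten(0, 1))
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
loss = (nn.functional.mse_loss(recon, x.flatten(0, 1)) + kl
        + nn.functional.cross_entropy(clf(z.view(4, 10, -1)), y))
opt.zero_grad(); loss.backward(); opt.step()
```

In practice, the hyperparameters of such a model would be chosen on the cross-validation set, as described in the abstract.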



Application Information

Owner 刘仕琪