
Multi-modal emotion analysis method and system based on deep learning for acupuncture

An emotion analysis and deep learning technology, applied in the field of emotion recognition, which addresses problems such as the influence of feature weights not being considered, different feature weights being fixed, and interference with EEG and heart-rate signals during acquisition, achieving the effect of weakening noise interference.

Active Publication Date: 2020-11-03
INST OF ACUPUNCTURE & MOXIBUSTION CHINA ACADEMY OF CHINESE MEDICAL SCI


Problems solved by technology

[0008] 1. Existing emotion recognition methods combine non-physiological signals such as facial expressions and voice with physiological signals such as EEG and heart rate, but the collection of voice signals interferes with EEG and heart-rate measurement, which reduces the accuracy of emotion recognition.
[0009] 2. Existing emotion recognition methods use techniques such as PCA and LBP to process feature values. The choice of reduced dimensionality is random and empirical, and these methods also extract incomplete information from image edge features.
[0010] 3. Existing emotion recognition methods fuse feature values with a linear fusion method in which the weights of the different features are fixed, and the influence of individual personality factors on each feature weight is not considered.
[0011] 4. Existing clinical acupuncture studies on the emotional changes of patients or healthy subjects before and after acupuncture mostly rely on subjective methods such as scale-based evaluation and lack digital, quantitative indicators.



Examples


Embodiment 1

[0170] According to a specific embodiment of the present invention, the present invention provides a deep learning-based multimodal emotion analysis method for acupuncture. The invention is described below through a set of experiments. In this experiment, facial screenshots and intercepted EEG signal segments from 18 acupuncture treatments were selected, and the experiment comprises the following steps:

[0171] Step 1: Collect facial expression images of patients during acupuncture, process the collected facial expression images, and extract Gabor wavelet features of facial expressions through Gabor wavelet transform;
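The Gabor filtering in Step 1 can be sketched as follows. This is a minimal NumPy illustration of Gabor-wavelet feature extraction, not the patent's implementation: the kernel parameters, the number of orientations, and the mean/std pooling of filter responses are all illustrative assumptions.

```python
import numpy as np

def gabor_kernel(ksize=9, sigma=2.0, theta=0.0, lambd=4.0, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel (Gaussian envelope times cosine carrier)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)   # rotate coordinates by theta
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_t / lambd + psi)
    return envelope * carrier

def gabor_features(image, n_orientations=4):
    """Filter the image at several orientations; pool mean and std per response."""
    feats = []
    for k in range(n_orientations):
        kern = gabor_kernel(theta=k * np.pi / n_orientations)
        # circular convolution via FFT, zero-padding the kernel to image size
        resp = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kern, image.shape)))
        feats.extend([resp.mean(), resp.std()])
    return np.array(feats)

# hypothetical 64x64 grayscale face crop
face = np.random.default_rng(0).random((64, 64))
print(gabor_features(face).shape)  # (8,) — 4 orientations x 2 statistics
```

In practice a library kernel (e.g. an image-processing toolkit's Gabor filter) with a bank of scales and orientations would replace this hand-rolled version; the sketch only shows the structure of the transform.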

[0172] Process the patient's facial expression images at the times corresponding to the 18 recorded decision curves, reduce each picture to a uniform size, and obtain the historical facial expression image library, as shown in Figure 5.

[0173] Extract texture features from the pictures in the above-ment...

Embodiment 2

[0193] The present invention adopts a multi-modal feature fusion method, using variable-weight sparse linear fusion to weight the features extracted from the images of the different modalities and synthesize them into a single feature vector. The feature-fusion weighting formula is expressed as follows:

[0194] O(x) = γK(x) + (1 − γ)F(x)  (1)

[0195] where K(x) represents the EEG curve image features;

[0196] F(x) represents the facial expression features;

[0197] γ is an empirical coefficient.
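Equation (1) can be sketched directly in code. This is a minimal illustration of the variable-weight linear fusion step, with assumptions beyond the formula itself: the value of γ and the L2 normalization of each modality's features (added so neither modality dominates purely by scale) are illustrative, not taken from the patent.

```python
import numpy as np

def fuse_features(eeg_feat, face_feat, gamma=0.4):
    """O(x) = gamma*K(x) + (1-gamma)*F(x), per equation (1).
    Each modality is L2-normalized first (an assumption of this sketch)."""
    k = eeg_feat / (np.linalg.norm(eeg_feat) + 1e-12)
    f = face_feat / (np.linalg.norm(face_feat) + 1e-12)
    return gamma * k + (1.0 - gamma) * f

# hypothetical 128-dimensional feature vectors per modality
rng = np.random.default_rng(1)
fused = fuse_features(rng.random(128), rng.random(128))
print(fused.shape)  # (128,)
```

A "variable-weight" scheme would make γ data-dependent (e.g. per subject) rather than a single constant; the sketch keeps it as one parameter for clarity.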

[0198] Table 2 shows the results for each single modality and for the fusion of the two modalities; Figure 9 shows the accuracy comparison across the different modalities.

[0199] Table 2 Classification results of each mode and fusion

[0200]

[0201] As can be seen from Figure 9, single-modal facial expression performs better on emotion classification than the EEG signal curve, and the accuracy of emotion classification can be improved through...

Embodiment 3

[0203] In order to further study the complementary features of facial expressions and EEG signal curves, this embodiment analyzes the confusion matrix of each modality, revealing the advantages and disadvantages of each modality.

[0204] Figures 10a, 10b and 10c give the confusion matrices based on facial expressions and EEG signal curves (in each figure, each column represents the predicted emotion class and each row the actual emotion class). Figure 10a shows the effect of facial expressions on emotion classification, Figure 10b that of EEG signals, and Figure 10c that of the fusion of the two modalities. The experimental results show that facial expressions and EEG curves have different discriminative power for emotion recognition. Facial expressions are better for the classification of happy emotions, but both facial expressions and EEG signals a...



Abstract

The invention provides a multi-modal emotion analysis method and system based on deep learning for acupuncture, and belongs to the technical field of emotion recognition. According to the method, feature extraction is carried out on facial expression features using the Gabor wavelet transform, and feature extraction and dimension reduction are carried out on electroencephalogram signals using UPLBP. Sparse linear fusion then combines the facial expression features and the electroencephalogram signal features into a unified, normalized feature vector, which is transformed into tensor form. Training is carried out in a CNN-LSTM network, redundant information is removed, predicted emotion classification information is obtained, and the loss function and accuracy of the network are calculated by comparing the predicted emotion classification information with the actual emotion classification information. By fusing the expression features and the electroencephalogram signal features, removing redundant information, and training through the CNN-LSTM network to obtain the predicted emotion classification information, the invention improves emotion recognition accuracy.

Description

Technical Field

[0001] The present invention relates to the technical field of emotion recognition, and in particular to a deep learning-based multimodal emotion analysis method and system for acupuncture and moxibustion.

Background

[0002] Emotion recognition is an important branch of the field of image recognition: the process of identifying people's emotions from their external expressions or internal physiological signals. Although emotion is an internal subjective experience, it is always accompanied by some external manifestation, that is, observable behavioral characteristics expressed in a person's face, posture and tone of voice. In addition to these external manifestations, emotions also generate related physiological signals in the human body; heart rate and EEG signals are physiological signals caused by emotion. Emotion analysis and recognition involves multiple disciplines, including...


Application Information

IPC(8): G06K9/00; G06K9/46; G06K9/62; G06N3/04
CPC: G06N3/049; G06V40/168; G06V40/174; G06V10/449; G06V10/467; G06N3/044; G06N3/045; G06F2218/08; G06F18/25; G06F18/214
Inventors: 荣培晶, 李少源, 李亮
Owner INST OF ACUPUNCTURE & MOXIBUSTION CHINA ACADEMY OF CHINESE MEDICAL SCI