Deep learning-based multimodal emotion analysis method and system for acupuncture and moxibustion

An emotion analysis and deep learning technology, applied in the field of emotion recognition. It addresses problems such as failure to consider the influence of feature weights, incomplete extraction of image edge features, and the lack of digital, quantitative indicators, thereby avoiding interference from human factors, aiding classification, and eliminating redundant information.

Active Publication Date: 2021-10-08
INST OF ACUPUNCTURE & MOXIBUSTION CHINA ACADEMY OF CHINESE MEDICAL SCI

AI Technical Summary

Problems solved by technology

[0008] 1. Although existing emotion recognition methods combine non-physiological signals such as facial expressions and voice with physiological signals such as EEG and heart rate, the collection of voice signals interferes with the EEG and heart-rate signals, which affects the accuracy of emotion recognition.
[0009] 2. Existing emotion recognition methods use techniques such as PCA and LBP to process feature values. The choice of reduced dimensionality is random and empirical, and it also leads to incomplete extraction of image edge features.
[0010] 3. Existing emotion recognition methods fuse feature values by linear fusion, so the weights of the different features are fixed, and the influence of personality factors on each feature weight is not considered.
[0011] 4. Existing clinical acupuncture studies of the emotional changes of patients or healthy subjects before and after acupuncture mostly use subjective evaluation methods such as rating scales and lack digital, quantitative indicators.

Method used



Examples


Embodiment 1

[0170] According to a specific embodiment of the present invention, the present invention provides a deep learning-based multimodal emotion analysis method for acupuncture, described below through a set of experiments. In this experiment, face screenshots and intercepted EEG signal segments from 18 acupuncture treatments were selected, and the experiment includes the following steps:

[0171] Step 1: Collect facial expression images of the patient during acupuncture, process the collected images, and extract Gabor wavelet features of the facial expressions through the Gabor wavelet transform;
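As a rough illustration of this step, the sketch below builds a small Gabor filter bank in NumPy and summarizes each filter response by its mean and variance. The kernel size, scales, and orientations are illustrative choices, not parameters taken from the patent, and the stand-in image replaces a real face screenshot.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, lambd, gamma=0.5, psi=0.0):
    """Real part of a 2-D Gabor kernel at orientation theta and wavelength lambd."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xr / lambd + psi))

def gabor_features(gray, scales=(4, 8), n_orient=4):
    """Mean and variance of each filter response, concatenated into one vector."""
    feats = []
    for lambd in scales:
        for k in range(n_orient):
            kern = gabor_kernel(21, 4.0, k * np.pi / n_orient, lambd)
            # full 2-D convolution via FFT (avoids a SciPy dependency)
            s = (gray.shape[0] + kern.shape[0] - 1,
                 gray.shape[1] + kern.shape[1] - 1)
            resp = np.real(np.fft.ifft2(np.fft.fft2(gray, s) * np.fft.fft2(kern, s)))
            feats.extend([resp.mean(), resp.var()])
    return np.array(feats)

face = (np.arange(64 * 64).reshape(64, 64) % 256).astype(float)  # stand-in image
vec = gabor_features(face)
print(vec.shape)  # 2 scales x 4 orientations x 2 statistics -> (16,)
```

In practice the per-pixel filter responses (rather than summary statistics) could also be kept, at the cost of a much larger feature vector.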

[0172] Process the patient's facial expression images at the times corresponding to the 18-treatment decision curve library, reduce each picture to a uniform size, and obtain the historical facial expression image library, as shown in the attached Figure 5.

[0173] Extract texture features from the pictures in the above-mentioned...

Embodiment 2

[0193] The present invention adopts a multi-modal feature fusion method and uses variable-weight sparse linear fusion to weight the features extracted from the different modal images, synthesizing a single feature vector. The feature fusion weighting formula is expressed as follows:

[0194] O(x)=γK(x)+(1-γ)F(x) (1)

[0195] where K(x) represents the features of the EEG curve image;

[0196] F(x) represents facial expression features;

[0197] γ is the empirical coefficient.
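Equation (1) can be sketched directly. The example feature vectors and the default γ of 0.6 below are illustrative assumptions, not values given in the patent.

```python
import numpy as np

def fuse(K, F, gamma=0.6):
    """Variable-weight linear fusion O(x) = γ·K(x) + (1 − γ)·F(x)  (Eq. 1).
    K: EEG-curve image features; F: facial-expression features;
    gamma: empirical weight coefficient (0.6 is only an illustrative default)."""
    return gamma * np.asarray(K, dtype=float) + (1 - gamma) * np.asarray(F, dtype=float)

eeg_feat = np.array([1.0, 0.0, 2.0])   # stand-in EEG-curve features
face_feat = np.array([0.0, 1.0, 2.0])  # stand-in facial-expression features
print(fuse(eeg_feat, face_feat, gamma=0.5))  # [0.5 0.5 2. ]
```

A "variable weight" scheme would make γ depend on the sample (e.g. on personality factors, as the problems section suggests) rather than staying fixed; here it is simply a function argument.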

[0198] Table 2 shows the results for each single modality and for the fusion of the two modalities; the attached Figure 9 shows a comparison of accuracy using the different modalities.

[0199] Table 2 Classification results of each mode and fusion


[0201] From the attached Figure 9 it can be seen that single-modal facial expressions classify emotion better than the EEG signal curve alone, and that the accuracy of emotion classification can be improved through...

Embodiment 3

[0203] In order to further study the complementary features of facial expressions and EEG signal curves, this embodiment analyzes the confusion matrix of each modality, revealing the advantages and disadvantages of each modality.

[0204] The attached Figure 10a, Figure 10b, and Figure 10c (in each figure, each column represents the predicted emotion class and each row the actual emotion class) give the confusion matrices based on facial expressions and EEG signal curves: Figure 10a shows the effect of facial expressions on emotion classification, Figure 10b the effect of EEG signals, and Figure 10c the effect of mixing the two modalities. The experimental results show that facial expressions and EEG curves have different discriminative power for emotion recognition: facial expressions are better at classifying happy emotions, but both facial expressions and EEG signals are less e...
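The row/column convention described for Figures 10a-10c can be reproduced with a small helper. The emotion label coding and the sample predictions below are hypothetical, used only to show how per-class strengths and weaknesses fall out of the matrix.

```python
import numpy as np

def confusion_matrix(actual, predicted, n_classes):
    """Rows = actual emotion class, columns = predicted class
    (the same row/column convention as Figures 10a-10c)."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for a, p in zip(actual, predicted):
        cm[a, p] += 1
    return cm

# hypothetical label coding: 0 = happy, 1 = neutral, 2 = sad
actual    = [0, 0, 1, 1, 2, 2]
predicted = [0, 0, 1, 2, 2, 1]
cm = confusion_matrix(actual, predicted, 3)
accuracy = np.trace(cm) / cm.sum()          # diagonal = correct predictions
per_class = cm.diagonal() / cm.sum(axis=1)  # recall of each emotion class
print(cm)
print(accuracy)  # 4 of 6 correct
```

Comparing the `per_class` vectors of two modalities makes their complementarity explicit: a class that one modality recognizes poorly may be recovered by the other, which is the motivation for fusion.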



Abstract

The invention provides a deep learning-based multimodal emotion analysis method and system for acupuncture and moxibustion, belonging to the technical field of emotion recognition. The invention uses the Gabor wavelet transform to extract facial expression features, uses UPLBP to extract features from EEG signals and reduce their dimensionality, and then sparsely and linearly fuses the facial expression features and EEG signal features into unified, normalized eigenvectors. The eigenvectors are transformed into tensor form and input into a CNN-LSTM network for training, which removes redundant information and yields predicted emotion classification information; the network loss function and accuracy are calculated by comparing the predicted emotion classification with the actual emotion classification. By fusing expression features with EEG signal features, removing redundant information, and obtaining predicted emotion classifications through CNN-LSTM training, the invention improves the accuracy of emotion recognition.
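The final steps of the pipeline in the abstract — reshaping the fused eigenvector into tensor form and scoring predictions against actual labels — can be sketched as below, assuming a softmax cross-entropy loss (the patent does not name the exact loss function). The feature values, tensor shape, and network outputs are stand-ins, and the CNN-LSTM itself is not implemented here.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

def loss_and_accuracy(logits, labels):
    """Cross-entropy loss and accuracy of predicted vs. actual emotion classes."""
    p = softmax(logits)
    n = len(labels)
    loss = -np.log(p[np.arange(n), labels]).mean()
    acc = float((p.argmax(axis=1) == labels).mean())
    return loss, acc

fused = np.arange(16.0)               # fused Gabor + UPLBP features (illustrative)
tensor = fused.reshape(1, 4, 4)       # eigenvector transformed into tensor form
logits = np.array([[2.0, 0.5, 0.1]])  # stand-in CNN-LSTM output, 3 emotion classes
loss, acc = loss_and_accuracy(logits, np.array([0]))
print(tensor.shape, acc)
```

During training, `loss` would be minimized by backpropagation through the CNN-LSTM, while `acc` serves as the evaluation metric reported in Table 2.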

Description

technical field

[0001] The present invention relates to the technical field of emotion recognition, in particular to a deep learning-based multimodal emotion analysis method and system for acupuncture and moxibustion.

Background technique

[0002] Emotion recognition is an important branch of the field of image recognition: the process of identifying people's emotions from their external expressions or internal physiological signals. Although emotion is an internal subjective experience, it is always accompanied by external manifestations, that is, observable behavioral characteristics expressed in the face, posture, and tone of voice. In addition to these external manifestations, emotions also generate related physiological signals in the human body; heart rate and EEG signals are physiological signals caused by emotions. Emotion analysis and recognition involves multiple disciplines, including...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC (8): G06K9/00; G06K9/46; G06K9/62; G06N3/04
CPC: G06N3/049; G06V40/168; G06V40/174; G06V10/449; G06V10/467; G06N3/044; G06N3/045; G06F2218/08; G06F18/25; G06F18/214
Inventor: 荣培晶, 李少源, 李亮
Owner INST OF ACUPUNCTURE & MOXIBUSTION CHINA ACADEMY OF CHINESE MEDICAL SCI