
A multi-modal emotion recognition method based on a brain-inspired model

A multimodal emotion recognition technology in the fields of emotion classification and pattern recognition, addressing the problem that building an effective multimodal fusion model has not yet been solved.

Active Publication Date: 2020-02-11
BEIJING UNIV OF TECH
Cites: 9 · Cited by: 0

AI Technical Summary

Problems solved by technology

However, how to build a multimodal fusion model has not yet been effectively solved



Examples


Embodiment Construction

[0057] In an exemplary embodiment of the present invention, a classification method for multimodal emotion recognition is provided. Figure 1 is a flowchart of a method for multimodal emotion recognition according to an embodiment of the present invention. As shown in Figure 1, the classification method of this embodiment for multimodal emotion recognition includes:

[0058] Step A: For multimodal emotional data, define a discrimination index DP to measure the degree of between-group difference of each feature of each modality, extract the data features with a high discrimination index, then apply principal component analysis for feature dimensionality reduction, and finally obtain the multimodal data feature vectors F_k = [f_k^1, f_k^2, …, f_k^{N_k}], where k = 1, 2, …, N; F_k is the feature vector of the k-th modality's data, f_k^1 is the first feature of the k-th modality's data, f_k^2 is the second feature, and so on up to f_k^{N_k}, the N_k-th feature of the k-th modality's data; N is the number...
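The patent text above does not give the exact formula for the discrimination index DP, so the sketch below assumes a Fisher-style ratio (between-group variance over within-group variance) as one plausible reading, followed by PCA via SVD. The function names, the threshold, and the synthetic data are all illustrative, not part of the patented method:

```python
import numpy as np

def discrimination_index(feature, labels):
    """Hypothetical DP: between-group variance of one feature divided by its
    within-group variance (Fisher-style score); the patent's exact formula
    is not given in the excerpt, so this is an assumption."""
    classes = np.unique(labels)
    overall_mean = feature.mean()
    between = sum((feature[labels == c].mean() - overall_mean) ** 2
                  for c in classes) / len(classes)
    within = sum(feature[labels == c].var() for c in classes) / len(classes)
    return between / (within + 1e-12)

def extract_features(X, labels, threshold, n_components):
    """Step A sketch: keep features whose DP exceeds a threshold,
    then reduce the selected features with PCA (SVD on centered data)."""
    dp = np.array([discrimination_index(X[:, j], labels)
                   for j in range(X.shape[1])])
    X_sel = X[:, dp > threshold]
    Xc = X_sel - X_sel.mean(axis=0)          # center before PCA
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T          # project onto top components

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))                 # 60 samples, 8 raw features
y = np.repeat([0, 1], 30)                    # two emotion groups
X[y == 1, :2] += 3.0                         # features 0,1 separate the groups
F = extract_features(X, y, threshold=1.0, n_components=2)
print(F.shape)                               # → (60, 2)
```

In a multimodal setting this extraction would run once per modality, yielding one feature vector F_k per modality as the paragraph describes.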



Abstract

The invention provides a multimodal emotion recognition method based on a brain-like model. The method includes: defining a discrimination index to measure the degree of between-group difference of modal features, and extracting the primary data features; using principal component analysis for feature dimensionality reduction to obtain the feature vector of each modality; designing a brain-like modular neural network with a sub-module structure to realize the fusion of multimodal features, where the connection structure includes connections between neurons inside each sub-module, connections between sub-modules within a module, and connections between modules; determining the category of a sample by a "winner takes all" strategy; designing a weight-update rule based on the Hebbian rule, comprising a primary update step and a secondary update step, and updating the connection weights accordingly; and using the obtained connection weights to classify test data and evaluate the classification effect. The invention improves the accuracy of multimodal emotion recognition.
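The abstract names three ingredients that can be sketched in code: per-modality sub-modules, fusion of their responses, winner-takes-all classification, and a Hebbian weight update. The sketch below is a minimal illustration of those generic mechanisms, not the patented network: the class name, the sum-based fusion, the single-step update, and the normalization are all assumptions, and the secondary update step is omitted.

```python
import numpy as np

class BrainLikeClassifier:
    """Minimal sketch: one sub-module (weight matrix) per modality; class
    responses of the sub-modules are summed to fuse modalities, the largest
    fused response wins (winner-takes-all), and weights grow by a simple
    Hebbian rule. Illustrative only; not the patent's connection structure."""

    def __init__(self, dims, n_classes, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # one (n_classes, dim) weight matrix per modality sub-module
        self.W = [rng.normal(scale=0.1, size=(n_classes, d)) for d in dims]
        self.lr = lr

    def responses(self, xs):
        # fuse modalities by summing each sub-module's class responses
        return sum(W @ x for W, x in zip(self.W, xs))

    def predict(self, xs):
        return int(np.argmax(self.responses(xs)))  # winner takes all

    def hebbian_update(self, xs, label):
        # Hebb: strengthen weights between co-active inputs and the
        # target class's output neuron, then renormalize to stay bounded
        for W, x in zip(self.W, xs):
            W[label] += self.lr * x
            W[label] /= np.linalg.norm(W[label]) + 1e-12

# toy usage: two modalities (3-dim and 2-dim features), two emotion classes
clf = BrainLikeClassifier(dims=[3, 2], n_classes=2)
x_a, x_b = np.ones(3), np.ones(2)
for _ in range(20):
    clf.hebbian_update([x_a, x_b], label=1)
print(clf.predict([x_a, x_b]))  # → 1
```

The design choice worth noting is that fusion happens at the model level (summed sub-module responses) rather than by concatenating raw features, which mirrors the abstract's model-based fusion motivation.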

Description

Technical field:

[0001] The invention relates to the fields of emotion classification and pattern recognition, and in particular to a multimodal emotion recognition method based on a brain-like model.

Background technique:

[0002] Multimodal emotion recognition uses the complementarity of modal data such as facial expression, voice, eye movement and physiological signals to improve the recognition performance of classifiers, and has become a research hotspot at home and abroad in recent years. Among these topics, multimodal data fusion has become a challenging key issue in multimodal emotion recognition. At present, the fusion methods for multimodal emotion recognition mainly include feature-based fusion, decision-based fusion and model-based fusion. Since the features of multimodal data differ in time scale and measurement, feature-based fusion is difficult to achieve. Decision-based fusion methods cannot reveal the correlation information between different modal fea...

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Patent (China)
IPC (8): G06K9/00; G06K9/62
CPC: G06V40/10; G06V40/15; G06F2218/08; G06F2218/12; G06F18/214; G06F18/2411
Inventors: 李文静 (Li Wenjing), 乔俊飞 (Qiao Junfei)
Owner: BEIJING UNIV OF TECH