
Emotion recognition method and device based on multi-modal emotion model

An emotion recognition and multi-modal technology, applied to character and pattern recognition, neural learning methods, biological neural network models, etc.; it can solve problems such as susceptibility to interference, incomplete single-modal information, and untimely feedback, and achieves the effect of comprehensive information.

Pending Publication Date: 2019-07-19
WUYI UNIV
Cites 13, Cited by 16

AI Technical Summary

Problems solved by technology

Human emotions are expressed through multiple modalities, such as expressions, gestures, voice, and words; emotional judgments can be made from these modalities, but a single modality has many shortcomings, such as incomplete information, untimely feedback, and susceptibility to interference.


Examples


Embodiment Construction

[0022] Referring to figure 1, the first aspect of the present invention provides an emotion recognition method based on a multi-modal emotion model, characterized in that it comprises the following steps:

[0023] S1. Establish a basic dimension prediction model;

[0024] S2. Label the basic dimension prediction model and combine it with a neural network model to form a video dimension prediction model, an audio dimension prediction model and a text dimension prediction model, respectively;

[0025] S3. Extract the facial expression and posture video features, audio features and utterance text features of the target object;

[0026] S4. Obtain the first emotion result by analyzing the facial expression and posture video features through the video dimension prediction model;

[0027] S5. Obtain a second emotion result by analyzing the audio features through the audio dimension prediction model;

[0028] S6. Obtain a third emotion result by analyzing the utterance text features through the text dimension prediction model ...
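
Below is a minimal sketch, assuming a PyTorch implementation, of how steps S1-S6 could be wired together. The embodiment above does not disclose concrete architectures, so the DimensionPredictor class, the feature sizes, the two-dimensional (valence, arousal) output and the random placeholder features are illustrative assumptions rather than the claimed implementation.

import torch
import torch.nn as nn


class DimensionPredictor(nn.Module):
    """Basic dimension prediction model (S1): maps modality features to
    continuous emotion dimensions, assumed here to be (valence, arousal)."""

    def __init__(self, feat_dim: int, hidden: int = 128, n_dims: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_dims),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


# S2: one predictor per modality, each built from the same basic model structure.
video_model = DimensionPredictor(feat_dim=512)   # expression/posture video features
audio_model = DimensionPredictor(feat_dim=128)   # audio features
text_model = DimensionPredictor(feat_dim=300)    # utterance text features

# S3: placeholder features for one target object; real features would come from
# upstream extractors that the excerpt above does not specify.
video_feat = torch.randn(1, 512)
audio_feat = torch.randn(1, 128)
text_feat = torch.randn(1, 300)

# S4-S6: the first, second and third emotion results as dimensional predictions.
first_result = video_model(video_feat)
second_result = audio_model(audio_feat)
third_result = text_model(text_feat)
print(first_result, second_result, third_result)

The three branches share one basic structure here only to mirror step S1; in practice each modality would likely use a different backbone (for example a CNN over video frames, or a recurrent network over audio and text sequences) before the dimensional head.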



Abstract

The invention discloses an emotion recognition method and device based on a multi-modal emotion model. The method comprises the steps of establishing a basic dimension prediction model; obtaining a video dimension prediction model, an audio dimension prediction model and a text dimension prediction model from the basic dimension prediction model; analyzing the expression and posture video features, audio features and utterance text features respectively to obtain a first emotion result, a second emotion result and a third emotion result; and fusing the three results and obtaining an emotion category of the target object by combining a mapping relation based on the basic dimension prediction model. The method and the device perform emotion recognition from multiple modalities and multiple angles, and therefore offer comprehensive information, strong anti-interference performance and high accuracy.
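
As a concrete illustration of the fusion and mapping step described above, here is a minimal sketch assuming equal-weight averaging of the three dimensional results and a nearest-prototype mapping from (valence, arousal) space to a discrete category. Both the fusion rule and the prototype coordinates are illustrative assumptions; the patent's actual mapping relation is not reproduced here.

import numpy as np

# First, second and third emotion results as (valence, arousal) predictions.
first_result = np.array([0.6, 0.4])    # from the video dimension prediction model
second_result = np.array([0.5, 0.3])   # from the audio dimension prediction model
third_result = np.array([0.7, 0.2])    # from the text dimension prediction model

# Fuse the three results; equal weighting is just one plausible choice.
fused = np.mean([first_result, second_result, third_result], axis=0)

# Assumed mapping relation: nearest prototype in the basic dimension space.
prototypes = {
    "happy":   np.array([0.8, 0.5]),
    "angry":   np.array([-0.6, 0.7]),
    "sad":     np.array([-0.7, -0.4]),
    "neutral": np.array([0.0, 0.0]),
}
category = min(prototypes, key=lambda k: np.linalg.norm(fused - prototypes[k]))
print(f"fused dimensions = {fused}, emotion category = {category}")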

Description

Technical field

[0001] The invention relates to the field of computer processing, and in particular to an emotion recognition method and device based on a multi-modal emotion model.

Background technique

[0002] Emotion is a psychological phenomenon that people show in their daily life. For intelligent machines, being able to quickly and accurately judge a person's emotional state makes it possible to further understand the user's emotion and thus realize natural, friendly and harmonious interaction with the user. Human emotions are expressed through multiple modalities, such as expressions, gestures, voice, and words; emotional judgments can be made from these modalities, but a single modality has many shortcomings, such as incomplete information, untimely feedback, and susceptibility to interference.

Contents of the invention

[0003] In order to solve the above problems, the purpose of the embodiments of the present invention is to provide an emotion recognition method and device based on a multi-modal emotion model ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/62; G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06N 3/045; G06N 3/044; G06F 18/241; G06F 18/256
Inventors: 翟懿奎, 邓文博, 徐颖, 柯琪锐, 曹鹤, 甘俊英, 应自炉, 曾军英, 秦传波, 麦超云
Owner: WUYI UNIV