Emotion recognition method, device and apparatus, and storage medium

An emotion recognition and emotion feature technology, applied in character and pattern recognition, special data processing applications, speech analysis, etc., which can solve the problems of poor controllability of results, high cost, and poor versatility, and achieves the effects of simplifying the sample training process and improving the accuracy of emotion recognition results.

Active Publication Date: 2018-12-11
BEIJING BAIDU NETCOM SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] Since the same word has different meanings and expresses different emotional states in different scenarios, the above method has poor versatility; in addition, it relies on manual operations to collect a large amount of data, which is costly, and its results are poorly controllable.



Examples


Embodiment 1

[0025] Figure 2A is a flowchart of an emotion recognition method provided in Embodiment 1 of the present invention, and Figure 2B is a schematic diagram of a learning model based on multimodal feature fusion applicable to this embodiment. This embodiment is applicable to the situation of accurately identifying a user's emotion in the process of multimodal interaction. The method can be executed by the emotion recognition device provided by the embodiments of the present invention, which can be implemented in software and/or hardware and can be integrated into a computing device. Referring to Figures 2A and 2B, the method specifically includes:

[0026] S210. Determine the fused session feature of the multimodal session information.

[0027] Here, modality is a term used in interaction, and multimodality refers to the phenomenon of interacting by comprehensively using text, images, video, voice, gestures and other means and symbol carriers. Corre...
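The two steps above (determining the fused session feature in S210 and feeding it into the pre-built multimodal emotion recognition model) can be pictured with the following minimal sketch. It assumes simple feature concatenation as the fusion operation and a small feed-forward network as the recognition model; the class name, feature dimensions and number of emotion categories are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch (assumed, not the patent's definitive design): fuse the
# per-modality session features, then feed the fused feature into one
# unified multimodal emotion recognition model.
import torch
import torch.nn as nn


class MultimodalEmotionRecognizer(nn.Module):
    def __init__(self, speech_dim=128, text_dim=300, image_dim=512,
                 hidden_dim=256, num_emotions=6):
        super().__init__()
        fused_dim = speech_dim + text_dim + image_dim
        # One unified model trained on the fused session feature,
        # rather than one separately trained recognizer per modality.
        self.classifier = nn.Sequential(
            nn.Linear(fused_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_emotions),
        )

    def fuse(self, speech_feat, text_feat, image_feat):
        # S210: determine the fused session feature of the multimodal
        # session information (here: plain feature concatenation).
        return torch.cat([speech_feat, text_feat, image_feat], dim=-1)

    def forward(self, speech_feat, text_feat, image_feat):
        fused = self.fuse(speech_feat, text_feat, image_feat)
        # Obtain the emotional features / predicted emotion scores.
        return self.classifier(fused)


# Usage with a dummy batch of 4 sessions.
model = MultimodalEmotionRecognizer()
logits = model(torch.randn(4, 128), torch.randn(4, 300), torch.randn(4, 512))
predicted_emotion = logits.argmax(dim=-1)  # one emotion class per session
```

Because a single unified model consumes the fused feature, only one model needs to be trained instead of one recognizer per modality.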

Embodiment 2

[0038] Figure 3 is a flowchart of an emotion recognition method provided by Embodiment 2 of the present invention. On the basis of Embodiment 1 above, this embodiment further optimizes the determination of the fused session features of the multimodal session information. Referring to Figure 3, the method specifically includes:

[0039] S310. Respectively determine vector representations of at least two kinds of modal session information among voice session information, text session information and image session information.

[0040] Exemplarily, the multimodal session information may include voice session information, text session information, and image session information. The vector representation of a piece of session information is its representation in a vector space, which can be obtained through modeling.

[0041] Specifically, by extracting the characteristic parameters that can represent emotional changes in the voice session inform...
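As an illustration of S310, the sketch below maps each kind of session information to a fixed-length vector with its own encoder before fusion. The specific encoder architectures (a GRU over acoustic frames, averaged word embeddings for text, pooled convolutional features for images) are assumptions chosen for brevity; the patent text does not prescribe them.

```python
# Illustrative sketch of S310: per-modality encoders producing vector
# representations of the voice, text and image session information.
import torch
import torch.nn as nn


class SpeechEncoder(nn.Module):
    """Encodes a sequence of acoustic feature frames (e.g. parameters that
    reflect emotional changes) into one vector."""
    def __init__(self, frame_dim=40, out_dim=128):
        super().__init__()
        self.rnn = nn.GRU(frame_dim, out_dim, batch_first=True)

    def forward(self, frames):              # frames: (batch, time, frame_dim)
        _, h = self.rnn(frames)
        return h[-1]                        # (batch, out_dim)


class TextEncoder(nn.Module):
    """Averages word embeddings of the text session information."""
    def __init__(self, vocab_size=10000, out_dim=300):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, out_dim)

    def forward(self, token_ids):           # token_ids: (batch, seq_len)
        return self.embed(token_ids).mean(dim=1)


class ImageEncoder(nn.Module):
    """Pools convolutional features of the image session information."""
    def __init__(self, out_dim=512):
        super().__init__()
        self.conv = nn.Conv2d(3, out_dim, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, images):              # images: (batch, 3, H, W)
        return self.pool(self.conv(images)).flatten(1)


# Vector representations for (at least two of) the three modalities,
# ready to be fused as in Embodiment 1.
speech_vec = SpeechEncoder()(torch.randn(4, 50, 40))
text_vec = TextEncoder()(torch.randint(0, 10000, (4, 20)))
image_vec = ImageEncoder()(torch.randn(4, 3, 64, 64))
```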

Embodiment 3

[0054] Figure 4 is a structural block diagram of an emotion recognition device provided in Embodiment 3 of the present invention. The device can execute the emotion recognition method provided in any embodiment of the present invention, and has the corresponding functional modules and beneficial effects for executing the method. As shown in Figure 4, the device may include:

[0055] a fusion feature determination module 410, configured to determine the fused session feature of the multimodal session information; and

[0056] an emotion feature determination module 420, configured to input the fused session features of the multimodal session information into the pre-built multimodal emotion recognition model to obtain the emotional features of the multimodal session information.

[0057] The technical solution provided by the embodiments of the present invention obtains the fused session features by fusing the session features of each modality i...
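A hedged sketch of how the two modules of this device could be composed is given below. The class names mirror the reference numerals 410 and 420 for readability; the concrete fusion rule and recognition model are placeholder assumptions, not the patent's definitive implementation.

```python
# Sketch (assumptions marked): device built from a fusion feature
# determination module (410) and an emotion feature determination module (420).
import torch
import torch.nn as nn


class FusionFeatureModule(nn.Module):
    """Module 410: determines the fused session feature (placeholder: concatenation)."""
    def forward(self, modality_features):
        # modality_features: list of (batch, dim_i) tensors, one per modality.
        return torch.cat(modality_features, dim=-1)


class EmotionFeatureModule(nn.Module):
    """Module 420: maps the fused feature to emotional features via the
    pre-built multimodal emotion recognition model (placeholder: linear layer)."""
    def __init__(self, fused_dim, num_emotions=6):
        super().__init__()
        self.model = nn.Linear(fused_dim, num_emotions)

    def forward(self, fused_feature):
        return self.model(fused_feature)


class EmotionRecognitionDevice(nn.Module):
    def __init__(self, fused_dim=940, num_emotions=6):
        super().__init__()
        self.fusion_feature_determination = FusionFeatureModule()   # module 410
        self.emotion_feature_determination = EmotionFeatureModule(  # module 420
            fused_dim, num_emotions)

    def forward(self, modality_features):
        fused = self.fusion_feature_determination(modality_features)
        return self.emotion_feature_determination(fused)


# Usage: three modality feature tensors for a batch of 4 sessions.
device = EmotionRecognitionDevice(fused_dim=128 + 300 + 512)
features = [torch.randn(4, 128), torch.randn(4, 300), torch.randn(4, 512)]
emotion_scores = device(features)
```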


Abstract

The embodiments of the invention disclose an emotion recognition method, device and apparatus, and a storage medium. The method comprises the following steps: determining the fused session features of multimodal session information; and inputting the fused session features of the multimodal session information into a pre-built multimodal emotion recognition model to obtain the emotion features of the multimodal session information. In the technical scheme provided by the embodiments of the invention, the session features of each modality in the multimodal session information are fused to obtain the fused session features, and the fused session features are input into a unified multimodal emotion recognition model for model training, so that the final emotion result can be predicted directly, without training a separate recognition model for each modality and then fusing the results of the different models. The sample training process is thereby simplified and the accuracy of emotion recognition results is improved.

Description

Technical Field

[0001] The embodiments of the present invention relate to the technical field of artificial intelligence, and in particular to an emotion recognition method, device, apparatus and storage medium.

Background Technology

[0002] With the development of artificial intelligence, intelligent interaction plays an increasingly important role in more and more fields. An important direction in intelligent interaction is how to identify the user's current emotional state in the process of multimodal interaction, so as to provide emotional feedback for the entire intelligent interactive system and make timely adjustments for users in different emotional states, thereby improving the service quality of the whole interaction process.

[0003] At present, the main emotion recognition method is as shown in Figure 1. The whole process works by independently modeling each modality such as speech, text, and facial expression images, and finally merging the ...
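For contrast with the scheme of the embodiments, a rough sketch of the prior-art process described in [0003] is shown below: each modality is modeled and trained independently, and the separate results are merged afterwards. The merging rule (averaging the per-modality scores) is an assumption made for illustration, since the background only states that the independent results are merged.

```python
# Illustrative sketch of the prior-art late-fusion scheme of [0003]
# (assumed merging rule: averaging the per-modality scores).
import torch
import torch.nn as nn


class LateFusionBaseline(nn.Module):
    def __init__(self, speech_dim=128, text_dim=300, image_dim=512, num_emotions=6):
        super().__init__()
        # One separately modeled (and separately trained) recognizer per modality.
        self.speech_head = nn.Linear(speech_dim, num_emotions)
        self.text_head = nn.Linear(text_dim, num_emotions)
        self.image_head = nn.Linear(image_dim, num_emotions)

    def forward(self, speech_feat, text_feat, image_feat):
        per_modality_scores = torch.stack([
            self.speech_head(speech_feat),
            self.text_head(text_feat),
            self.image_head(image_feat),
        ])
        # Merge the independent results after the fact.
        return per_modality_scores.mean(dim=0)
```

Maintaining and fusing several such models is what the embodiments above avoid by fusing features first and training a single unified model.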


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06K 9/62; G06F 17/27; G10L 25/63
CPC: G10L 25/63; G06F 40/30; G06F 18/253
Inventors: 林英展, 陈炳金, 梁一川, 凌光, 周超
Owner: BEIJING BAIDU NETCOM SCI & TECH CO LTD