
A Multimodal Sentiment Analysis Method Based on Multidimensional Attention Fusion Network

A technology that integrates a fusion network with sentiment analysis, applied in the field of multi-modal affective computing. It can solve problems such as model overfitting, high labeling costs, and approaches that are impractical in real production and living environments, with the effect of reducing labeling costs and avoiding model overfitting.

Active Publication Date: 2022-06-21
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

One existing approach performs only a simple integration of the results from the three modalities. It does not take the correlation information between the modalities into account, and the resulting information redundancy easily causes the model to overfit.
Another approach is based on modal label alignment: during data labeling, the three modalities are forcibly aligned in the time dimension on the basis of text or phonemes, which guarantees their temporal correspondence; recurrent neural networks, convolutional neural networks, attention mechanisms, and the Seq2Seq framework are then used for modal fusion. Labeling in this way is expensive and ill-suited to real production and living environments.
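
As a point of reference for the first approach criticized above, the following is a minimal sketch of result-level ("late") fusion, in which each modality is scored independently and the scores are simply combined. The per-modality linear heads, the feature dimensions, and the plain averaging are illustrative assumptions, not the method of the invention.

```python
import torch
import torch.nn as nn

class LateFusionBaseline(nn.Module):
    """Naive result integration: score each modality separately, then
    average the scores, ignoring cross-modal correlation information."""

    def __init__(self, text_dim=768, audio_dim=74, video_dim=35):
        super().__init__()
        # one independent regressor per modality (hypothetical dimensions)
        self.text_head = nn.Linear(text_dim, 1)
        self.audio_head = nn.Linear(audio_dim, 1)
        self.video_head = nn.Linear(video_dim, 1)

    def forward(self, text_feat, audio_feat, video_feat):
        s_text = self.text_head(text_feat)
        s_audio = self.audio_head(audio_feat)
        s_video = self.video_head(video_feat)
        # simple integration of the three results; the redundancy among
        # the modalities is what the text above blames for overfitting
        return (s_text + s_audio + s_video) / 3.0
```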

Method used



Embodiment Construction

[0076] In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention will be further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention, but not to limit the present invention.

[0077] The invention provides a multi-modal sentiment analysis method based on a multi-dimensional attention fusion network. The specific process is shown in Figure 1; Figure 2 is a schematic structural diagram of the multi-dimensional attention fusion network in an embodiment of the present invention; and Figure 3 is a schematic structural diagram of the cross-modal fusion module in an embodiment of the present invention. The implementation steps of the method of the present invention are as follows:

[0078] 1. Process the multimodal sentiment database and perform feature ...
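
This step is truncated in the source. Purely to illustrate what per-modality preprocessing features could look like as inputs to the later steps, here is a small sketch with placeholder tensors; the batch size, sequence length, and feature dimensions are assumptions, and the actual extractors used by the invention are not specified in the visible text.

```python
import torch

# Hypothetical preprocessed features for a batch of 2 utterances; the real
# feature extractors and dimensions are not given in the visible text.
batch, seq_len = 2, 50                          # assumed batch size and sequence length
text_feat  = torch.randn(batch, seq_len, 768)   # e.g. contextual word embeddings
audio_feat = torch.randn(batch, seq_len, 74)    # e.g. frame-level acoustic descriptors
video_feat = torch.randn(batch, seq_len, 35)    # e.g. frame-level facial features

feats = {"text": text_feat, "audio": audio_feat, "video": video_feat}
```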



Abstract

The invention discloses a multi-modal sentiment analysis method based on a multi-dimensional attention fusion network, which includes: extracting voice, video, and text preprocessing features; constructing a multi-dimensional attention fusion network for each modality and using the autocorrelation feature extraction module inside the network to extract first-level and second-level autocorrelation features; combining the autocorrelation information of the three modalities and using the cross-modal fusion module inside the network to obtain cross-modal fusion features for the three modalities; combining the second-level autocorrelation features with the cross-modal fusion features to obtain modal multi-dimensional features; and finally splicing the multi-dimensional features of the modalities, determining the sentiment score, and performing sentiment analysis.
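
To make the data flow in this abstract concrete, the sketch below follows it end to end. It assumes self-attention for the autocorrelation feature extraction module, cross-attention plus summation for the cross-modal fusion module, temporal mean pooling before splicing, and the shown dimensions; these are illustrative choices, not the patent's actual module internals.

```python
import torch
import torch.nn as nn

class AutoCorrelationModule(nn.Module):
    """Assumed form of the autocorrelation feature extraction module:
    two stacked self-attention layers yield the first-level and
    second-level autocorrelation features."""

    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn1 = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn2 = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        first, _ = self.attn1(x, x, x)                # first-level autocorrelation feature
        second, _ = self.attn2(first, first, first)   # second-level autocorrelation feature
        return first, second


class CrossModalFusionModule(nn.Module):
    """Assumed form of the cross-modal fusion module: the target modality
    attends to the other two modalities and the results are summed."""

    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn_a = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn_b = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, target, other_a, other_b):
        fused_a, _ = self.attn_a(target, other_a, other_a)
        fused_b, _ = self.attn_b(target, other_b, other_b)
        return fused_a + fused_b                      # cross-modal fusion feature


class MultiDimAttentionFusionNet(nn.Module):
    """Illustrative end-to-end pipeline following the abstract's data flow."""

    MODALITIES = ("text", "audio", "video")

    def __init__(self, in_dims=None, dim=128):
        super().__init__()
        if in_dims is None:
            in_dims = {"text": 768, "audio": 74, "video": 35}  # assumed input dims
        self.proj = nn.ModuleDict({m: nn.Linear(d, dim) for m, d in in_dims.items()})
        self.auto = nn.ModuleDict({m: AutoCorrelationModule(dim) for m in self.MODALITIES})
        self.cross = nn.ModuleDict({m: CrossModalFusionModule(dim) for m in self.MODALITIES})
        self.head = nn.Linear(3 * dim, 1)             # spliced features -> sentiment score

    def forward(self, feats):
        firsts, seconds = {}, {}
        for m in self.MODALITIES:
            x = self.proj[m](feats[m])                # project to a shared dimension
            firsts[m], seconds[m] = self.auto[m](x)
        multi_dim = []
        for m in self.MODALITIES:
            others = [firsts[o] for o in self.MODALITIES if o != m]
            cross = self.cross[m](firsts[m], *others)
            # modal multi-dimensional feature: second-level autocorrelation
            # feature combined with the cross-modal fusion feature
            multi_dim.append((seconds[m] + cross).mean(dim=1))
        return self.head(torch.cat(multi_dim, dim=-1))   # sentiment score


# Example usage with placeholder inputs of the assumed shapes
net = MultiDimAttentionFusionNet()
score = net({"text": torch.randn(2, 50, 768),
             "audio": torch.randn(2, 50, 74),
             "video": torch.randn(2, 50, 35)})
print(score.shape)  # torch.Size([2, 1])
```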

Description

Technical Field
[0001] The invention belongs to the field of multi-modal affective computing, and more particularly relates to a multi-modal sentiment analysis method based on a multi-dimensional attention fusion network.
Background Technology
[0002] Sentiment analysis has many applications in daily life. With the development of big data and multimedia technology, using multimodal sentiment analysis to jointly analyze the voice, video, and text modalities of data is better suited to mining the meaning behind the data. For example, in a customer return-visit survey, a comprehensive analysis of the user's voice, facial expressions, and speech content reveals the user's satisfaction with the service or product.
[0003] At present, the difficulty of multimodal sentiment analysis lies in how to effectively integrate multimodal information, because the acquisition methods of voice, video, and text features are completely different. When describing the same content...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06V40/16; G06V10/80; G06V10/82; G06K9/62; G06F40/30; G10L25/63; G06N3/04
CPC: G06F40/30; G10L25/63; G06V40/174; G06N3/045; G06F18/253
Inventors: 冯镔, 付彦喆, 王耀平, 江子文, 杭浩然, 李瑞达, 刘文予
Owner: HUAZHONG UNIV OF SCI & TECH