
A socio-emotional classification method based on multimodal fusion

A sentiment classification method using multimodal technology, applied to text database clustering/classification, semantic analysis, and neural network models, achieving improved performance and accuracy

Inactive Publication Date: 2019-03-22
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

For video-based emotion recognition tasks, few existing inventions employ C3D networks.




Embodiment Construction

[0029] The specific implementation of the present invention is explained in detail below in conjunction with the accompanying drawings.

[0030] Figure 1 is the model framework diagram of the present invention, covering audio, visual, and text feature extraction and decision-level fusion classification.

[0031] (1) Text sentiment classification based on the CNN-RNN hybrid model: for text information, a CNN-RNN hybrid model realizes text sentiment analysis. CNN-RNN consists of two parts: a convolutional neural network extracts text features, and a recurrent neural network performs emotion prediction.
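The data flow just described (convolutional features feeding a recurrent emotion predictor) can be sketched in plain numpy. This is an illustration only, not the patent's actual architecture: all dimensions are hypothetical and the weights are random and untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (the patent text does not specify them)
seq_len, emb_dim = 12, 16       # tokens and embedding size
n_filters, kernel = 8, 3        # 1-D conv over the token axis
hidden, n_classes = 10, 3       # RNN state size, emotion classes

def conv1d_relu(x, W, b):
    """Valid 1-D convolution over the token axis with ReLU.
    x: (seq_len, emb_dim); W: (n_filters, kernel, emb_dim); b: (n_filters,)"""
    steps = x.shape[0] - W.shape[1] + 1
    out = np.empty((steps, W.shape[0]))
    for t in range(steps):
        window = x[t:t + W.shape[1]]                # (kernel, emb_dim)
        out[t] = np.tensordot(W, window, axes=([1, 2], [0, 1])) + b
    return np.maximum(out, 0.0)

def rnn_last_state(x, Wx, Wh, bh):
    """Vanilla tanh RNN; returns the final hidden state."""
    h = np.zeros(Wh.shape[0])
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ Wx + h @ Wh + bh)
    return h

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Random, untrained parameters: this shows the data flow, not a trained model.
emb = rng.normal(size=(seq_len, emb_dim))                     # word embeddings
Wc, bc = rng.normal(size=(n_filters, kernel, emb_dim)) * 0.1, np.zeros(n_filters)
Wx = rng.normal(size=(n_filters, hidden)) * 0.1
Wh, bh = rng.normal(size=(hidden, hidden)) * 0.1, np.zeros(hidden)
Wo, bo = rng.normal(size=(hidden, n_classes)) * 0.1, np.zeros(n_classes)

feats = conv1d_relu(emb, Wc, bc)            # CNN: local n-gram features
state = rnn_last_state(feats, Wx, Wh, bh)   # RNN: sequence-level emotion state
probs = softmax(state @ Wo + bo)            # probabilities over emotion classes
```

In a real system each stage would be trained end-to-end; the sketch only shows that the CNN output sequence becomes the RNN input sequence.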

[0032] (2) Visual emotion classification based on the 3DCLS model: 3DCLS (3D CNN-ConvLSTM) consists of two parts: a three-dimensional convolutional neural network extracts spatio-temporal features from the input video, and a convolutional LSTM (Long Short-Term Memory) further learns long-term spatio-temporal features and performs emotion prediction on the extracted features.
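The C3D → ConvLSTM cascade can be sketched as follows. This is a minimal single-channel numpy illustration with hypothetical shapes and random weights, not the patent's trained model; it shows how the 3-D convolution compresses a clip into spatio-temporal feature maps, which the ConvLSTM then consumes frame by frame using convolutional (rather than fully connected) gates.

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def conv3d(x, W):
    """Naive valid 3-D convolution. x: (T, H, W); W: (kt, kh, kw)."""
    T, H, Wd = x.shape
    kt, kh, kw = W.shape
    out = np.empty((T - kt + 1, H - kh + 1, Wd - kw + 1))
    for t in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[t, i, j] = np.sum(x[t:t+kt, i:i+kh, j:j+kw] * W)
    return out

def conv2d_same(x, W):
    """Naive 2-D convolution with zero padding (odd square kernel)."""
    p = W.shape[0] // 2
    xp = np.pad(x, p)
    out = np.empty_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i+W.shape[0], j:j+W.shape[1]] * W)
    return out

def convlstm_step(x, h, c, params):
    """One ConvLSTM update: gates use convolutions instead of matmuls."""
    Wxi, Whi, Wxf, Whf, Wxo, Who, Wxg, Whg = params
    i = sigmoid(conv2d_same(x, Wxi) + conv2d_same(h, Whi))  # input gate
    f = sigmoid(conv2d_same(x, Wxf) + conv2d_same(h, Whf))  # forget gate
    o = sigmoid(conv2d_same(x, Wxo) + conv2d_same(h, Who))  # output gate
    g = np.tanh(conv2d_same(x, Wxg) + conv2d_same(h, Whg))  # candidate cell
    c = f * c + i * g
    return o * np.tanh(c), c

clip = rng.normal(size=(8, 10, 10))                     # 8 frames of 10x10 "video"
feat = conv3d(clip, rng.normal(size=(3, 3, 3)) * 0.1)   # C3D features: (6, 8, 8)
params = [rng.normal(size=(3, 3)) * 0.1 for _ in range(8)]
h = c = np.zeros(feat.shape[1:])
for frame in feat:            # ConvLSTM consumes feature maps frame by frame
    h, c = convlstm_step(frame, h, c, params)
```

A final classifier (omitted here) would map the last hidden state `h` to emotion classes, as in the text branch.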



Abstract

The invention provides a social emotion classification method based on multimodal fusion, involving information in audio, visual, and text form. Most affective computing research extracts affective information by analyzing a single modality, ignoring the relationships between information sources. The present invention proposes a 3DCLS (3D CNN-ConvLSTM) model for video information, which builds spatio-temporal representations for the emotion recognition task through a cascade of a three-dimensional convolutional neural network (C3D) and a convolutional long short-term memory recurrent neural network (ConvLSTM). For text information, a CNN-RNN hybrid model is used to classify text emotion. Vision, audio, and text are then combined heterogeneously through decision-level fusion. The deep spatio-temporal features learned by the invention effectively model visual appearance and motion information, and fusing the text and audio information effectively improves the accuracy of the emotion analysis.

Description

Technical field

[0001] The present invention relates to a social emotion classification method based on multimodal fusion, which extracts emotional features from text, visual, and audio information sources and uses decision-level fusion to heterogeneously fuse vision, audio, and text into a final emotion classification result.

Background technique

[0002] Emotions play a vital role in our daily lives. They aid decision-making, learning, and communication in human-centered environments. Over the past two decades, artificial intelligence researchers have been trying to empower machines to recognize, interpret, and express emotions. These efforts fall under affective computing, and sentiment analysis has become a new trend in social media, effectively helping users understand the opinions expressed on different platforms.

[0003] In the past few years, text sentiment analysis has made great progress. People are gradually swi...
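The decision-level fusion described above combines the outputs of the per-modality classifiers rather than their raw features. A minimal sketch follows, with hypothetical class probabilities and illustrative fixed weights; the text shown here does not specify the exact fusion rule, so a weighted average (a common decision-level scheme) stands in for it.

```python
import numpy as np

# Hypothetical per-modality softmax outputs over 3 emotion classes
p_text  = np.array([0.7, 0.2, 0.1])
p_video = np.array([0.5, 0.3, 0.2])
p_audio = np.array([0.4, 0.4, 0.2])

# Decision-level fusion: combine classifier outputs, not raw features.
# The weights are illustrative; in practice they would be tuned on
# validation data.
w = np.array([0.4, 0.35, 0.25])
fused = w[0] * p_text + w[1] * p_video + w[2] * p_audio
label = int(np.argmax(fused))   # final emotion class
```

Because the weights sum to 1 and each input is a probability distribution, the fused vector is itself a valid distribution over the emotion classes.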


Application Information

IPC (IPC8): G06F16/35, G06F17/27, G06N3/04
CPC: G06F40/30, G06N3/045
Inventors: 徐光侠, 李伟凤, 刘俊, 吴涛, 王天羿, 吴佳健
Owner CHONGQING UNIV OF POSTS & TELECOMM