
Emotion recognition method based on a bidirectional gated recurrent unit network and novel network initialization

A recurrent-unit and emotion-recognition technology, applied in the field of emotion recognition. It addresses problems such as gradient explosion, gradient dissipation, and default network-model parameters failing to learn the most useful information during training; its effects include overcoming long-term dependence, improving robustness, and improving recognition accuracy.

Active Publication Date: 2019-08-20
ZHEJIANG UNIV OF TECH
6 Cites, 11 Cited by

AI Technical Summary

Problems solved by technology

[0005] Existing emotion recognition methods based on emotional time-context information suffer from the long-term dependence problem during training, are prone to gradient dissipation or gradient explosion during backpropagation, and their default network-model parameters cannot learn the most useful information. To overcome these problems, the present invention proposes an emotion recognition method based on a bidirectional gated recurrent unit (Bi-GRU) network and a novel network initialization. The method overcomes the long-term dependence problem, optimizes the initialization parameters of the deep ReLU network model, improves the robustness of the bidirectional gated recurrent unit network during training, and improves the accuracy of emotion recognition based on emotional time-context information.
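The patent does not spell out its initialization scheme in this summary. As a reference point for what "optimizing the initialization parameters of a deep ReLU network" typically means, the sketch below shows He (Kaiming) initialization, a standard variance-preserving scheme for ReLU layers; the patent's own method may differ, and the layer sizes are illustrative.

```python
import numpy as np

def he_init(fan_in: int, fan_out: int, rng=None) -> np.ndarray:
    """He initialization for a ReLU layer: draw weights from
    N(0, 2/fan_in) so the activation variance is roughly preserved
    from layer to layer, avoiding vanishing/exploding signals."""
    rng = rng or np.random.default_rng(0)
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

# Illustrative layer: 300 inputs (e.g. a word embedding), 128 hidden units.
W = he_init(300, 128)
print(W.shape)  # (300, 128)
```

The empirical standard deviation of `W` should be close to `sqrt(2/300) ≈ 0.082`, which is the property that keeps deep ReLU stacks trainable.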

Method used




Embodiment Construction

[0033]The present invention will be further described below in conjunction with drawings and embodiments.

[0034] Referring to Figures 1 and 2, an emotion recognition method based on a bidirectional gated recurrent unit network and novel network initialization comprises the following steps:

[0035] Step 1, extracting high-dimensional features of the three modalities of text, vision and audio;

[0036] Extract text features as l = {l_1, l_2, ..., l_{T_l}}, where T_l is the number of words in the opinion speech video (in this embodiment, T_l = 20) and l_t represents the 300-dimensional GloVe word-embedding vector feature. Use the FACET facial-expression analysis framework to extract the FACET visual features as v = {v_1, v_2, v_3, ..., v_{T_v}}, where T_v is the total number of frames of the video and the p visual features extracted at the j-th frame are v_j (in this embodiment, p = 46). Use the COVAREP acoustic-analysis framework to extract the COVAREP audio features as a = {a_1, a_2, ..., a_{T_a}}, where T_a is the segmented fr...
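The three modalities arrive at different temporal rates (T_l words, T_v frames, T_a audio segments) and, per the abstract, must be aligned at the word level. The sketch below shows one simple way this is commonly done: average the frames falling in each word's span. The uniform split (in place of forced alignment), the sequence lengths T_v and T_a, and the 74-dimensional COVAREP vector are all assumptions for illustration; only T_l = 20, the 300-d GloVe embedding, and p = 46 come from the text.

```python
import numpy as np

# Illustrative per-modality feature matrices (time steps x feature dim).
T_l, T_v, T_a = 20, 60, 120          # T_v, T_a assumed for the example
text   = np.zeros((T_l, 300))        # 300-d GloVe embeddings (from the text)
vision = np.zeros((T_v, 46))         # p = 46 FACET features (from the text)
audio  = np.zeros((T_a, 74))         # 74-d COVAREP vector (assumed)

def align_to_words(feats: np.ndarray, n_words: int) -> np.ndarray:
    """Word-level alignment: split the frame sequence into n_words
    contiguous spans and average each span (a uniform-split stand-in
    for true word-boundary forced alignment)."""
    spans = np.array_split(feats, n_words)
    return np.stack([span.mean(axis=0) for span in spans])

v_aligned = align_to_words(vision, T_l)
a_aligned = align_to_words(audio, T_l)
print(v_aligned.shape, a_aligned.shape)  # (20, 46) (20, 74)
```

After alignment, all three modalities share the word-level time axis of length T_l = 20, so each can be fed to its own Bi-GRU over the same number of steps.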



Abstract

The invention discloses an emotion recognition method based on a bidirectional gated recurrent unit network and novel network initialization. The method comprises the steps of: extracting high-dimensional features of the three modalities of text, vision, and audio, and aligning them at the word level; normalizing the data and inputting it into a bidirectional gated recurrent unit network for training; applying a network initialization method to initialize the weights of the bidirectional gated recurrent unit network and the fully connected network at the start of training for each modality's network; extracting features from the state information output by the bidirectional gated recurrent unit network using a max-pooling layer and an average-pooling layer; concatenating the two pooled feature vectors as the input features of the fully connected network; and inputting the text, vision, and audio to be recognized into each modality's trained bidirectional gated recurrent unit network to obtain each modality's emotion-intensity output. The method overcomes the long-term dependence problem, improves the robustness of the bidirectional gated recurrent unit network during training, and improves the accuracy of emotion recognition based on emotional time-context information.
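The pooling head described in the abstract (max-pool and average-pool the Bi-GRU state sequence over time, then concatenate the two vectors as the fully connected network's input) can be sketched as follows. The sequence length and hidden size are illustrative, and the Bi-GRU outputs are stubbed with random values; only the pooling-and-concatenation structure reflects the abstract.

```python
import numpy as np

def pooled_features(states: np.ndarray) -> np.ndarray:
    """Max-pool and average-pool a (T, D) Bi-GRU state sequence over
    the time axis, then concatenate the two D-vectors into a single
    2D-vector for the fully connected network."""
    max_pool = states.max(axis=0)    # (D,)
    avg_pool = states.mean(axis=0)   # (D,)
    return np.concatenate([max_pool, avg_pool])

T, hidden = 20, 64                   # illustrative sizes
# A Bi-GRU concatenates forward and backward states, giving D = 2*hidden.
states = np.random.default_rng(0).normal(size=(T, 2 * hidden))
x = pooled_features(states)
print(x.shape)  # (256,)
```

Combining max and average pooling keeps both the strongest activation over the utterance and its overall trend, which is why the abstract feeds their concatenation, rather than either alone, into the fully connected network.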

Description

Technical field

[0001] The present invention relates to the fields of text processing, audio processing, visual processing, feature extraction, deep learning, recurrent neural networks, and emotion recognition, and in particular to an emotion recognition method.

Background technique

[0002] Emotion recognition is a research hotspot in the field of natural language processing. The main challenge of emotion recognition is conducting continuous, real-time analysis of the speaker's emotion. There are many ways to model emotion from a language perspective, including focusing on fixed opinion words, N-gram language models, emotion composition and dependency-based analysis, and distributional representations of emotion. Audio- and visual-based emotion recognition is closely related to multimodal sentiment analysis. Both audio and visual features have been shown to be useful in emotion recognition, and the joint use of facial expression and audio informa...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Application (China)
IPC(8): G06F17/27, G06N3/08, G06N3/04
CPC: G06N3/084, G06F40/30, G06N3/044
Inventor: 宦若虹, 鲍晟霖, 葛罗棋, 谢超杰
Owner: ZHEJIANG UNIV OF TECH