
A natural language emotion analysis method based on a deep network

A natural language processing and sentiment analysis technology, applied in the field of deep-network-based natural language sentiment analysis, that addresses the problem of existing methods not considering the semantic connection between the object word and its context.

Active Publication Date: 2019-03-29
SUN YAT SEN UNIV
Cites: 7, Cited by: 23

AI Technical Summary

Problems solved by technology

[0008] Existing mainstream methods based on the attention mechanism and memory networks consider only the positional relationship between the object word and the context when generating the memory sequence, and ignore the semantic connection between the object word and the context. To address this, the present invention proposes a deep-network-based natural language sentiment analysis method; the technical scheme adopted by the invention is described below.



Examples


Embodiment 1

[0056] As shown in Figures 1 to 3, a deep-network-based natural language sentiment analysis method includes an embedding module, a memory sequence construction module, a semantic dependency mask attention module, a context moment sentiment learning module, and an output module;

[0057] The embedding module uses an embedding lookup table pre-trained by an unsupervised method to convert the words in the corpus into corresponding word vectors; for out-of-vocabulary words that do not exist in the lookup table, a Gaussian distribution is used to randomly initialize a low-dimensional word embedding;
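
As a minimal illustration of the embedding step just described, the Python sketch below maps tokens to pre-trained vectors and falls back to a Gaussian-initialized vector for out-of-vocabulary words. The embedding dimension of 300, the standard deviation of 0.1, and the use of numpy are assumptions made for this example only; the patent does not fix these values here.

import numpy as np

EMBED_DIM = 300  # illustrative dimension; not specified in this paragraph

def embed_tokens(tokens, lookup_table, rng=np.random.default_rng(0)):
    """Map tokens to pre-trained vectors; OOV words get a random Gaussian vector."""
    vectors = []
    for tok in tokens:
        if tok in lookup_table:
            vectors.append(lookup_table[tok])
        else:
            # Out-of-vocabulary word: initialize from a Gaussian distribution.
            vectors.append(rng.normal(loc=0.0, scale=0.1, size=EMBED_DIM))
    return np.stack(vectors)  # shape: (sequence_length, EMBED_DIM)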

[0058] The memory sequence construction module converts the embedding sequence obtained by the embedding module into a memory sequence through a bidirectional long short-term memory (LSTM) unit; the converted memory sequence can be denoted M = {m_1, m_2, ..., m_n}, where n is the sequence length;
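
The following PyTorch sketch shows one way the memory sequence construction could look: a bidirectional LSTM over the embedding sequence whose per-token outputs serve as the memory vectors m_1 ... m_n. The hidden size and the concatenation of the forward and backward states are assumptions of this sketch, not specifics taken from the patent.

import torch
import torch.nn as nn

class MemoryBuilder(nn.Module):
    """Bidirectional LSTM that turns word embeddings into memory vectors m_1 ... m_n."""
    def __init__(self, embed_dim=300, hidden_dim=150):
        super().__init__()
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)

    def forward(self, embeddings):           # (batch, n, embed_dim)
        memory, _ = self.bilstm(embeddings)  # (batch, n, 2 * hidden_dim)
        return memory                        # one memory vector per context word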

[0059] The semantic dependency mask attention module extracts semantic dependency information ...
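
The paragraph above is truncated in the source, but the abstract states that the semantic dependency information is obtained from a dependency syntax tree and is used to guide the attention mechanism. The sketch below illustrates one plausible building block under that reading: computing the dependency-tree distance from every context word to the object word and turning it into a 0/1 attention mask. The networkx graph representation, the max_hops threshold, and the hard mask are illustrative assumptions, not the patent's exact formulation.

import networkx as nx

def dependency_distances(edges, n_words, target_idx):
    """Shortest-path distance from every word to the object word in the dependency tree.

    `edges` is a list of (head_index, dependent_index) pairs from a dependency parse.
    """
    graph = nx.Graph(edges)
    graph.add_nodes_from(range(n_words))   # keep isolated tokens (e.g. punctuation)
    dist = nx.single_source_shortest_path_length(graph, target_idx)
    return [dist.get(i, n_words) for i in range(n_words)]  # unreachable -> large value

def distance_mask(distances, max_hops):
    """1.0 for context words within `max_hops` of the object word, 0.0 otherwise."""
    return [1.0 if d <= max_hops else 0.0 for d in distances]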

Embodiment 2

[0094] In this example:

[0095] Given the sentence "Great food but the service was dreadful!" and the object words "food" and "service".

[0096] Results: The RAM model judged both object words in this sentence as having positive sentiment, while the DMMN-SDCM model correctly identified the respective sentiment polarities of the two object words.

[0097] Analysis: Because the DMMN-SDCM model replaces traditional text-distance information with semantic dependency information, it can judge that the context word "dreadful" has a stronger influence on the object word "service" than the context word "Great" does, which makes the sentiment polarity judgment for that object word more reliable. In addition, owing to the introduction of the context moment learning task, the model can learn the relationship between the object words "food" and "service", namely their contrastive relationship, while constructing the context memory sequence, and thus build a more informative memory ...



Abstract

The invention provides a natural language sentiment analysis method based on a deep network. On the basis of a memory network, semantic dependency information is introduced to guide the execution of the attention mechanism, and context moment information, which contains the overall sentiment information of a sentence, is also used to provide background information for the object word currently being analyzed. The whole model includes an embedding module, a memory sequence construction module, a semantic dependency mask attention module, a context moment affective learning module, and an output module. In the model, the semantic dependency information between object words and context, obtained from a dependency syntax tree, is introduced into the memory network, so that the memory sequences of each layer are generated dynamically, which guides the execution of the attention mechanism in the multi-layer module of the memory network. In addition, to introduce the overall affective information of a sentence, that is, the relationship information among all the object words in the same sentence, a context moment learning task is proposed, which assists the sentiment analysis of a specific object word through multi-task learning.
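
As a hedged sketch of the multi-task learning mentioned at the end of the abstract, the snippet below combines the object-level sentiment classification loss with an auxiliary context-moment loss using a simple weighted sum. The cross-entropy form of both losses and the aux_weight hyperparameter are assumptions; the abstract only states that the auxiliary task assists the main task through multi-task learning.

import torch
import torch.nn.functional as F

def total_loss(target_logits, target_labels, moment_logits, moment_labels,
               aux_weight=0.5):
    """Multi-task objective: object-level sentiment loss plus an auxiliary
    context-moment loss. aux_weight is an assumed hyperparameter; the source
    only says the two tasks are trained jointly."""
    main = F.cross_entropy(target_logits, target_labels)
    aux = F.cross_entropy(moment_logits, moment_labels)
    return main + aux_weight * aux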

Description

Technical Field

[0001] The present invention relates to the field of sentiment analysis in computer natural language processing, and more specifically, to a deep-network-based natural language sentiment analysis method.

Background

[0002] The main goal of the object-level sentiment analysis task is, for one or more evaluation objects in a given sentence, to give the sentiment polarity (such as positive, negative, or neutral) of each evaluation object. For example, given the sentence "the price of this restaurant is very cheap, but the service is poor" and the evaluation objects "price" and "service", the sentiment polarity for "price" is positive, while the sentiment polarity for "service" is negative. Obviously, for different object words in the same sentence, the sentiment analysis results may differ.

[0003] As the attention mechanism and memory network have achieved good ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/35; G06N3/04
CPC: G06N3/04
Inventor: 杨猛, 林佩勤
Owner: SUN YAT SEN UNIV