Multi-head attention memory network for short text sentiment classification

A sentiment classification technology based on attention mechanisms, applied to text database clustering/classification, text database querying, biological neural network models, and the like. It addresses problems such as the difficulty of mining deep internal relations in short texts and of effectively encoding their emotional semantic structure.

Active Publication Date: 2021-05-11
UNIV OF ELECTRONICS SCI & TECH OF CHINA

Problems solved by technology

[0008] The purpose of the present invention is to overcome the problems in the prior art that it is difficult to mine the deeper internal relations of short texts and to effectively encode their emotional semantic structure.



Examples


Embodiment 1

[0045] As shown in Figure 1, Embodiment 1 provides a multi-head attention memory network for short text sentiment classification. The network comprises a multi-hop memory sub-network consisting of a plurality of sequentially connected independent computing modules; this embodiment specifically uses two such modules (hops). Each independent computing module comprises, connected in sequence, a first multi-head attention coding layer, a first linear layer, and an output layer. The first multi-head attention coding layer computes attention from the input historical information memory and the original memory; the first linear layer linearizes the output of the first multi-head attention coding layer; and the output layer superimposes the output of the first linear layer onto the historical information memory to obtain a more accurate high-level abstract data representation. Wherein, the original memory i...
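To make the module structure concrete, here is a minimal PyTorch sketch of one independent computing module (hop). It is an interpretation of this embodiment, not the patent's reference implementation; the class name and the d_model and n_heads values are assumptions:

```python
import torch
import torch.nn as nn

class HopModule(nn.Module):
    """One independent computing module (hop), sketched from Embodiment 1:
    multi-head attention over the historical and original memories,
    a linear layer, and a residual superposition as the output layer."""

    def __init__(self, d_model: int = 128, n_heads: int = 4):
        super().__init__()
        # First multi-head attention coding layer.
        self.attention = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # First linear layer.
        self.linear = nn.Linear(d_model, d_model)

    def forward(self, history: torch.Tensor, original: torch.Tensor) -> torch.Tensor:
        # history:  historical information memory, (batch, seq_len, d_model)
        # original: original memory,               (batch, seq_len, d_model)
        attn_out, _ = self.attention(query=history, key=original, value=original)
        # Output layer: superimpose the linearized attention output
        # onto the historical information memory (a residual connection).
        return self.linear(attn_out) + history
```

With two hops, as in this embodiment, the forward pass would chain the modules: `m1 = hop1(m0, original)` followed by `m2 = hop2(m1, original)`.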

Embodiment 2

[0118] This embodiment shares the inventive concept of Embodiment 1 and, on its basis, provides a short text sentiment classification method based on the multi-head attention memory network. The method specifically includes the following steps (an end-to-end code sketch follows the steps):

[0119] S01: Obtain the word vector matrix of the short text, convert the word vector matrix into n-gram features to generate an N-gram feature matrix, and model the N-gram feature matrix to mine the dependencies and hidden meanings of the phrases in the text, obtaining a high-level feature representation of the input text;

[0120] S02: Abstractly transform the n-gram feature sequence: perform multi-head attention calculation on the high-level feature representation and then apply linearization to obtain the historical information memory;

[0121] S03: Perform multi-head attention calculation on the historical information memory and the N-gram features, superimpose the results after linear processing, and...
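Below is a hedged end-to-end sketch of S01-S03 in PyTorch, reusing the HopModule sketch from Embodiment 1. The Conv1d n-gram extractor, mean pooling, vocabulary size, and classifier head are illustrative assumptions; the patent does not prescribe these specifics:

```python
import torch
import torch.nn as nn

class ShortTextSentimentNet(nn.Module):
    """Illustrative S01-S03 pipeline: word vectors -> N-gram feature
    matrix (original memory) -> multi-hop attention memory -> logits."""

    def __init__(self, vocab_size: int, d_model: int = 128,
                 ngram: int = 3, hops: int = 2, n_classes: int = 2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # S01: a 1-D convolution over word vectors is one plausible way
        # to form the N-gram feature matrix (an assumption here).
        self.ngram_conv = nn.Conv1d(d_model, d_model,
                                    kernel_size=ngram, padding=ngram // 2)
        self.hops = nn.ModuleList([HopModule(d_model) for _ in range(hops)])
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(token_ids)                        # (batch, seq, d_model)
        # S01: N-gram feature matrix, used as the original memory.
        original = self.ngram_conv(x.transpose(1, 2)).transpose(1, 2)
        # S02/S03: recursive multi-hop attention and superposition;
        # initializing the historical memory from the features is assumed.
        memory = original
        for hop in self.hops:
            memory = hop(memory, original)
        # Pool over the sequence and classify (assumed mean pooling).
        return self.classifier(memory.mean(dim=1))
```

For example, `ShortTextSentimentNet(vocab_size=30000)(torch.randint(0, 30000, (8, 40)))` would yield an `(8, 2)` tensor of sentiment logits for a batch of eight 40-token texts.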

Embodiment 3

[0129] This embodiment provides a storage medium and shares the inventive concept of Embodiment 2. Computer instructions are stored on the medium, and when the computer instructions run, the steps of the short text sentiment classification method of Embodiment 2 are executed.

[0130] Based on this understanding, the essence of the technical solution of this embodiment, or the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions that cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods in the various embodiments of the present invention. The aforementioned storage medium includes media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.


Abstract

The invention discloses a multi-head attention memory network for short text sentiment classification. The network comprises a multi-hop memory sub-network consisting of a plurality of independent calculation modules connected in sequence, each of which comprises a first multi-head attention coding layer, a first linear layer, and an output layer connected in sequence. The input of each multi-head attention coding layer in the multi-hop memory sub-network comprises the original memory and the historical information memory. By stacking a sufficient number of independent calculation modules (hops), the multi-head attention memory network learns the more complex and abstract nonlinear features contained in a text, so that the emotional semantic structure of the text is effectively encoded. Furthermore, the recursive calculation of the multi-head attention coding layers makes the original memory fed into the multi-hop memory sub-network interact fully, so that long-range dependencies between text features are modeled more completely and higher-level contextual emotional semantics are mined, thereby improving the classification performance of the model.
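The recursive hop computation described in the abstract can be written compactly. This formalization is an interpretation, with $E$ the original memory, $M^{(t)}$ the historical information memory after hop $t$, and the initialization $M^{(0)} = E$ assumed:

$$M^{(t+1)} = \mathrm{Linear}\big(\mathrm{MultiHead}(Q = M^{(t)},\ K = E,\ V = E)\big) + M^{(t)}, \qquad t = 0, \dots, T-1,$$

where $T$ is the hop count and the final memory $M^{(T)}$ feeds the classifier.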

Description

Technical field

[0001] The invention relates to the technical field of natural language processing, and in particular to a multi-head attention memory network for short text sentiment classification.

Background technique

[0002] With the rapid development of Internet technology, social networks and e-commerce platforms have become the most important public information distribution centers, and using their huge volumes of data to analyze people's emotions and opinions has important social and scientific value. Sentiment analysis is the computational study of people's opinions, sentiments, emotions, appraisals, and attitudes toward products, services, organizations, individuals, issues, events, topics, and their attributes. It is a subtask of text classification, but unlike ordinary text classification it requires higher-level semantic extraction, which makes it more technically challenging. How to use natural language processing (NLP) technology to carry out s...

Claims


Application Information

IPC(8): G06F40/126; G06F40/30; G06F40/211; G06F16/33; G06F16/35; G06N3/04; G06N3/08
CPC: G06F40/126; G06F40/30; G06F40/211; G06F16/3344; G06F16/35; G06N3/08; G06N3/044; G06N3/045
Inventors: 李晓瑜 (Li Xiaoyu), 邓钰 (Deng Yu), 彭宇 (Peng Yu), 何子睿 (He Zirui), 雷航 (Lei Hang)
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA