Text sentiment analysis method based on graph attention network

A technology of sentiment analysis and attention, applied in biological neural network models, instruments, electrical digital data processing, etc. It addresses problems such as difficulty in expressing the syntactic structure of text, low accuracy of text sentiment classification, and difficulty in capturing syntactic dependencies, achieving the effect of improved accuracy.

Pending Publication Date: 2021-03-26
CENT SOUTH UNIV
Cites: 0 | Cited by: 13

AI-Extracted Technical Summary

Problems solved by technology

[0005] The present invention provides a text sentiment analysis method based on a graph attention network, and its purpose is to solve the problem that traditional sentiment analysis methods are di...

Method used

The text sentiment analysis method based on a graph attention network described in the above embodiment of the present invention uses a GRU to model the current state of each node: the initialization state of the node is input into the GRU model and preserved, and the temporary state of the node is aggregated with the saved node state to obtain the final node state, which improves the convergence of the graph attention network.
In the text sentiment analysis method based on a graph attention network described in the above embodiment of the present invention, because a sentence in the text data may contain evaluations of multiple aspects, sentiment analysis must be carried out for the different aspects; the Biaffine dependency parser is used to obtain the syntactic dependency graph, which divides a sentence into a linear sequence of words and transforms it into a graph structu...

Abstract

The invention provides a text sentiment analysis method based on a graph attention network. The text sentiment analysis method comprises the following steps: step 1, obtaining a text set and a sentiment label set from the SemEval 2014 Task 4 data set; step 2, randomly selecting from the text set and the sentiment label set in proportion to obtain a training set and a test set; step 3, performing syntactic dependency analysis on the sentences in the training set with a Biaffine dependency parser, and constructing a syntactic dependency graph according to the syntactic dependency relations of the sentences; and step 4, inputting the training set into a BERT pre-training model, and converting the words in the training set into word vectors through the BERT pre-training model. According to the invention, the syntactic dependency relations of the sentences are analyzed by the Biaffine dependency parser, word vector representations are obtained through the BERT pre-training model, and sentiment analysis is conducted on the text through the graph attention network model; the complex syntactic structure in the text is fully utilized, and the accuracy of text sentiment analysis is improved.

Application Domain

Character and pattern recognition; Natural language data processing +1

Technology Topic

Graph based; Network model +10


Examples

  • Experimental program (1)

Example Embodiment

[0060] In order to make the technical problems to be solved, the technical solutions and the advantages of the present invention clearer, the invention is described in detail below with reference to the accompanying drawings and specific embodiments.
[0061] The present invention addresses the problems that traditional methods find it difficult to capture the syntactic dependencies between the words of a sentence, difficult to express the complex syntactic structure in the text, and that the accuracy of text sentiment classification is low, and provides a text sentiment analysis method based on a graph attention network.
[0062] As shown in Figures 1 to 3, an embodiment of the present invention provides a text sentiment analysis method based on a graph attention network, including: step 1, acquiring a text set and a sentiment label set from the SemEval 2014 Task 4 data set; step 2, randomly selecting from the text set and the sentiment label set in proportion to obtain a training set and a test set; step 3, analyzing the syntactic dependency relations of the sentences in the training set with the Biaffine dependency parser, and constructing a syntactic dependency graph according to the syntactic dependency relations of the sentences; step 4, inputting the training set into the BERT pre-training model, and converting the words in the training set into word vectors; step 5, constructing an adjacency matrix according to the syntactic dependency graph; step 6, building the graph attention network model according to the adjacency matrix; step 7, embedding the word vectors into the corresponding nodes of the graph attention network model, and using the word vectors as the initialization states of the nodes; step 8, updating the graph attention network model, aggregating each node with the vectors of its neighboring nodes according to the attention weights to obtain an updated vector sequence, and using the vector sequence as the temporary state of the corresponding node; step 9, inputting the initialization states of the graph attention network model nodes into the GRU model for saving, to obtain the saved states of the nodes; step 10, aggregating the temporary states of the nodes with the saved states of the nodes to obtain the final states of the graph attention network model nodes; step 11, activating the final states of the nodes through the softmax function to obtain the sentiment tendency of the text; step 12, performing multi-layer training on the graph attention network model and constructing a loss function; step 13, adjusting the attention weights according to the loss function, and when the loss function value is less than the recorded minimum of the loss function, updating the minimum loss function value and recording the corresponding graph attention network model parameters to obtain the optimal graph attention network model; and step 14, performing sentiment analysis on the text through the optimal graph attention network model.
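As a non-authoritative illustration of steps 1 and 2, the following Python sketch loads sentence/aspect/polarity triples from a SemEval 2014 Task 4 XML file and splits them proportionally at random. The file path, helper names, and the 80/20 ratio are assumptions; the patent only states that the data set is SemEval 2014 Task 4 and that the split is made by random selection in proportion.

```python
# Minimal sketch of steps 1-2: load the text set and sentiment label set,
# then split them proportionally into a training set and a test set.
import random
import xml.etree.ElementTree as ET

def load_semeval_2014_task4(path):
    """Parse a SemEval 2014 Task 4 XML file into (sentence, aspect) / polarity pairs."""
    texts, labels = [], []
    for sentence in ET.parse(path).getroot().iter("sentence"):
        raw = sentence.find("text").text
        for term in sentence.iter("aspectTerm"):
            texts.append((raw, term.get("term")))     # sentence plus aspect term
            labels.append(term.get("polarity"))        # sentiment label
    return texts, labels

def proportional_split(texts, labels, train_ratio=0.8, seed=42):
    """Randomly select items in proportion to form the training and test sets."""
    idx = list(range(len(texts)))
    random.Random(seed).shuffle(idx)
    cut = int(train_ratio * len(idx))
    train = [(texts[i], labels[i]) for i in idx[:cut]]
    test = [(texts[i], labels[i]) for i in idx[cut:]]
    return train, test
```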
[0063] The step 3 specifically includes: performing syntactic dependency analysis on the sentences in the training set with the Biaffine dependency parser, dividing a sentence into a linear sequence of words and transforming it into the graph structure of its syntactic dependencies to obtain the syntactic dependency graph, in which words that have modifying relationships are connected by syntactic dependency arcs.
[0064] According to the text sentiment analysis method based on a graph attention network described in the above embodiment of the present invention, because a sentence in the text data may contain evaluations of multiple aspects, sentiment analysis must be carried out for the different aspects; the Biaffine dependency parser is used to obtain the syntactic dependency graph, dividing a sentence into a linear sequence of words and converting it into the graph structure of its syntactic dependencies, and connecting each aspect word with the words that modify it, which reduces the interference of irrelevant information and the mutual influence between different aspects and gives a more exact description of the sentence.
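A minimal sketch of step 3 is shown below. spaCy is used here only as a stand-in dependency parser for illustration (the patent specifies a Biaffine dependency parser); both assign one head word to each word, from which the dependency arcs of the graph can be read off. The model name and helper function are assumptions.

```python
# Minimal sketch of step 3: turn a sentence into a syntactic dependency graph,
# i.e. a linear sequence of words plus (head, dependent) arcs.
import spacy

nlp = spacy.load("en_core_web_sm")   # assumes this spaCy model is installed

def dependency_edges(sentence):
    """Return the word list and the (head, dependent) dependency arcs."""
    doc = nlp(sentence)
    words = [tok.text for tok in doc]
    # Each token points to its syntactic head; the root points to itself and is skipped.
    edges = [(tok.head.i, tok.i) for tok in doc if tok.head.i != tok.i]
    return words, edges

words, edges = dependency_edges("The food was great but the service was slow.")
```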
[0065] The step 4 specifically includes: inputting the training set into the BERT pre-training model, and converting the words in the training set into 300-dimensional word vectors through the Transformer architecture of the BERT pre-training model.
[0066] According to the text sentiment analysis method based on a graph attention network described in the above embodiment of the present invention, the BERT pre-training model is used so that the semantic information of the words is more fully reflected.
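A minimal sketch of step 4, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint. Because that model outputs 768-dimensional hidden states while the patent states 300-dimensional word vectors, an extra linear projection to 300 dimensions is added here as an assumption to match the stated dimension.

```python
# Minimal sketch of step 4: convert the words of a sentence into word vectors
# with a BERT pre-training model.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
project = torch.nn.Linear(768, 300)   # assumed projection to the stated 300 dims

def word_vectors(sentence):
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = bert(**inputs).last_hidden_state   # (1, n_tokens, 768)
    # Note: in practice sub-word pieces would still be pooled back to whole words.
    return project(hidden.squeeze(0))               # (n_tokens, 300)
```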
[0067] The step 5 specifically includes: constructing an adjacency matrix according to the asymmetric binary relations between the nodes in the graph; when an element of the adjacency matrix is 1, a directed arc connects the two nodes at the corresponding coordinates, and when an element of the adjacency matrix is 0, there is no directed arc between the two nodes at the corresponding coordinates.
[0068] The step 6 specifically includes: taking the nodes of the adjacency matrix as the nodes of the graph and the arcs as the edges, and building the nodes and edges of the graph attention network model accordingly.
[0069] According to the text sentiment analysis method based on a graph attention network described in the above embodiment of the present invention, the nodes and arcs in the syntactic dependency graph correspond respectively to the nodes and edges of the graph attention network model.
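A minimal sketch of steps 5 and 6: building the adjacency matrix from the dependency arcs, with entry (i, j) = 1 when a directed arc connects the nodes at those coordinates and 0 otherwise, as described above. The helper name is illustrative, and the added self-loops are a common practical choice that is not stated in the patent.

```python
# Minimal sketch of steps 5-6: dependency arcs -> adjacency matrix of the
# graph attention network model.
import torch

def build_adjacency(n_nodes, edges):
    adj = torch.zeros(n_nodes, n_nodes)
    for head, dependent in edges:
        adj[head, dependent] = 1.0    # 1 = directed arc head -> dependent
    adj += torch.eye(n_nodes)         # self-loops (assumption, not in the patent)
    return adj
```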
[0070] The step 7 specifically includes: embedding the 300-dimensional word vectors into the corresponding nodes of the graph attention network model as the initialization states of the nodes in the graph attention network model.
[0071] Wherein, the step 8 specifically includes: updating the graph attention network layer as follows:
[0072] α_ij = exp(LeakyReLU(a^T [W h_i || W h_j])) / Σ_{k∈N_i} exp(LeakyReLU(a^T [W h_i || W h_k]))   (1)
[0073] where α_ij denotes the attention coefficient of node j with respect to node i, N denotes the number of nodes, W denotes the linear transformation weight matrix applied to each node, a denotes the weight vector of the attention mechanism, h_i denotes the entity vector of node i, h_j denotes the entity vector of node j, h_k denotes the entity vector of node k, and N_i denotes the set of neighbor nodes of node i;
[0074] The contextual information of each node is captured by the multi-head attention mechanism: the representations of the nodes around each node are aggregated in the form of a weighted sum, and the results computed under K independent attention mechanisms are concatenated (with concatenation replaced by averaging in the final layer), as shown below:
[0075] h_i' = ||_{k=1}^{K} σ( Σ_{j∈N_i} α_ij^k W^k h_j )   (2)
[0076] where h_i' denotes the updated value of node i, k denotes the k-th of the multiple attention mechanisms, || denotes the concatenation of the features produced by the multiple attention heads, σ denotes the activation function, α_ij^k denotes the attention of node i to node j under the k-th head, and W^k denotes the linear transformation weight matrix applied to the input nodes.
[0077] According to the text sentiment analysis method based on a graph attention network described in the above embodiment of the present invention, the multi-head attention mechanism is introduced to capture contextual information and stabilize the learning process.
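The following PyTorch sketch implements the multi-head attention update of equations (1) and (2). The class name, head count, and the masking of non-neighbours with the adjacency matrix from step 5 are illustrative choices under the stated assumptions, not the patent's own implementation.

```python
# Minimal sketch of the graph attention update in equations (1)-(2).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    def __init__(self, in_dim, out_dim, n_heads=4):
        super().__init__()
        self.n_heads = n_heads
        self.out_dim = out_dim
        # W^k: one linear transformation per attention head
        self.W = nn.Linear(in_dim, out_dim * n_heads, bias=False)
        # a: attention weight vector per head, applied to [W h_i || W h_j]
        self.a = nn.Parameter(torch.empty(n_heads, 2 * out_dim))
        nn.init.xavier_uniform_(self.a)

    def forward(self, h, adj):
        # h: (N, in_dim) node states; adj: (N, N) adjacency matrix (1 = arc)
        N = h.size(0)
        Wh = self.W(h).view(N, self.n_heads, self.out_dim)         # (N, K, F)
        # e_ij = LeakyReLU(a^T [W h_i || W h_j]) split into source/target parts
        src = (Wh * self.a[:, :self.out_dim]).sum(-1)              # (N, K)
        dst = (Wh * self.a[:, self.out_dim:]).sum(-1)              # (N, K)
        e = F.leaky_relu(src.unsqueeze(1) + dst.unsqueeze(0))      # (N, N, K)
        # Mask non-neighbours so the softmax in eq. (1) runs over N_i only
        e = e.masked_fill(adj.unsqueeze(-1) == 0, float("-inf"))
        alpha = torch.softmax(e, dim=1)                            # alpha_ij^k
        # Weighted aggregation of neighbour features, then head concat (eq. 2)
        out = torch.einsum("ijk,jkf->ikf", alpha, Wh)              # (N, K, F)
        return F.elu(out.reshape(N, -1))                           # (N, K*F)
```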
[0078] The step 9 specifically includes: modeling the current state of the node with a GRU model; the node state is input into the GRU model and, combined with the input x_t of the node at time t, the reset gate and the update gate are computed, as shown below:
[0079] r_t = σ(W_r [h_{t-1}, x_t])   (3)
[0080] z_t = σ(W_z [h_{t-1}, x_t])   (4)
[0081] h̃_t = tanh(W [r_t * h_{t-1}, x_t])   (5)
[0082] where σ denotes the sigmoid function, which maps its input into the range 0-1 to act as a gating signal, r_t denotes the reset gate at time t, z_t denotes the update gate at time t, h_{t-1} denotes the node state at time t-1, h̃_t denotes the candidate state at time t, * denotes the element-wise product, [ , ] denotes the concatenation of two vectors, and tanh denotes the hyperbolic tangent function.
[0083] Wherein, the step 9 further includes: performing forgetting and selective memory simultaneously through the same gate z; the gating signal z ranges over 0-1, and the closer the gating signal is to 1, the more important the corresponding data is, as shown below:
[0084] h_t = (1 - z_t) * h_{t-1} + z_t * h̃_t   (6)
[0085] where 1 - z_t represents the forget gate, (1 - z_t) * h_{t-1} discards the unimportant information in the node state at time t-1, and z_t * h̃_t selects the important information from the candidate state at time t.
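A minimal PyTorch sketch of the gated state update in equations (3) to (6); the module and weight names follow the equations, and the layer sizes are assumptions.

```python
# Minimal sketch of the GRU-style node state update, equations (3)-(6).
import torch
import torch.nn as nn

class GRUStateUpdate(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.W_r = nn.Linear(2 * dim, dim)   # reset gate weights, eq. (3)
        self.W_z = nn.Linear(2 * dim, dim)   # update gate weights, eq. (4)
        self.W_h = nn.Linear(2 * dim, dim)   # candidate state weights, eq. (5)

    def forward(self, x_t, h_prev):
        # [h_{t-1}, x_t]: concatenation of previous state and current input
        hx = torch.cat([h_prev, x_t], dim=-1)
        r_t = torch.sigmoid(self.W_r(hx))                                  # eq. (3)
        z_t = torch.sigmoid(self.W_z(hx))                                  # eq. (4)
        h_cand = torch.tanh(self.W_h(torch.cat([r_t * h_prev, x_t], -1)))  # eq. (5)
        return (1 - z_t) * h_prev + z_t * h_cand                           # eq. (6)
```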
[0086] The feedforward process of the entire graph attention network model is expressed as:
[0087] H^{l+1} = GRU(GAT(H^l), H^l)   (7)
[0088] where H^{l+1} denotes the node states of layer l+1 and H^l denotes the node states of layer l.
[0089] According to the text sentiment analysis method based on a graph attention network described in the above embodiment of the present invention, the GRU is used to model the current state of each node: the initialization state of the node is input into the GRU model and saved, and the temporary state of the node is aggregated with the saved node state to obtain the final node state, which improves the convergence of the graph attention network.
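The feedforward process of equation (7) can be sketched by stacking the two illustrative modules above: each layer computes the temporary state GAT(H^l) and then aggregates it with the previous state through the GRU update. The number of layers and the equal input/output width are assumptions.

```python
# Minimal sketch of equation (7): H^{l+1} = GRU(GAT(H^l), H^l).
# GATLayer and GRUStateUpdate refer to the sketches above.
import torch.nn as nn

class GATGRUEncoder(nn.Module):
    def __init__(self, dim, n_layers=2, n_heads=4):
        super().__init__()
        # dim is assumed divisible by n_heads so each layer keeps width dim
        self.gat_layers = nn.ModuleList(
            [GATLayer(dim, dim // n_heads, n_heads) for _ in range(n_layers)])
        self.gru = GRUStateUpdate(dim)

    def forward(self, h, adj):
        # h: (N, dim) initial node states (the BERT word vectors); adj: (N, N)
        for gat in self.gat_layers:
            temp = gat(h, adj)        # temporary state GAT(H^l)
            h = self.gru(temp, h)     # final state of the layer, eq. (7)
        return h
```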
[0090] The step 11, the step 12, the step 13 and the step 14 specifically include: training the model by minimizing the cross-entropy loss function with the L2 regularization method; the state of the target node is mapped to the classification space by a linear transformation, and the probability that the target node belongs to sentiment class k is calculated through the softmax function, as shown below:
[0091] P(k | h_t) = exp(W_k h_t + b_k) / Σ_{k'∈Y} exp(W_{k'} h_t + b_{k'})   (8)
[0092] where W denotes the weight matrix of the linear transformation, h_t denotes the target node state, b denotes the bias of the linear transformation, and Y denotes the set of sentiment categories.
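A minimal sketch of the classification head and training objective described above: a linear transformation maps the target node state to the classification space, the softmax/cross-entropy loss of equation (8) is minimized, and L2 regularization is applied through weight decay. The optimizer, learning rate, weight-decay value, and class count are assumptions.

```python
# Minimal sketch of the classification and training objective, eq. (8).
import torch
import torch.nn as nn

n_classes = 3                       # e.g. positive / neutral / negative (assumption)
dim = 300                           # word-vector dimension stated in the patent
classifier = nn.Linear(dim, n_classes)            # W h_t + b
criterion = nn.CrossEntropyLoss()                 # softmax + cross-entropy
params = list(classifier.parameters())            # plus encoder parameters in practice
optimizer = torch.optim.Adam(params, lr=1e-3, weight_decay=1e-5)  # L2 regularization

def training_step(h_target, label):
    """h_target: (batch, dim) final states of the target nodes; label: class indices."""
    logits = classifier(h_target)                 # map node state to class space
    loss = criterion(logits, label)               # eq. (8) with cross-entropy
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```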
[0093] According to the text sentiment analysis method based on a graph attention network described in the above embodiment of the present invention, the syntactic dependency relations between the words of a sentence are analyzed by the Biaffine dependency parser and the syntactic dependency graph is constructed accordingly; the adjacency matrix is constructed according to the syntactic dependency graph, and the graph attention network model is built according to the adjacency matrix; the words are converted into word vectors through the BERT pre-training model, and the word vectors are embedded into the graph attention network model as the initialization states of its nodes; the graph attention network model is updated, the vectors of each node and its neighborhood are aggregated according to the attention weights to obtain a new vector sequence, and the new vector sequence is used as the temporary state of the corresponding node; the initialization states of the nodes are input into the GRU model and saved to obtain the saved states of the nodes, and the temporary states of the nodes are aggregated with the saved states to obtain the final states of the graph attention network model nodes; the final states are activated through the softmax function to obtain the sentiment tendency of the text; the graph attention network model is trained over multiple layers, the loss function is constructed, and the attention weights are adjusted to obtain the optimal graph attention network model, through which sentiment analysis is performed on the text. By analyzing the syntactic dependencies with the Biaffine dependency parser, obtaining word vector representations through the BERT pre-training model, and performing sentiment analysis on the text through the graph attention network model, the complex syntactic structure in the text is fully expressed, and the accuracy of text sentiment analysis is improved.
[0094] The above is a preferred embodiment of the present invention. It should be noted that several improvements and modifications can be made without departing from the principles of the present invention, and these improvements and modifications should also be regarded as falling within the scope of protection of the present invention.

