Fine-grain emotion element extracting method based on local information representation

A technology of element extraction and local information representation, applied to neural learning methods, biological neural network models, instruments, etc. It solves the problems of incorrect part-of-speech judgment of phrases, many omissions in extraction results, and difficulty in judging whether the current word is part of the evaluation object, and achieves high accuracy.

Inactive Publication Date: 2017-12-26
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0008] The purpose of the present invention is to solve the problem that existing fine-grained emotional element extraction methods cannot make good use of the words that follow when extracting the evaluation object, which leads to incorrect part-of-speech judgment of phrases, many omissions in the extraction results, and difficulty in judging whether the current word is part of the evaluation object. To this end, a fine-grained emotional element extraction method based on local information representation is proposed.



Examples


Specific Embodiment 1

[0024] Specific Embodiment 1: The fine-grained emotional element extraction method based on local information representation in this embodiment includes:

[0025] Step 1: For each word within the preset window size, look up the vector representation of its word features through the Lookup Table; input the obtained word vectors into the LSTM model one by one, and also combine them into a single vector that is input to the feed-forward neural network model.

[0026] Step 2: Splice the hidden-layer feature representation h_t of the LSTM model with the local context feature representation h_lr of the feed-forward neural network model to obtain the spliced result h_con:

[0027] h_con = [h_t, h_lr]

[0028] Step 3: Send h_con to the output layer and use the softmax function for label classification to obtain the classification result.
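The three steps above can be sketched end to end. The following is a minimal NumPy sketch, not the patented implementation: the vector dimensions, the simplified single-layer LSTM cell, and the random weights are all illustrative assumptions, since the patent does not fix them.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, DIM, HID, LABELS, WIN = 100, 8, 16, 4, 3  # illustrative sizes (assumptions)

lookup_table = rng.normal(size=(VOCAB, DIM))     # word-feature vectors (Lookup Table)

# --- simplified LSTM cell: all four gates packed into one weight matrix ---
W = rng.normal(size=(4 * HID, DIM + HID)) * 0.1
b = np.zeros(4 * HID)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c):
    z = W @ np.concatenate([x, h]) + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c + i * g
    return o * np.tanh(c), c

# --- feed-forward net over the concatenated window vectors (local representation) ---
W_lr = rng.normal(size=(HID, WIN * DIM)) * 0.1

def local_rep(window_ids):
    x = np.concatenate([lookup_table[w] for w in window_ids])
    return np.tanh(W_lr @ x)

# --- output layer: softmax over labels applied to the spliced vector h_con ---
W_out = rng.normal(size=(LABELS, 2 * HID)) * 0.1

def classify(window_ids):
    h = c = np.zeros(HID)
    for w in window_ids:                 # Step 1: feed each word vector to the LSTM
        h, c = lstm_step(lookup_table[w], h, c)
    h_con = np.concatenate([h, local_rep(window_ids)])  # Step 2: splice h_t and h_lr
    scores = W_out @ h_con               # Step 3: softmax label classification
    e = np.exp(scores - scores.max())
    return e / e.sum()

probs = classify([5, 17, 42])            # previous, current, next word ids (made up)
print(probs.shape)
```

The sketch only shows the forward pass; training (e.g. backpropagation of the label loss) is outside the scope of these steps.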

[0029] The method also treats evaluation object extraction as a sequence labeling ...

Specific Embodiment 2

[0034] Specific Embodiment 2: This embodiment differs from Specific Embodiment 1 in that, in the softmax function, each label's probability P(y_t = k | s, θ) is computed as:

[0035] P(y_t = k | s, θ) = exp(W_k · h_con) / Σ_{k'∈K} exp(W_{k'} · h_con)

[0036] where W_k denotes the weight vector from the last hidden layer to the output layer for label k, k denotes a label category, K denotes the set of all possible labels, s and θ denote the current sentence and the model parameters respectively, and y_t denotes the currently predicted label.

[0037] Other steps and parameters are the same as those in Embodiment 1.
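The softmax computation of this embodiment can be checked numerically. In the following sketch the per-label scores (the products W_k · h_con) are made-up values for illustration:

```python
import numpy as np

# made-up per-label scores W_k . h_con for K = 4 labels
scores = np.array([2.0, 1.0, 0.5, -1.0])

# numerically stable softmax: P(y_t = k | s, theta) for each k
p = np.exp(scores - scores.max())
p /= p.sum()

print(p.argmax(), round(float(p.sum()), 6))  # highest-scoring label wins; probabilities sum to 1
```

Subtracting the maximum score before exponentiating does not change the result but avoids overflow for large scores.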

Specific Embodiment 3

[0038] Specific Embodiment 3: This embodiment differs from Embodiment 1 or Embodiment 2 in that the preset window size is 3. For the feed-forward neural network used to learn local information, word-vector inputs with different window sizes were tested, and extraction performed best with a window size of 3 (previous word, current word, next word). Therefore, the window size of the local information representation model is uniformly set to 3.

[0039] Other steps and parameters are the same as those in Embodiment 1 or Embodiment 2.
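A size-3 window centered on each token can be built as follows. The padding token for sentence boundaries is an assumption; the patent does not specify how the first and last words are handled.

```python
def windows(tokens, size=3, pad="<PAD>"):
    """Yield a (previous, current, next) window for each token position."""
    half = size // 2
    padded = [pad] * half + list(tokens) + [pad] * half
    return [tuple(padded[i:i + size]) for i in range(len(tokens))]

# each window feeds both the LSTM (word by word) and the
# feed-forward net (word vectors concatenated into one input)
print(windows(["the", "screen", "is", "great"]))
```

For the four-word example this yields one window per token, e.g. `("<PAD>", "the", "screen")` for the first position.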



Abstract

The invention provides a fine-grained emotion element extraction method based on local information representation, in order to solve the problems of existing fine-grained emotion element extraction methods: when an evaluation object is extracted, the immediately following word cannot be well utilized, so that the part-of-speech judgment of a phrase is wrong, the extraction result has many omissions, and it is difficult to judge whether the current word is part of the evaluation object. The extraction method comprises the following steps: for each word within a preset window size, the vector representation of its word features is found through a Lookup Table, and the obtained word vectors are input into an LSTM model respectively; the obtained word vectors are combined into one vector to be input to a feed-forward neural network model; the hidden-layer feature representation of the LSTM model and the local context feature representation of the feed-forward neural network model are spliced to obtain a spliced result; the spliced result is input to an output layer and classified with a softmax function to obtain the label. The method is suitable for use as a fine-grained emotion element extraction tool.

Description

Technical field

[0001] The invention relates to fine-grained emotional element extraction, in particular to a fine-grained emotional element extraction method based on local information representation.

Background technique

[0002] Fine-grained emotional element extraction aims to extract evaluation holders, evaluation objects, and evaluation expressions (as in figure 1). Evaluation holders are the entities that express opinions in a text; evaluation expressions are subjective expressions in a text that convey sentiments, emotions, opinions, or other personal states, usually adjectives or adjective phrases, such as "pretty" or "not very happy"; the evaluation object is the topic discussed in the text, concretely the object modified by the evaluation expression. [0003] Since most product reviews and social network texts contain clear user ID information, the research on evaluation holder extraction is relatively d...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC(8): G06F17/27 G06F17/30 G06N3/08
CPC: G06F16/35 G06F40/284 G06F40/289 G06N3/084
Inventor: 秦兵 赵妍妍 刘挺 袁建华
Owner HARBIN INST OF TECH