
A Relation Extraction Method Based on the Combination of Attention Mechanism and Graph Long Short-term Memory Neural Network

A long short-term memory and relation extraction technology, applied to neural learning methods, biological neural network models, neural architectures, etc. It addresses problems such as information loss, poor extraction of time-series information, and error accumulation, all of which degrade model performance.

Active Publication Date: 2022-05-27
CHINA UNIV OF MINING & TECH

AI Technical Summary

Problems solved by technology

[0004] Dependency-relation models construct a relation extraction model by evolving a convolutional neural network or a long short-term memory network into a graph-structured or tree-structured neural network. Among these, the graph convolutional network is the most widely used: it learns well from graph-structured data, but it struggles to handle time-series data effectively. This means that, for text data with time-series characteristics, a graph convolutional network alone cannot adequately extract the time-series information in the text. In addition, traditional dependency models rely entirely on the syntactic dependency tree: if the tree is parsed incorrectly, or useful information is discarded during parsing, errors accumulate and information is lost.




Embodiment Construction

[0069] The technical solutions of the present invention will be further described below with reference to the accompanying drawings and embodiments.

[0070] A relation extraction method based on the combination of an attention mechanism and a graph long short-term memory neural network according to the present invention is shown in Figure 1. The specific steps of relation extraction are as follows:

[0071] Step 1: Obtain a relation extraction dataset, preprocess its text data, and generate a word vector matrix for extracting the sentence's time-series context features and an adjacency matrix for extracting the sentence's structural features.
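The adjacency-matrix half of Step 1 can be sketched as follows. This is a minimal illustration, not the patent's exact implementation: the patent obtains the edges from the Stanford Parser's dependency tree, while here the edges and the example sentence are hypothetical and hard-coded.

```python
import numpy as np

def build_adjacency(num_tokens, dep_edges, self_loops=True):
    """Build a symmetric adjacency matrix from dependency-parse edges.

    dep_edges: list of (head, dependent) token-index pairs.
    """
    A = np.zeros((num_tokens, num_tokens), dtype=np.float32)
    for head, dep in dep_edges:
        A[head, dep] = 1.0   # dependency arc
        A[dep, head] = 1.0   # treat the tree as undirected
    if self_loops:
        A += np.eye(num_tokens, dtype=np.float32)  # keep each token's own features
    return A

# "He works at Google" -- toy parse: works->He, works->at, at->Google
edges = [(1, 0), (1, 2), (2, 3)]
A = build_adjacency(4, edges)
print(A)
```

Making the matrix symmetric and adding self-loops is a common choice for graph neural networks over dependency trees, since it lets information flow both along and against the arc direction.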

[0072] This example uses the TACRED dataset and the Semeval-2010-task8 dataset. TACRED includes 68,124 training examples, 22,631 validation examples, and 15,509 test examples, with 41 relation types plus a special type (no relation). The Semeval-2010-task8 dataset contains 8...



Abstract

The invention discloses a relation extraction method based on the combination of an attention mechanism and a graph long short-term memory neural network. The method comprises the following steps: extract the context information of the sentence with a BiLSTM, introducing entity position information and entity label information to enrich the word vector features; use the Stanford Parser to extract the sentence's dependency tree and generate an initial sentence structure matrix, then apply an attention mechanism to the initial structure matrix to obtain the weight information of the sentence's structure matrix; take the extracted sentence context information and sentence structure weight information as input to the relation extraction model, which combines the attention mechanism with a graph long short-term memory network, and finally obtain the entity triples. The method of the present invention is evaluated on the TACRED and Semeval-2010-task8 datasets, and its performance exceeds current mainstream deep learning extraction models.
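The attention step described above can be sketched as a masked attention over the dependency structure. This is an illustrative scaled dot-product formulation under assumed shapes, not the patent's exact attention calculation: token features stand in for BiLSTM outputs, and attention is restricted to the tree edges recorded in the initial structure matrix.

```python
import numpy as np

def attention_weighted_structure(H, A):
    """Masked scaled dot-product attention over dependency-tree neighbors.

    H: (n, d) token feature matrix (e.g., BiLSTM outputs).
    A: (n, n) initial adjacency matrix from the dependency tree.
    Returns an (n, n) weight matrix whose rows sum to 1 over
    each token's tree neighbors.
    """
    d = H.shape[1]
    scores = (H @ H.T) / np.sqrt(d)          # pairwise similarity scores
    scores = np.where(A > 0, scores, -1e9)   # attend only along tree edges
    scores -= scores.max(axis=1, keepdims=True)
    weights = np.exp(scores)
    return weights / weights.sum(axis=1, keepdims=True)  # row-wise softmax

# Chain-shaped dependency structure with self-loops, 4 tokens
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=np.float32)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8)).astype(np.float32)
W = attention_weighted_structure(H, A)
print(W.round(3))
```

The resulting weight matrix replaces the binary adjacency as input to the graph network, so that informative neighbors contribute more than incidental ones.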

Description

Technical Field

[0001] The invention relates to the technical field of relation extraction in natural language processing, and in particular to a relation extraction method based on the combination of an attention mechanism and a graph long short-term memory neural network.

Background Technique

[0002] With the advent of the era of artificial intelligence and big data, information on the Internet grows ever faster. How to efficiently and quickly extract useful information from unstructured text is a focus of scholarly research. Text information extraction includes entity extraction, relation extraction, event extraction, causal extraction, etc. Relation extraction is an important subtask of text information extraction: it refers to extracting triple information between entity pairs from unstructured text, namely <entity 1, entity 2, relationship>. As a common and important subtask in natural language processing, relation extraction ha...
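As an illustration of the triple format defined above, a triple can be represented as a simple typed record; the sentence and entities here are invented examples, not taken from the patent.

```python
from typing import NamedTuple

class RelationTriple(NamedTuple):
    entity1: str
    entity2: str
    relation: str

# From "Steve Jobs founded Apple", a relation extractor would yield:
triple = RelationTriple("Steve Jobs", "Apple", "founder_of")
print(triple)
```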

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F40/295; G06F40/284; G06F40/211; G06F16/31; G06F16/35; G06N3/04; G06N3/08
CPC: G06F40/295; G06F40/284; G06F40/211; G06F16/322; G06F16/355; G06N3/049; G06N3/08; G06N3/045
Inventor: Zhang Yong, Gao Dalin, Gong Dunwei, Guo Yinan, Sun Xiaoyan
Owner CHINA UNIV OF MINING & TECH