
Relation extraction method based on combination of attention mechanism and graph long-short-term memory neural network

A long-short-term memory and relation extraction technology, applied to neural learning methods, biological neural network models, neural architectures, etc., which can solve problems such as difficulty in effectively processing time-series data, information loss, and error accumulation.

Active Publication Date: 2021-01-01
CHINA UNIV OF MINING & TECH

AI Technical Summary

Problems solved by technology

[0004] Dependency-based models build a relation extraction model by evolving a convolutional neural network or a long-short-term memory network into a graph-structured or tree-structured neural network. Among these, the graph convolutional neural network is the most widely used: it learns graph-structured data well, but it has difficulty processing time-series data effectively. This means that for text data with temporal characteristics, relying only on a graph convolutional network cannot fully extract the temporal information in the text. In addition, the traditional dependency model depends entirely on the syntactic dependency tree; if the tree is parsed incorrectly or useful information is deleted during parsing, errors accumulate and information is lost.




Embodiment Construction

[0069] The technical solutions of the present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0070] The relation extraction method based on the combination of the attention mechanism and the graph long-short-term memory neural network described in the present invention is shown in Figure 1. The specific steps of relation extraction are as follows:

[0071] Step 1. Obtain a relation extraction data set, preprocess the text data in the data set, and generate a word vector matrix for extracting the temporal context features of each sentence and an adjacency matrix for extracting the structural features of each sentence.
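As a concrete illustration of Step 1 only, the following is a minimal Python sketch. It assumes a toy four-word sentence, random vectors standing in for pretrained embeddings, and hand-written dependency edges in place of a real parser call; the names word_vectors, edges and adjacency are illustrative, not the patent's own.

# Minimal sketch of Step 1 (toy example): build a word-vector matrix for
# temporal context features and a dependency adjacency matrix for structural
# features. The parse edges are hand-written here; the patent obtains the
# dependency tree from a syntactic parser.
import numpy as np

sentence = ["Smith", "works", "for", "Acme"]  # invented toy sentence

# Word-vector matrix: one row per token. Random vectors stand in for
# pretrained embeddings; the method also expands each row with entity
# position and entity label features.
embed_dim = 8
rng = np.random.default_rng(0)
word_vectors = rng.normal(size=(len(sentence), embed_dim))

# Dependency edges as (head index, dependent index) pairs, hand-written
# here instead of being produced by a parser.
edges = [(1, 0), (1, 2), (2, 3)]

# Adjacency matrix describing the sentence structure: symmetric with
# self-loops, so each token is connected to itself and its neighbours.
n = len(sentence)
adjacency = np.eye(n)
for head, dep in edges:
    adjacency[head, dep] = 1.0
    adjacency[dep, head] = 1.0

print(word_vectors.shape)  # (4, 8)
print(adjacency)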

[0072] This embodiment uses the TACRED dataset and the SemEval-2010 Task 8 dataset. The TACRED dataset contains 68,124 training samples, 22,631 validation samples, and 15,509 test samples, covering 41 relation types plus a special relation type (no_relation). The Semeval-201...



Abstract

The invention discloses a relation extraction method based on the combination of an attention mechanism and a graph long-short-term memory neural network. The method comprises the following steps: extracting context information in sentences through a BiLSTM, with entity position information and entity label information introduced to expand the word vector features; extracting the sentence dependency structure tree with the Stanford Parser tool to generate an initial sentence structure matrix, and introducing an attention mechanism to perform attention calculation on the initial sentence structure matrix to obtain the weight information of the structure matrix in the sentence; and taking the extracted sentence context information and the weight information of the sentence structure as input, and performing relation extraction on this input with a relation extraction model based on the combination of an attention mechanism and a graph long-short-term memory neural network, finally obtaining the triple information of the entities. The method is evaluated on the TACRED data set and the SemEval-2010 Task 8 data set respectively, and the performance of the model is superior to that of existing mainstream deep learning extraction models.
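As an illustration only, the following simplified PyTorch sketch follows the pipeline described in the abstract: a BiLSTM encodes the temporal context of each token, an attention score re-weights the dependency adjacency matrix, and one graph propagation step mixes neighbour information before relation classification. The single matrix-multiply propagation is a stand-in for the patent's graph long-short-term memory cell, and the class name, layer sizes and toy input are assumptions made for the sketch.

# Simplified sketch of the abstract's pipeline; not the patent's exact model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionGraphEncoder(nn.Module):
    def __init__(self, embed_dim=32, hidden_dim=32, num_relations=5):
        super().__init__()
        # BiLSTM extracts the temporal context of each token.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.graph_proj = nn.Linear(2 * hidden_dim, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_relations)

    def forward(self, word_vectors, adjacency):
        # word_vectors: (batch, seq_len, embed_dim); adjacency: (batch, seq_len, seq_len)
        context, _ = self.bilstm(word_vectors)                # temporal context
        # Attention over the sentence structure: score token pairs, then keep
        # only pairs connected in the dependency adjacency matrix.
        scores = torch.bmm(self.attn(context), context.transpose(1, 2))
        scores = scores.masked_fill(adjacency == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)                   # weighted structure matrix
        # One propagation step stands in for the graph LSTM cell.
        node_states = torch.relu(self.graph_proj(torch.bmm(weights, context)))
        sentence_repr = node_states.max(dim=1).values         # pool over tokens
        return self.classifier(sentence_repr)                 # relation logits

# Toy usage with random inputs and a self-loop-only adjacency matrix.
x = torch.randn(2, 6, 32)
adj = torch.eye(6).unsqueeze(0).repeat(2, 1, 1)
print(AttentionGraphEncoder()(x, adj).shape)  # torch.Size([2, 5])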

Description

Technical Field

[0001] The invention relates to the technical field of relation extraction in natural language processing, and in particular to a relation extraction method based on the combination of an attention mechanism and a graph long-short-term memory neural network.

Background

[0002] With the advent of the era of artificial intelligence and big data, information on the Internet is growing ever faster. How to extract useful information from unstructured text efficiently and quickly is the focus of scholars' research. Text information extraction includes entity extraction, relation extraction, event extraction, causal extraction, etc. As an important subtask of text information extraction, relation extraction refers to extracting triple information between entity pairs from unstructured text, namely <entity 1, entity 2, relation>. As a common and important subtask in natural language processing, relation extraction has been successfully us...
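For illustration of the <entity 1, entity 2, relation> format mentioned above, a minimal sketch follows; the sentence, entity names, and relation label are invented examples.

# Illustrative only: a relation triple <entity 1, entity 2, relation>
# represented as a small data structure. The sentence and labels are invented.
from dataclasses import dataclass

@dataclass
class RelationTriple:
    entity1: str
    entity2: str
    relation: str

# "Smith works for Acme"  ->  <Smith, Acme, employed_by>
triple = RelationTriple(entity1="Smith", entity2="Acme", relation="employed_by")
print(triple)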


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F40/295, G06F40/284, G06F40/211, G06F16/31, G06F16/35, G06N3/04, G06N3/08
CPC: G06F40/295, G06F40/284, G06F40/211, G06F16/322, G06F16/355, G06N3/049, G06N3/08, G06N3/045
Inventor: 张勇, 高大林, 巩敦卫, 郭一楠, 孙晓燕
Owner: CHINA UNIV OF MINING & TECH