
Remote supervision relationship extraction method with entity perception based on PCNN model

A relation extraction and remote supervision technology, applied in neural learning methods, biological neural network models, instruments, etc. It addresses problems such as important semantic information being ignored and the different contributions of the three PCNN segments not being further explored.

Active Publication Date: 2020-10-30
Applicant: 海乂知信息科技(南京)有限公司
Cites: 3 · Cited by: 11

AI Technical Summary

Problems solved by technology

However, the above methods still have deficiencies that need to be addressed. For example, existing methods do not consider the influence of entity pairs and sentence context on word encoding, which may cause important semantic information to be ignored; moreover, the different contributions of the three PCNN segments to relation classification have not been further explored.



Embodiment Construction

[0073] The remote supervised relation extraction task can be briefly described as follows: given a bag B = {s1, s2, ..., sm} in which every sentence contains the same entity pair (head entity ef and tail entity et), the goal of relation extraction is to predict the relation y between the two entities. Based on this definition, the present invention adopts a novel entity-aware gated piecewise convolutional neural network, EA-GPCNN, for remote supervision relation extraction, as shown in Figure 1.
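The bag-level setup above can be pictured with a minimal data structure. This is an illustrative sketch only; the `SentenceBag` class and the example sentences are hypothetical and not part of the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SentenceBag:
    """A bag B = {s1, ..., sm}: sentences sharing one entity pair."""
    head_entity: str        # head entity ef
    tail_entity: str        # tail entity et
    sentences: List[str]    # each sentence mentions both entities

bag = SentenceBag(
    head_entity="Trump",
    tail_entity="United States",
    sentences=[
        "Trump is the 45th President of the United States",
        "Trump returned to the United States on Friday",
    ],
)
# relation extraction predicts a single relation label y for the whole bag
```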

[0074] Specifically, it can be summarized as follows:

[0075] S1. For a sentence in a given sentence bag, the input layer uses Google's pre-trained word2vec word vector to map each word in the sentence to a low-dimensional word embedding vector to obtain an input sequence;
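Step S1 is a plain embedding lookup. The sketch below stands in for Google's pre-trained word2vec table with a tiny random toy vocabulary (4-dimensional vectors instead of the usual 300); the vocabulary and dimensions are assumptions for illustration only.

```python
import numpy as np

# Toy stand-in for a pre-trained word2vec table (hypothetical
# 4-dim vectors; the real pre-trained model uses 300 dims).
rng = np.random.default_rng(0)
vocab = ["trump", "is", "the", "45th", "president", "of", "united", "states"]
word2vec = {w: rng.standard_normal(4) for w in vocab}
UNK = np.zeros(4)  # out-of-vocabulary words map to a zero vector

def encode_sentence(sentence):
    """Map each word of the sentence to its low-dimensional embedding."""
    return np.stack([word2vec.get(w.lower(), UNK) for w in sentence.split()])

seq = encode_sentence("Trump is the 45th President of the United States")
# seq has shape (num_words, embedding_dim)
```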

[0076] S2. The entity-aware enhanced word representation layer uses a multi-head self-attention mechanism to fuse the word embeddings with the head and tail entity embeddings and the relative position embeddings, generating entity-aware enhanced word representations;
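The fusion idea in S2 can be sketched in numpy. This is a simplified single-head version under assumed shapes: the patent's layer is multi-head and also incorporates relative position embeddings, both of which are omitted here; `entity_aware_fusion` is a hypothetical name.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def entity_aware_fusion(words, head_vec, tail_vec):
    """Concatenate each word embedding with the head/tail entity
    embeddings, then run one scaled dot-product self-attention pass
    over the sequence (single-head sketch of the multi-head layer)."""
    n, d = words.shape
    fused = np.concatenate(
        [words,
         np.tile(head_vec, (n, 1)),       # broadcast head entity ef
         np.tile(tail_vec, (n, 1))],      # broadcast tail entity et
        axis=1)                           # (n, 3d)
    scores = fused @ fused.T / np.sqrt(fused.shape[1])  # (n, n)
    return softmax(scores, axis=-1) @ fused             # (n, 3d)

rng = np.random.default_rng(1)
W = rng.standard_normal((5, 4))           # 5 words, embedding dim 4
out = entity_aware_fusion(W, W[0], W[3])  # words 0 and 3 as the entity pair
```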



Abstract

The invention provides a remote supervision relation extraction method with entity perception based on a PCNN model. The method comprises the following steps: combining word embeddings with head-entity and tail-entity embeddings and relative position embeddings through a multi-head self-attention mechanism to generate entity-aware enhanced word representations that capture the semantic dependence between each word and the entity pair; introducing a global gate that combines each entity-aware enhanced word representation in the input sentence with the average of all the enhanced word representations to form the final word representation fed into the PCNN; and, in order to identify the key sentence segment in which the most important relation-classification information appears, introducing another gate mechanism that assigns a different weight to each sentence segment, thereby highlighting the effect of key segments in the PCNN. Experiments show that the proposed remote supervision relation extraction method improves the prediction of remotely supervised relations in sentences.
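The segment gate described in the abstract can be sketched as a gated variant of piecewise max pooling: the convolution output is cut into three segments at the two entity positions, each segment is max-pooled, and a gate weights the segments before concatenation. The gating form below (a sigmoid over the pooled vectors with a hypothetical learned parameter `gate_w`) is an illustrative assumption, not the patent's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def gated_piecewise_pool(features, head_pos, tail_pos, gate_w):
    """Piecewise max pooling with a segment gate.
    features: (seq_len, n_filters) convolution output. The sentence
    is split into three segments at the entity positions, each segment
    is max-pooled, and a gate assigns each segment its own weight."""
    segs = [features[:head_pos + 1],            # before/including head
            features[head_pos + 1:tail_pos + 1],  # between the entities
            features[tail_pos + 1:]]            # after the tail
    pooled = np.stack([s.max(axis=0) for s in segs])  # (3, n_filters)
    gates = sigmoid(pooled @ gate_w)                  # (3, 1) segment weights
    return (gates * pooled).reshape(-1)               # (3 * n_filters,)

rng = np.random.default_rng(2)
F = rng.standard_normal((10, 6))                      # 10 tokens, 6 filters
vec = gated_piecewise_pool(F, head_pos=2, tail_pos=6,
                           gate_w=rng.standard_normal((6, 1)))
```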

Description

technical field

[0001] The invention relates to relation extraction in natural language processing and information processing, and in particular to a PCNN-model-based remote supervision relation extraction method with entity perception, which can be widely used in the automatic generation of knowledge graphs in various fields.

Background technique

[0002] Relation extraction is one of the key technologies of information extraction. It aims to identify the semantic relationship between an entity pair in a given sentence. The extracted semantic relations can be applied to downstream tasks such as automatic knowledge-base completion and question answering systems. For example, in the given sentence "[Trump] e1 is the 45th [President] e2 of the United States", relation extraction determines the "President of" relationship between "Trump" e1 and "United States" e2 and expresses it as the triple t = (Trump, President of, United States).

[0003] Traditional supervised relat...
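The triple representation from paragraph [0002] is simply an ordered (head, relation, tail) structure, as in this minimal sketch:

```python
# The extracted relation as a (head, relation, tail) triple,
# following the example sentence in the text.
triple = ("Trump", "President of", "United States")
head, relation, tail = triple
```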

Claims


Application Information

IPC (8): G06F40/211; G06F40/295; G06F40/30; G06N3/04; G06N3/08
CPC: G06F40/211; G06F40/295; G06F40/30; G06N3/08; G06N3/048; G06N3/045
Inventors: 朱新华, 温海旭, 张兰芳
Owner 海乂知信息科技(南京)有限公司