
Bert-based multi-layer attention mechanism relationship extraction method

A relationship extraction and attention technology, applied in neural learning methods, computer components, biological neural network models, etc. It addresses problems such as existing models' inability to capture the polysemy of words in text, and achieves the effect of improving accuracy.

Pending Publication Date: 2022-01-28
SHENZHEN QIANHAI HUANRONG LIANYI INFORMATION TECH SERVICES CO LTD
Cites: 0 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0003] The supervised relationship extraction method is currently considered to give the best relationship extraction results. It treats relationship extraction as a classification problem: effective features are designed from the training data, a classification model is learned from those features, and the trained model is then used to predict relationships. However, most existing supervised relationship extraction models are built on the word2vec pre-training model, which cannot capture the polysemy of words in text, and most current relationship extraction models cannot make full use of both the local features and the global features of the text information.
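As a small illustration of the polysemy point (not part of the patent), the Python sketch below uses the Hugging Face transformers library, an assumed implementation choice: a static word2vec table stores one vector per word form, whereas BERT returns a different vector for the same word in different contexts. The checkpoint name and the word_vector helper are hypothetical.

import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint
model = BertModel.from_pretrained("bert-base-uncased")

def word_vector(sentence, word):
    # Return the contextual BERT vector of `word` inside `sentence`.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, hidden)
    idx = inputs.input_ids[0].tolist().index(tokenizer.convert_tokens_to_ids(word))
    return hidden[idx]

v1 = word_vector("he sat by the river bank", "bank")
v2 = word_vector("she deposited cash at the bank", "bank")
# The two "bank" vectors differ, which a single static word2vec entry cannot express.
print(torch.cosine_similarity(v1, v2, dim=0))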



Examples


Embodiment Construction

[0027] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it.

[0028] Referring to Figure 1, the present invention provides a BERT-based multi-layer attention mechanism relationship extraction method, comprising the following steps:

[0029] Step S1: Obtain sample data;

[0030] Step S2: Divide the sample data;

[0031] Step S3: Replace the entities in the sample data with #;

[0032] Step S4: Obtain the entity and connect the entity and the sentence with $ to form a training sample;

[0033] Step S5: Input the training samples into the BERT language model and the fully connected layer, perform vectorization processing, and obtain the vectorized representation of the fused features;
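As an illustration of steps S3 to S5, the following Python sketch masks the entities with #, connects the entities and the masked sentence with $, and feeds the result through BERT and a fully connected layer to obtain the vectorized representation. PyTorch and the Hugging Face transformers library, the checkpoint name, the example sentence and the build_training_sample helper are assumptions of this note, not details stated in the patent.

import torch
import torch.nn as nn
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")     # assumed checkpoint
bert = BertModel.from_pretrained("bert-base-chinese")
fc = nn.Linear(bert.config.hidden_size, bert.config.hidden_size)   # fully connected layer (step S5)

def build_training_sample(sentence, head, tail):
    # Step S3: replace the entities in the sentence with '#'
    masked = sentence.replace(head, "#").replace(tail, "#")
    # Step S4: connect the entities and the masked sentence with '$'
    return head + "$" + tail + "$" + masked

sample = build_training_sample("乔布斯创立了苹果公司", "乔布斯", "苹果公司")
inputs = tokenizer(sample, return_tensors="pt")

# Step S5: BERT + fully connected layer -> vectorized representation of the sample
with torch.no_grad():
    token_vectors = fc(bert(**inputs).last_hidden_state)   # (1, seq_len, hidden)
print(token_vectors.shape)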



Abstract

The invention relates to an entity relationship extraction method, and in particular to a BERT-based multi-layer attention mechanism relationship extraction method. The method comprises the following steps: S1, acquiring sample data; S2, dividing the sample data; S3, replacing entities in the sample data with #; S4, obtaining an entity and connecting the entity with the sentence through $ to form a training sample; S5, inputting the training sample into a BERT language model and a fully connected layer, and carrying out vectorization processing to obtain a fused-feature vectorized representation; S6, adding word attention mechanisms to the training samples in sequence, weighting the local features of the training samples; S7, adding a sentence-level attention mechanism, weighting the global features; and S8, passing the sentence-level feature vector through the fully connected layer and inputting it into a classifier to obtain the classification result.
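Steps S6 to S8 can be read as a word-level attention that re-weights token (local) features, followed by a sentence-level attention that re-weights sentence (global) features, and a fully connected layer feeding a classifier. The PyTorch module below is a minimal sketch under that reading; the layer sizes, the simple linear attention scoring, and treating the sentence-level attention as operating over a bag of sentences for one entity pair are assumptions, not details fixed by the patent text.

import torch
import torch.nn as nn

class MultiLayerAttentionHead(nn.Module):
    def __init__(self, hidden_size, num_relations):
        super().__init__()
        self.word_query = nn.Linear(hidden_size, 1)        # S6: word-level attention scores
        self.sent_query = nn.Linear(hidden_size, 1)        # S7: sentence-level attention scores
        self.fc = nn.Linear(hidden_size, hidden_size)      # S8: fully connected layer
        self.classifier = nn.Linear(hidden_size, num_relations)

    def forward(self, token_vecs):
        # token_vecs: (bag_size, seq_len, hidden) -- all sentences for one entity pair
        word_w = torch.softmax(self.word_query(token_vecs), dim=1)   # weight local features (S6)
        sent_vecs = (word_w * token_vecs).sum(dim=1)                  # (bag_size, hidden)
        sent_w = torch.softmax(self.sent_query(sent_vecs), dim=0)     # weight global features (S7)
        bag_vec = (sent_w * sent_vecs).sum(dim=0)                     # sentence-level feature vector
        return self.classifier(torch.relu(self.fc(bag_vec)))          # relation logits (S8)

head = MultiLayerAttentionHead(hidden_size=768, num_relations=10)
logits = head(torch.randn(4, 32, 768))   # 4 sentences of 32 tokens, hypothetical shapes
print(logits.shape)                      # torch.Size([10])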

Description

【Technical field】
[0001] The invention relates to an entity relationship extraction method, in particular to a BERT-based multi-layer attention mechanism relationship extraction method.

【Background technique】
[0002] In natural language processing technology, the construction of knowledge graphs plays a very important role in the development of artificial intelligence. In the construction of a knowledge graph, knowledge extraction is the core foundation. Knowledge extraction mainly includes three sub-tasks: entity extraction, relationship extraction and event extraction, and entity relationship extraction is the core task and an important link of knowledge extraction. The main goal of entity relationship extraction is to identify and determine the specific relationship between entity pairs in natural language text, which provides basic support for intelligent retrieval, semantic analysis, etc., and helps to improve search efficiency.

[0003] The supervised relationship extr...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/332, G06F16/35, G06F16/36, G06F40/279, G06K9/62, G06N3/04, G06N3/08
CPC: G06F16/3329, G06F16/35, G06F16/367, G06F40/279, G06N3/08, G06N3/045, G06F18/214
Inventor: 王伟, 陈加杰, 孙思明
Owner SHENZHEN QIANHAI HUANRONG LIANYI INFORMATION TECH SERVICES CO LTD