
Efficient lifelong relationship extraction method and system based on dynamic regularization

A relation extraction technology, applied to neural learning methods, instruments, biological neural network models, etc., which can solve problems such as expensive and time-consuming retraining and the inability to adapt trained models to new relations

Active Publication Date: 2020-10-23
SICHUAN UNIV

AI Technical Summary

Problems solved by technology

[0003] Existing research on lifelong learning is dedicated to overcoming the phenomenon of catastrophic forgetting in the lifelong learning (also known as continual learning) of neural network models. However, the naive heuristic of storing all previous training data together with the new data and training a completely new model is expensive and time-consuming.
[0004] Therefore, the goal of lifelong learning in relation extraction is to make the model perform well on a series of tasks while avoiding revisiting all previous data at each stage. However, most existing methods are designed for a fixed set of relations: they cannot adapt a trained model to newly added relations without catastrophically forgetting previously learned knowledge. Catastrophic forgetting refers to the significant drop in the model's performance on old tasks when it switches to a new task.
To alleviate the forgetting problem, it has been proposed to use regularization terms that prevent drastic changes in parameter values while still allowing good solutions to be found for new tasks, or to augment models with episodic memory modules. These approaches have obtained considerable performance gains on simple image classification datasets, but they turn out to perform poorly in natural language processing scenarios.
In fact, only limited literature discusses lifelong learning on natural language processing tasks such as relation extraction. To fill this gap, Wang, H. (2019) proposed a method to overcome the forgetting problem of relation extraction models: an explicit alignment model is introduced to alleviate the distortion of the sentence embedding space when the model learns new data, and it achieved the best performance. Although this method works effectively, it depends heavily on the alignment model, which introduces more parameters into an already over-parameterized relation extraction model and in turn increases the supervisory signals, memory, and computational resources required for training.
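
For orientation only, the regularization-based approach referenced above (a background technique, not the method of the invention) is usually written as a quadratic penalty on parameter changes; the notation below is assumed:

```latex
% EWC-style penalty from the prior-art approach described above (assumed notation):
% L_new       : loss on the new task,  theta : current parameters,
% theta*_old  : parameters learned on the old task,
% F_i         : importance weight of parameter i,  lambda : penalty strength.
\[
  \mathcal{L}(\theta) \;=\; \mathcal{L}_{\mathrm{new}}(\theta)
  \;+\; \frac{\lambda}{2} \sum_i F_i \left(\theta_i - \theta^{*}_{\mathrm{old},i}\right)^2
\]
```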



Examples


Embodiment 1

[0058] Referring to figure 1, which is a schematic structural diagram of the efficient lifelong relationship extraction system based on dynamic regularization of the present invention. Specifically, the efficient lifelong relationship extraction system based on dynamic regularization includes:

[0059] A training module 1, used to receive multiple data sets and to sequentially train the training samples in those data sets through a neural model, where each data set corresponds to one task; the training samples include entity-pair sentences, candidate relation sets, and true relation labels;

[0060] A regularization module 2, which establishes a memory block to store memory data for each trained data set and accesses the memory data of all memory blocks during training on a new data set; it also defines multiple loss functions and calculates the regularization factor of each loss function between different tasks;

[0061] In this embodiment, the regularization module 2 includes a...
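
As a minimal, illustrative sketch (not the invention's actual implementation) of how the training samples handled by training module 1 and the memory blocks maintained by regularization module 2 might be represented, assuming hypothetical names such as TrainingSample and MemoryBlocks:

```python
import random
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class TrainingSample:
    """One training sample of a task: an entity-pair sentence, its candidate
    relation set, and the true relation label (field names are assumptions)."""
    sentence: str                   # sentence containing the entity pair
    candidate_relations: List[str]  # candidate relation set
    true_relation: str              # real relation label

class MemoryBlocks:
    """Sketch of the memory-block idea: after a data set (task) is trained,
    a small subset of it is stored; while a new data set is trained, the
    memory data of all blocks can be accessed. Sizes and sampling are assumed."""

    def __init__(self, per_task_size: int = 50):
        self.per_task_size = per_task_size
        self.blocks: Dict[int, List[TrainingSample]] = {}

    def store(self, task_id: int, dataset: List[TrainingSample]) -> None:
        # Keep a random subset of the finished task as its memory block.
        k = min(self.per_task_size, len(dataset))
        self.blocks[task_id] = random.sample(dataset, k)

    def all_memory(self) -> List[TrainingSample]:
        # Accessed during training of a new data set (e.g. for a memory data loss).
        return [s for block in self.blocks.values() for s in block]
```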

Embodiment 2

[0080] Referring to figure 2, which is the process flow of the efficient lifelong relationship extraction method based on dynamic regularization of the present invention. Specifically, the efficient lifelong relationship extraction method based on dynamic regularization comprises the following steps:

[0081] S400: Receive multiple data sets and use the neural model to sequentially train the training samples in the data sets, where each data set corresponds to one task; then execute step S500;

[0082] In this example, the model learns from a series of data sets {D_1, D_2, ..., D_N}, where each data set corresponds to one task, and the data of task k consists of observation–label pairs. Ideally, if all task data were available at the same time, the model could use them simultaneously for joint training; however, under the standard lifelong learning setup the data sets arrive sequentially, so only one of them can be accessed at a time.
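
A schematic view of this sequential setting, using the MemoryBlocks sketch above and a hypothetical per-task training routine, might look as follows (all names are illustrative placeholders, not the patented components):

```python
def lifelong_training(model, datasets, memory, train_fn):
    """Sketch of the sequential setup in paragraph [0082]: the data sets
    D_1, ..., D_N arrive one at a time, so only the current data set (plus
    the stored memory data) is accessible at each training stage."""
    for task_id, dataset in enumerate(datasets):
        replay = memory.all_memory()       # memory data of all earlier tasks
        train_fn(model, dataset, replay)   # assumed training step on the current task
        memory.store(task_id, dataset)     # record this task's memory block
    return model
```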

[0083] The training samples of the kth task (i.e., the kth data set D_k) in this embodiment...

Embodiment 3

[0132] In this example, the effectiveness of the system of Embodiment 1 and the method of Embodiment 2 is verified through experiments. Specifically, the Lifelong FewRel dataset and the Lifelong SimpleQuestions dataset are used for evaluation. The Lifelong FewRel dataset consists of 10 tasks, obtained by dividing the FewRel dataset into 10 disjoint clusters; FewRel has 80 relations in total, so each cluster contains 8 relations, and each sample in a cluster includes a sentence of the target relation and a candidate set selected by random sampling. Lifelong SimpleQuestions is constructed similarly and consists of 20 tasks generated from the SimpleQuestions dataset.
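
A rough sketch of this task construction, assuming a hypothetical build_lifelong_tasks helper and that the relation split is done by shuffling with a fixed seed:

```python
import random

def build_lifelong_tasks(relations, num_tasks, seed=0):
    """Divide the relation set into `num_tasks` disjoint clusters, each
    defining one task (for FewRel: 80 relations / 10 tasks = 8 per cluster).
    The shuffling and seed are assumptions about the construction details."""
    rels = list(relations)
    random.Random(seed).shuffle(rels)
    per_task = len(rels) // num_tasks
    return [rels[i * per_task:(i + 1) * per_task] for i in range(num_tasks)]

# Illustrative usage: 80 FewRel relations -> 10 disjoint clusters of 8 relations.
clusters = build_lifelong_tasks([f"relation_{i}" for i in range(80)], num_tasks=10)
assert len(clusters) == 10 and all(len(c) == 8 for c in clusters)
```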

[0133] Preferably, two metrics, ACC_avg and ACC_whole, are used in this embodiment to evaluate the model. ACC_avg estimates the average test accuracy on the observed tasks; ACC_whole evaluates the overall performance of the model on both observed and unobserved tasks.
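
A minimal sketch of the two metrics, assuming a plain average over per-task test accuracies (the embodiment does not spell out the exact aggregation):

```python
def acc_avg(observed_task_accuracies):
    """ACC_avg: average test accuracy over the tasks observed so far."""
    return sum(observed_task_accuracies) / len(observed_task_accuracies)

def acc_whole(all_task_accuracies):
    """ACC_whole: overall accuracy over all tasks, observed and unobserved."""
    return sum(all_task_accuracies) / len(all_task_accuracies)

# Illustrative usage with made-up per-task accuracies:
print(acc_avg([0.82, 0.79, 0.75]))          # tasks seen so far
print(acc_whole([0.82, 0.79, 0.75, 0.41]))  # including a not-yet-observed task
```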

[0134] At the same time, the following...



Abstract

The invention provides an efficient lifelong relationship extraction method and system based on dynamic regularization. The method comprises the following steps: receiving a plurality of data sets and employing a neural model to sequentially train the training samples in the data sets, with each data set corresponding to one task; establishing memory blocks for each trained data set to store memory data, accessing the memory data of all the memory blocks during training of a new data set, and defining a memory data loss function, a feature loss function and an EWC loss function in order to overcome catastrophic forgetting; establishing a training loss difference model during the continuous training of tasks, and calculating the regularization factors of the feature loss function, the memory data loss function and the EWC loss function respectively; and obtaining optimal relationship extraction parameters according to the feature loss function and its regularization factor, the memory data loss function and its regularization factor, and the EWC loss function and its regularization factor. According to the method, the accuracy of lifelong relationship extraction is higher, and no extra parameters are introduced.
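
Read as a single objective, the combination described in the abstract might be sketched as below; the symbols and the presence of a base task loss are assumptions, and the concrete loss definitions and the loss-difference model that produces the regularization factors are given in the description:

```latex
% Sketch of the combined objective implied by the abstract (assumed notation):
% lambda_feat, lambda_mem, lambda_ewc are the dynamically computed
% regularization factors of the feature, memory data, and EWC losses.
\[
  \mathcal{L}(\theta) \;=\; \mathcal{L}_{\mathrm{task}}(\theta)
  \;+\; \lambda_{\mathrm{feat}}\,\mathcal{L}_{\mathrm{feat}}(\theta)
  \;+\; \lambda_{\mathrm{mem}}\,\mathcal{L}_{\mathrm{mem}}(\theta)
  \;+\; \lambda_{\mathrm{ewc}}\,\mathcal{L}_{\mathrm{ewc}}(\theta)
\]
```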

Description

Technical field

[0001] The invention belongs to the field of computer natural language processing, and in particular relates to an efficient lifelong relationship extraction method and system based on dynamic regularization.

Background technique

[0002] Relation extraction aims to identify the relational facts of paired entities in text, and it can be applied to many natural language processing fields, such as knowledge base construction and question answering systems. Compared with traditional methods that focus on manually designed features, today's CNN (convolutional neural network)-based or RNN (recurrent neural network)-based methods have achieved impressive progress in relation extraction, but most neural models assume a predetermined set of relations, an assumption that is not always applicable in practical relation extraction scenarios.

[0003] Existing research on lifelong learning is dedicated to overcoming the phenomenon of catastrop...


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06F40/279; G06N3/08
CPC: G06N3/08; G06F40/279
Inventor: 琚生根, 申航杰, 周刚
Owner: SICHUAN UNIV