Dependency syntax pre-training model-based chapter-level relation extraction method and system

A technology combining dependency syntax with relation extraction, applied to neural learning methods, biological neural network models, instruments, etc. It addresses the problem that the attention mechanism does not consider the position information of two tokens or other related information, with the effect of enhancing attention and improving accuracy.

Pending Publication Date: 2022-04-29
DALIAN MARITIME UNIVERSITY
0 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

[0006] (3) Methods based on pre-training models. Although Transformer-based pre-trained models can implicitly capture long-distance dependencies, when modeling the dependency between any two tokens, the attention mechanism does not take into account the positions of the two tokens in the input sequence or other related information. This is one of the reasons why pre-trained models still have significant limitations in processing long texts.
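One common way to supply such pairwise position information, sketched below purely as an illustration (the function names, shapes, and the fixed bias table are assumptions, not the patent's design), is to add a relative-position bias to the attention scores before the softmax:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_relative_bias(Q, K, V, max_dist=4):
    # Q, K, V: (seq_len, d) token representations.
    n, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)            # standard scaled dot-product scores
    # Bias indexed by clipped relative distance (i - j). In a real model this
    # table is learned; here a fixed toy table stands in for learned weights.
    rel_bias = np.linspace(0.5, -0.5, 2 * max_dist + 1)
    idx = np.clip(np.arange(n)[:, None] - np.arange(n)[None, :],
                  -max_dist, max_dist) + max_dist
    scores = scores + rel_bias[idx]          # inject pairwise position info
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 8))
out = attention_with_relative_bias(X, X, X)
print(out.shape)  # (6, 8)
```

Relative-position biases of this general shape appear in models such as Transformer-XL and T5; the sketch only shows the mechanism, not the patent's specific solution.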




Embodiment Construction

[0044] To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on these embodiments, all other embodiments obtained by persons of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0045] It should be noted that the terms "first" and "second" in the description, claims, and drawings of the present invention are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence. It is to be understood that the data so used are interchangeable under appropriate ...



Abstract

The invention discloses a chapter-level (document-level) relation extraction method and system based on a dependency-syntax pre-training model, in the technical field of natural language processing. Dependency syntax information is introduced into the pre-training model through direct transformation, biaffine transformation, and decomposed linear transformation; the characteristics of the different injection modes are compared, and chapter-level relations are extracted. The effect of the dependency syntax information on the pre-training model is analyzed, and the auxiliary effect of the graph structure on the pre-training model is explored. Entity feature representations are concatenated with inter-entity distance features, enriching the entity information and facilitating the subsequent chapter-level relation extraction task. The method addresses the difficulty existing pre-trained models have with long-distance text and weak dependencies in document-level relation extraction, improves the long-text processing capability and accuracy of the pre-trained model on this task, and explores the auxiliary effect of the graph structure on the pre-trained model.
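The abstract does not spell out the biaffine transformation it uses; as an illustration only (all names, shapes, and parameters below are assumptions, not the patent's implementation), a biaffine score between two token representations, in the style commonly used for dependency parsing, can be sketched as:

```python
import numpy as np

def biaffine_score(h_head, h_dep, U, W, b):
    """Biaffine transformation between two token vectors:
         s = h_head^T U h_dep + W [h_head; h_dep] + b
    The bilinear term models the pairwise (head, dependent) interaction;
    the linear term scores each token on its own."""
    bilinear = h_head @ U @ h_dep
    linear = W @ np.concatenate([h_head, h_dep])
    return bilinear + linear + b

rng = np.random.default_rng(1)
d = 4
U = rng.standard_normal((d, d))       # pairwise interaction weights
W = rng.standard_normal(2 * d)        # per-token linear weights
h_head = rng.standard_normal(d)
h_dep = rng.standard_normal(d)
score = biaffine_score(h_head, h_dep, U, W, 0.0)
print(float(score))
```

Scores of this form, computed over token pairs linked in the dependency tree, could then be added to the model's attention logits, analogous to the position-bias idea above; the patent's actual injection scheme may differ.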

Description

technical field

[0001] The invention relates to the technical field of natural language processing, and in particular to a method and system for document-level relation extraction based on dependency syntax and pre-training models.

Background technique

[0002] Relation extraction is the task of extracting unknown relational facts from plain text, and is a very important step in text mining. It has many critical real-world applications, such as question answering and text analysis. Most relation extraction work focuses only on relational facts within a single sentence, yet many relational facts are not confined to a single sentence but are distributed across multiple sentences. The task of relation extraction has therefore gradually developed to the document level.

[0003] At present, mainstream chapter-level relation extraction methods can be roughly divided into three categories:

[0004] (1) Methods based on designing feature extractors at different levels. This...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06F40/211; G06F40/154; G06F40/295; G06N3/04; G06N3/08
CPC: G06F40/211; G06F40/154; G06F40/295; G06N3/04; G06N3/08; G06N3/048
Inventor: 张益嘉杨名楚子漪
Owner: DALIAN MARITIME UNIVERSITY