
Multi-task chapter-level event extraction method based on multi-headed self-attention mechanism

A technology combining event extraction and attention mechanisms, applied to neural learning methods, natural language data processing, biological neural network models, and related fields. It addresses the problems of failing to fully consider contextual relationships within a text, ignoring the relationships between different sentences, and being unable to capture features across clauses, and achieves strong recognition and extraction performance.

Active Publication Date: 2021-12-07
HARBIN INST OF TECH AT WEIHAI +1
Cites: 10 · Cited by: 16

AI Technical Summary

Problems solved by technology

[0004] The present invention addresses the limitations of existing event extraction technologies, most of which remain at the stage of single-sentence event extraction: they cannot capture fine-grained features across clauses, they do not fully consider contextual relationships within a chapter, and pre-trained-model-based event extraction applies only to sentences and paragraphs. It provides a multi-task chapter-level event extraction method based on the multi-head self-attention mechanism, which overcomes the tendency of existing sentence-level approaches to extract only a single event per document while ignoring multiple trigger words within a sentence, the relationships between different sentences, and the fact that event elements and arguments may be distributed across different sentences. The method achieves the breakthrough of transforming a sequence labeling problem into a machine reading comprehension problem.
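To make that reformulation concrete, here is a minimal, hypothetical Python sketch of recasting argument extraction from sequence labeling into machine reading comprehension; the query templates, role names, and helper function are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the reformulation: instead of tagging each token
# (sequence labeling), the predicted event type is turned into a natural-
# language query and the argument is recovered as an answer span by a reader
# model. Templates, roles, and the helper below are illustrative assumptions.

QUERY_TEMPLATES = {
    ("Attack", "attacker"): "Who carried out the attack described in the text?",
    ("Attack", "target"): "Who or what was attacked?",
}

def build_mrc_input(event_type: str, role: str, document: str) -> str:
    """Link the event type (prior information) to the input sequence in the
    [CLS] query [SEP] document [SEP] form that a BERT-style reader consumes."""
    query = QUERY_TEMPLATES[(event_type, role)]
    return f"[CLS] {query} [SEP] {document} [SEP]"
```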




Embodiment Construction

[0051] It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0052] Those skilled in the art will understand that some well-known structures, and descriptions thereof, may be omitted from the drawings.

[0053] To better illustrate this embodiment, the technical solution of the present invention is described clearly and completely below in conjunction with the accompanying drawings.

[0054] As shown in Figures 1 and 2, this embodiment provides a multi-task chapter-level event extraction method based on a pre-trained language model, which includes the following steps:

[0055] Step 101: following an expert-designed schema, general-domain event types are divided into 5 categories (Action, Change, Possession, Scenario, Sentiment) and 168 subcategories (such as Attack, Bringing, Cost, Departing....
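As an illustration only, the taxonomy of step 101 could be encoded as a simple mapping. The assignment of each named subcategory to a category is an assumption here, and the remaining subcategories are elided exactly as in the source text.

```python
# Illustrative encoding of the expert-designed taxonomy from step 101.
# Category membership of each named subcategory is an assumption; the
# remaining subcategories are elided exactly as in the source text.

EVENT_CATEGORIES = ["Action", "Change", "Possession", "Scenario", "Sentiment"]

SUBCATEGORY_TO_CATEGORY = {
    "Attack": "Action",       # assumed placement
    "Bringing": "Action",     # assumed placement
    "Cost": "Possession",     # assumed placement
    "Departing": "Action",    # assumed placement
    # ... the remaining of the 168 subcategories are elided in the source
}
```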



Abstract

The invention provides a multi-task chapter-level event extraction method based on a multi-head self-attention mechanism. The method comprises the following steps: converting single-sentence-level event extraction into chapter-level event extraction over a packaged sentence set; obtaining word embedding representations with the pre-trained language model BERT; taking all word embeddings and position embeddings in a single sentence as input, encoding them with a convolutional neural network, and capturing the most valuable features in the sentence in combination with a segmented max-pooling strategy; using a multi-head self-attention model to obtain a chapter representation and attention weights fused with full-text semantic information; using a classifier to obtain the predicted event type; and, taking event types as prior information, linking them to the input sequence for event element extraction, extracting all related elements in the sequence with a pre-trained model combined with a machine reading comprehension method. The method can be used for chapter-level event extraction tasks and achieves the breakthrough of converting a sequence labeling problem into a machine reading comprehension problem.
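The pipeline described in the abstract can be illustrated with a minimal PyTorch sketch. All hyper-parameters (embedding size, filter count, kernel width, segment count, number of attention heads) and class names are assumptions for illustration; the patent summary does not fix them.

```python
# Minimal PyTorch sketch of the described pipeline (illustrative assumptions
# throughout: hidden sizes, kernel width, segment count, head count).
import torch
import torch.nn as nn

class SentenceCNNEncoder(nn.Module):
    """Encode one sentence with a CNN over its BERT token embeddings, then
    apply segmented (piecewise) max pooling to keep the most salient feature
    from each segment of the sentence."""
    def __init__(self, emb_dim=768, n_filters=256, kernel=3, n_segments=3):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel, padding=kernel // 2)
        self.n_segments = n_segments

    def forward(self, token_emb):
        # token_emb: (batch, seq_len, emb_dim); assumed to already carry
        # position information, as BERT output embeddings do.
        h = torch.relu(self.conv(token_emb.transpose(1, 2)))  # (B, F, L)
        segments = torch.chunk(h, self.n_segments, dim=2)
        pooled = [seg.max(dim=2).values for seg in segments]
        return torch.cat(pooled, dim=1)  # (B, F * n_segments) = (B, 768) here

class ChapterEventTypeClassifier(nn.Module):
    """Fuse the per-sentence vectors of a whole chapter with multi-head
    self-attention, then predict the event type from the fused representation."""
    def __init__(self, sent_dim=768, n_heads=8, n_event_types=168):
        super().__init__()
        self.attn = nn.MultiheadAttention(sent_dim, n_heads, batch_first=True)
        self.classifier = nn.Linear(sent_dim, n_event_types)

    def forward(self, sent_vecs):
        # sent_vecs: (batch, n_sentences, sent_dim)
        fused, attn_weights = self.attn(sent_vecs, sent_vecs, sent_vecs)
        chapter_repr = fused.mean(dim=1)  # chapter representation
        return self.classifier(chapter_repr), attn_weights
```

In this sketch the per-sentence vectors would come from applying SentenceCNNEncoder to each sentence and stacking the results; the predicted event type would then serve as the prior information linked to the input sequence for the MRC-style element extraction sketched under paragraph [0004] above.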

Description

technical field

[0001] The invention relates to the technical field of natural language processing, and in particular to a multi-task chapter-level event extraction method based on a multi-head self-attention mechanism.

Background technique

[0002] In today's era, data and information are growing exponentially. Driven by the development of Internet technology, vast amounts of data are generated every moment: news, entertainment, advertising, and technology data are all growing rapidly. We have fully entered the era of big data. Information on this scale takes many forms, is intricate, and is difficult to mine, process, use, and analyze. To extract more valuable information from news data, the key is to extract the entities, relationships, and events contained in news text, and to analyze and predict the relationship betwee...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F40/30, G06F16/35, G06F40/117, G06N3/04, G06N3/08
CPC: G06F40/30, G06F40/117, G06F16/35, G06N3/08, G06N3/047, G06N3/045, Y02D10/00
Inventor: 丁建睿, 吴明瑞, 丁卓, 张立斌
Owner: HARBIN INST OF TECH AT WEIHAI