
Implicit discourse relation recognition method based on a multi-head bidirectional attention interactive Transformer

A relation-recognition and attention technology, applied to neural learning methods, biological neural network models, semantic analysis, etc.; it addresses the problem that implicit discourse relations are insufficiently recognized, and achieves the effects of enhancing recognition capability and expanding the scope of use.

Inactive Publication Date: 2021-05-25
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

Although a Transformer with a self-attention mechanism can effectively capture the semantic information of a single text, it is insufficient for tasks that depend on the interaction between two texts (such as text matching, natural language inference, and implicit discourse relation recognition).
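This gap is what the bidirectional attention in the patent targets: each argument attends over the other, rather than only over itself. Below is a minimal PyTorch sketch of multi-head bidirectional (cross) attention between two discourse arguments; the module and parameter names (MultiHeadBiAttention, d_model, n_heads) are illustrative assumptions, not the patent's implementation.

```python
# A minimal sketch of multi-head bidirectional attention between two
# discourse arguments; names and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class MultiHeadBiAttention(nn.Module):
    def __init__(self, d_model: int = 256, n_heads: int = 8):
        super().__init__()
        # nn.MultiheadAttention computes attention(query, key, value);
        # using Arg1 as the query over Arg2 (and vice versa) gives
        # cross-attention in both directions.
        self.attn_1to2 = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.attn_2to1 = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, arg1: torch.Tensor, arg2: torch.Tensor):
        # arg1: (batch, len1, d_model), arg2: (batch, len2, d_model).
        # Each Arg1 token attends over all Arg2 tokens and vice versa,
        # so the outputs carry the arguments' interaction information.
        arg1_aware, _ = self.attn_1to2(arg1, arg2, arg2)
        arg2_aware, _ = self.attn_2to1(arg2, arg1, arg1)
        return arg1_aware, arg2_aware

# Usage with random token representations
bi_attn = MultiHeadBiAttention()
a1 = torch.randn(4, 30, 256)  # Arg1 representations
a2 = torch.randn(4, 40, 256)  # Arg2 representations
a1_aware, a2_aware = bi_attn(a1, a2)
```

The multiple heads project the argument representations into different subspaces, which is how interaction information can be mined from several aspects at once.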



Examples


Detailed Description of the Embodiments

[0047] The present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here are intended only to explain the invention, not to limit it.

[0048] The implementation of the present invention is described by taking the Penn Discourse TreeBank (PDTB) dataset as an example; the overall framework of the method is shown in Figure 1. The algorithm flow of the whole system comprises the following steps: dataset preprocessing, obtaining the embedded vector representations of the discourse arguments, capturing the context information of the discourse arguments, capturing the interaction information between the discourse arguments, and predicting the discourse relation (a minimal skeleton of this flow is sketched below).
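As a reading aid, here is a minimal end-to-end skeleton of the five steps just listed, reusing the MultiHeadBiAttention sketch given earlier. Every name and layer choice (ImplicitRelationClassifier, a BiLSTM standing in for the context encoder, mean pooling) is a placeholder assumption, not the patent's exact architecture.

```python
# Minimal skeleton of the five-step flow; all names and layer choices
# are placeholders, and MultiHeadBiAttention is the sketch given above.
import torch
import torch.nn as nn

class ImplicitRelationClassifier(nn.Module):
    def __init__(self, vocab_size: int, d_model: int = 256, n_classes: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)       # step 2: embedded vectors
        self.context = nn.LSTM(d_model, d_model // 2,        # step 3: context info
                               batch_first=True, bidirectional=True)
        self.interact = MultiHeadBiAttention(d_model)        # step 4: interaction info
        self.classify = nn.Linear(2 * d_model, n_classes)    # step 5: relation prediction

    def forward(self, arg1_ids: torch.Tensor, arg2_ids: torch.Tensor) -> torch.Tensor:
        h1, _ = self.context(self.embed(arg1_ids))   # (batch, len1, d_model)
        h2, _ = self.context(self.embed(arg2_ids))   # (batch, len2, d_model)
        i1, i2 = self.interact(h1, h2)               # interaction-aware sequences
        # Mean-pool each sequence and concatenate into one feature vector.
        feat = torch.cat([i1.mean(dim=1), i2.mean(dim=1)], dim=-1)
        return self.classify(feat)                   # logits over relation classes
```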

[0049] Specific steps are as follows:

[0050] (1) Dataset preprocessing

[0051] The Penn Discourse Treebank (PDTB) is a large-scale corpus annotated on 2,312 Wall Street Journal articles...
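For context on step (1): PDTB annotates hierarchical senses whose four top-level classes are Temporal, Contingency, Comparison, and Expansion, and a common preprocessing step (not necessarily the exact one used here) collapses each annotated sense to its top level, as in this small sketch; the sample sense strings are illustrative.

```python
# Hedged preprocessing sketch: collapse a full PDTB sense to its
# top-level class. The four top-level classes are standard in PDTB 2.0.
TOP_LEVEL = ("Temporal", "Contingency", "Comparison", "Expansion")

def top_level_sense(sense: str) -> str:
    """Map a full PDTB sense like 'Contingency.Cause.Result' to its top level."""
    head = sense.split(".")[0]
    if head not in TOP_LEVEL:
        raise ValueError(f"unexpected PDTB sense: {sense}")
    return head

assert top_level_sense("Comparison.Contrast") == "Comparison"
```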



Abstract

The invention discloses an implicit discourse relation recognition method based on a multi-head bidirectional attention interactive Transformer. It combines a bidirectional attention mechanism with a multi-head attention mechanism, and introduces the Transformer into implicit discourse relation recognition to better mine the internal interaction information of discourse argument pairs. A feature vector representation that contains the important interaction information can thus be better learned for identifying the discourse relation. Interaction information between the two discourse arguments is effectively captured through bi-attention. In addition, compared with existing methods, this method maps the discourse argument vector representations into different representation subspaces via the multiple attention heads, mines interaction information more comprehensively from different aspects, obtains the feature vector representation used for identifying discourse relations, and improves recognition accuracy. Finally, the feature vector representation is input into a discourse relation recognition layer for classification.

Description

Technical field

[0001] The invention relates to the technical field of discourse analysis in natural language processing, in particular to discourse relation recognition technology, and more specifically to an implicit discourse relation recognition method based on a multi-head bidirectional attention interactive Transformer.

Background technique

[0002] Discourse analysis is a fundamental task in natural language processing (NLP): it analyzes the underlying relational structure of text and mines the connections between text units. Although great progress has been made on explicit discourse relation recognition, which involves explicit connectives (such as "because" and "but"), implicit discourse relation recognition remains a challenge due to the absence of discourse connectives (Pitler et al., 2009) [2]. Improving implicit discourse relation recognition can benefit many popular NLP tasks, such as machine translat...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/35; G06F40/30; G06F40/211; G06N3/04; G06N3/08
CPC: G06F16/35; G06F40/30; G06F40/211; G06N3/08; G06N3/044; G06N3/045
Inventors: 贺瑞芳, 王建, 贺迎春, 朱永凯, 黄静
Owner: TIANJIN UNIV