
Implicit discourse relationship identification method, system and readable storage medium

A technology for discourse relationship recognition, applied to neural learning methods, instruments, natural language translation, etc. It addresses problems such as the failure to make effective use of the semantic hierarchy and barriers to information sharing between recognition tasks.

Active Publication Date: 2020-09-15
EAST CHINA JIAOTONG UNIVERSITY
View PDF · 10 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

[0006] In view of the above situation, it is necessary to solve the problem in the prior art that information sharing between the multiple levels of implicit discourse relationship recognition tasks is obstructed, owing to the lack of effective use of the semantic hierarchy and of the dependencies between prediction results.

Method used



Examples


Embodiment 1

[0086] To solve the above technical problems, the present invention proposes an implicit discourse relationship identification method. Referring to Figure 2 and Figure 3, the method proposed in the first embodiment of the present invention includes the following steps:

[0087] S101. Receive the global semantic relationship vector and the local semantic relationship vectors sent by the encoder, and use the global semantic relationship vector as the zeroth hidden state of the GRU network.

[0088] In the present invention, the decoder takes the global semantic relationship vector and the local semantic relationship vectors output by the encoder as input, and generates a sequence of multi-level implicit discourse relationships.

[0089] In this step, the global semantic relationship vector serves as the initial state of the GRU network (the zeroth hidden state). Understandably, in this step, it is equivalent t...
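The decoding procedure of step S101 onward can be sketched as follows. This is a minimal NumPy illustration, not the patented implementation: the dimensions, the random weights, and the use of the previous hidden state as a stand-in for the previous level's relation vector are all hypothetical, and a real system would use trained parameters and a learned relation embedding.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8                                   # hidden size (illustrative)
L = 5                                   # number of local semantic relationship vectors

g = rng.standard_normal(D)              # global semantic relationship vector (from encoder)
locals_ = rng.standard_normal((L, D))   # local semantic relationship vectors (from encoder)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gru_step(x, h_prev, Wx, Wh):
    """Minimal GRU cell; gate weights stacked as 3 blocks (update, reset, candidate)."""
    def sig(v): return 1.0 / (1.0 + np.exp(-v))
    xz, xr, xn = (Wx @ x).reshape(3, -1)
    hz, hr, hn = (Wh @ h_prev).reshape(3, -1)
    z, r = sig(xz + hz), sig(xr + hr)   # update and reset gates
    n = np.tanh(xn + r * hn)            # candidate state
    return (1 - z) * n + z * h_prev

Wx = rng.standard_normal((3 * D, 2 * D)) * 0.1   # input = [prev relation ; context]
Wh = rng.standard_normal((3 * D, D)) * 0.1

h = g                                   # S101: zeroth hidden state = global vector
prev_rel = np.zeros(D)                  # relation vector of the previous level (stand-in)
for level in range(3):                  # one GRU step per level of the relation hierarchy
    scores = locals_ @ h                # attention of previous hidden state over local vectors
    context = softmax(scores) @ locals_ # local semantic information for the current level
    x = np.concatenate([prev_rel, context])   # concatenate, as the abstract describes
    h = gru_step(x, h, Wx, Wh)          # hidden state for the current-level relation
    prev_rel = h                        # stand-in for the predicted relation embedding
```

Each iteration mirrors the claimed steps: attend over the local vectors with the previous hidden state, concatenate the result with the previous level's relation vector, and advance the GRU one level.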

Embodiment 2

[0131] It can be understood that before the decoder receives the global semantic relationship vector sent by the encoder, the encoder must first encode the input sentences. In this example, we focus on introducing an encoder based on a Bi-LSTM (bidirectional long short-term memory network) and a bidirectional attention mechanism.

[0132] In this example, see Figure 4; the specific encoding rules include the following steps:

[0133] S201. Calculate the word-pair correlation matrix between the input first sentence and the second sentence, and normalize it along the row and column directions respectively to obtain the first weight matrix and the second weight matrix.

[0134] It should be pointed out here that Bi-LSTM (bidirectional long short-term memory network) is a neural network structure commonly used to learn semantic representations of sentences, which can encode contextual information into the vector ...

Embodiment 3

[0156] For encoding the input sentences, the third embodiment of the present invention also proposes an encoding method based on a Transformer encoder, whose specific implementation is as follows:

[0157] Firstly, the first sentence and the second sentence in the implicit discourse relationship instance are organized into a sequence of the form "[CLS] + first sentence + [SEP] + second sentence + [SEP]". Here, [CLS] is added as a special marker at the beginning of the first sentence, where the global semantic information of the two sentences is expected to be gathered; [SEP] serves as the separator between the first sentence and the second sentence.

[0158] To further distinguish the first sentence from the second sentence, each sentence is marked with its own segment vector. To take advantage of the word-order information within a sentence, position vectors are also used for identification, where m, n...
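The sequence construction described in [0157]–[0158] can be sketched as follows, assuming pre-tokenized sentences and BERT-style segment and position ids; the example sentences and the helper name are hypothetical.

```python
# Organize the two sentences of an implicit discourse relationship instance into the
# input sequence "[CLS] + first sentence + [SEP] + second sentence + [SEP]", with
# segment ids distinguishing the two sentences and position ids carrying word order.

def build_input(sent1, sent2):
    tokens = ["[CLS]"] + sent1 + ["[SEP]"] + sent2 + ["[SEP]"]
    # segment id 0 covers [CLS] + sentence 1 + the first [SEP]; id 1 covers the rest
    segments = [0] * (len(sent1) + 2) + [1] * (len(sent2) + 1)
    positions = list(range(len(tokens)))      # word-order information
    return tokens, segments, positions

tokens, segments, positions = build_input(
    ["the", "road", "was", "icy"], ["she", "drove", "slowly"])
```

In a full encoder, the token, segment, and position ids would each index an embedding table and be summed before entering the Transformer layers.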



Abstract

The present invention proposes an implicit discourse relationship recognition method, system and readable storage medium. The method includes the following steps: receiving the global semantic relationship vector and the local semantic relationship vectors sent by the encoder, and using the global semantic relationship vector as the zeroth hidden state of the GRU network; computing, via the attention mechanism, over the hidden state of the previous level and all the local semantic relationship vectors to obtain the local semantic relationship information for the discourse relationship at the current level; and concatenating the discourse relationship vector of the previous level with the local semantic relationship information of the current level, then feeding the result into the current GRU unit to compute the implicit discourse relationship at the current level. The method proposed by the present invention can flexibly realize information sharing among multi-level discourse relationships, and captures the dependencies between the prediction results of the discourse relationships at each level.

Description

Technical Field

[0001] The invention relates to the technical field of natural language processing, and in particular to an implicit discourse relationship identification method, system and readable storage medium.

Background Art

[0002] With the continuous development of science and technology in recent years, the performance of most natural language processing systems that incorporate discourse information has improved significantly, including named entity recognition, extractive text summarization, and machine translation. At present, more and more researchers are exploring how to model and utilize discourse information.

[0003] Specifically, a discourse refers to a language unit composed of a series of structurally cohesive and semantically coherent sentences. These sentences follow a certain semantic relationship or hierarchical structure and are used to explain a certain aspect of a problem or scene. Discourse s...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC (8): G06F40/58; G06F40/30; G06F40/211; G06F40/126; G06F16/35; G06F16/332; G06N3/04; G06N3/08
CPC: G06F16/355; G06F16/3329; G06N3/08; G06N3/044; G06N3/045
Inventor: 邬昌兴, 俞亮, 胡超文
Owner EAST CHINA JIAOTONG UNIVERSITY