
Event time sequence relationship recognition method based on relation graph attention neural network

A temporal relationship recognition and neural network technology, applied in neural learning methods, biological neural network models, neural architectures, etc., which addresses the problems that existing methods omit or lose important hidden semantic information and have difficulty effectively processing long-distance, non-local semantic information.

Active Publication Date: 2021-03-16
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

[0007] The present invention provides an event temporal relationship recognition method based on a relational graph attention neural network, aiming to solve the problems of many current methods, which struggle to effectively process long-distance non-local semantic information and omit or lose some important deep-level hidden semantic information.




Embodiment Construction

[0035] In order for those skilled in the art to better understand the present invention, it is further explained below in conjunction with the accompanying drawings and specific examples. The specific details are as follows:

[0036] The present invention comprises the following steps:

[0037] Step 1: Time sequence graph construction.

[0038] Firstly, semantic dependency analysis is performed on the event sentence pair to obtain two dependency trees. For each dependency tree, the position of the trigger word is located; taking the trigger word as the starting point, its adjacent nodes are searched recursively until the adjacent nodes at p hops are reached, and the nodes found during this search are retained, where p is the number of recursions.
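The recursive p-hop search around the trigger word can be sketched as a breadth-first traversal of the dependency tree. The adjacency-list representation, function name, and toy tree below are illustrative assumptions, not the patent's own code:

```python
# Minimal sketch of the p-hop node collection described above.
# Assumes the dependency tree is an adjacency list (token index -> neighbours)
# and the trigger word's index is known.
from collections import deque

def collect_p_hop_nodes(adjacency, trigger_idx, p):
    """Return the set of nodes reachable from the trigger word within p hops."""
    kept = {trigger_idx}
    frontier = deque([(trigger_idx, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == p:                      # stop expanding after p hops
            continue
        for neighbour in adjacency.get(node, []):
            if neighbour not in kept:
                kept.add(neighbour)
                frontier.append((neighbour, depth + 1))
    return kept

# Toy example: trigger at index 2, p = 2 hops.
tree = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(collect_p_hop_nodes(tree, trigger_idx=2, p=2))  # {0, 1, 2, 3, 4}
```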

[0039] In order to strengthen the semantic connection between the event sentence pair and the semantic representation between long-distance tokens, some artificially constructed edges are then added. In order to simplify operations and improve computational efficiency, the...
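The paragraph above is truncated, so the exact set of artificially constructed edges is not specified here. Purely as an assumed illustration, one such edge could connect the two trigger nodes so that information can flow directly between the two dependency subgraphs; the function and relation name below are hypothetical:

```python
# Hedged sketch of the edge-augmentation step (assumed, not from the patent text).
def add_cross_sentence_edges(edges, trigger_a, trigger_b, relation="cross_trigger"):
    """Append a typed, bidirectional edge between the two trigger nodes.
    `edges` is a list of (head, tail, relation_type) triples."""
    edges.append((trigger_a, trigger_b, relation))
    edges.append((trigger_b, trigger_a, relation))
    return edges

# Dependency edges from both subtrees plus the artificial cross-sentence edge.
graph_edges = [(1, 2, "dep"), (2, 3, "dep"), (10, 11, "dep")]
graph_edges = add_cross_sentence_edges(graph_edges, trigger_a=2, trigger_b=10)
```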



Abstract

The invention discloses a time sequence relationship recognition method based on a relation graph attention neural network. The method mainly comprises the following steps: firstly, carrying out dependency analysis of the event sentences to obtain the related dependency trees; converting the dependency trees into a time sequence graph by using a recursive graph construction strategy; then using the relation graph attention neural network to update the information of the time sequence graph and obtain the hidden state of each node in the graph; and finally, extracting the hidden states related to the trigger words and the sentence representation vectors from the node hidden state set, and feeding them into a softmax function to realize event time sequence relationship identification. According to the invention, long-distance non-local semantic information can be effectively processed, deep hidden information is captured and fused, and the accuracy of event time sequence relationship recognition is remarkably improved.
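As a rough illustration of the final classification step described in the abstract, the PyTorch sketch below concatenates the two trigger-word hidden states with the two sentence representation vectors and applies a linear layer followed by softmax. The dimensions, number of relation classes, and all names are assumptions, not values from the patent:

```python
# Hedged sketch of the classification head: trigger hidden states (from the
# relational graph attention network) plus sentence vectors -> softmax.
import torch
import torch.nn as nn

class TemporalRelationHead(nn.Module):
    def __init__(self, hidden_dim=256, sent_dim=256, num_relations=6):
        super().__init__()
        # Two trigger hidden states and two sentence vectors are concatenated.
        self.classifier = nn.Linear(2 * hidden_dim + 2 * sent_dim, num_relations)

    def forward(self, trigger_h1, trigger_h2, sent_v1, sent_v2):
        features = torch.cat([trigger_h1, trigger_h2, sent_v1, sent_v2], dim=-1)
        logits = self.classifier(features)
        return torch.softmax(logits, dim=-1)  # distribution over temporal relations

# Example with random tensors standing in for the graph network's outputs.
head = TemporalRelationHead()
h1, h2 = torch.randn(1, 256), torch.randn(1, 256)
s1, s2 = torch.randn(1, 256), torch.randn(1, 256)
probs = head(h1, h2, s1, s2)  # shape: (1, num_relations)
```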

Description

Technical Field

[0001] The invention relates to the field of natural language processing, and in particular to a method for identifying temporal relationships of events based on a relation graph attention neural network.

Background Technique

[0002] Event temporal relationship recognition is currently a challenging natural language processing task. It helps analyze intricate data in detail and promotes the development of many downstream tasks, such as information retrieval and relationship prediction. The event temporal relationship recognition task aims to explore the temporal relationship between events mentioned in different event sentences, using trigger words to represent those events. A trigger word is usually one or more consecutive verbs in the event sentence. The following is an example taken from the TimeBank-Dense corpus, which describes the event temporal relationship "BEFORE", that is, the event "invite" occurs before t...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F16/33, G06F16/35, G06F40/211, G06F40/216, G06F40/289, G06F40/30, G06K9/62, G06N3/04, G06N3/08
CPC: G06F16/3344, G06F16/3346, G06F16/35, G06F40/216, G06F40/289, G06F40/30, G06F40/211, G06N3/049, G06N3/08, G06N3/047, G06N3/045, G06F18/241, G06F18/2415
Inventor: 徐小良, 高通
Owner: HANGZHOU DIANZI UNIV