
Node representation method based on sequential graph neural network and incremental learning method

A neural network and sequential graph technology, applied in the field of incremental learning, that can solve problems such as the inability to learn rich node information, the inability to store a large amount of graph snapshot data in memory, and the increased overhead of re-updating model parameters from scratch.

Pending Publication Date: 2021-04-20
NORTHEASTERN UNIV

AI Technical Summary

Problems solved by technology

In existing methods, a GNN is used to capture structural information, and an RNN is then used to capture temporal information. However, both parts are designed relatively simply: the GNN part is generally a GCN or GAT model, while the RNN part is a traditional LSTM (Long Short-Term Memory network). As a result, neither part can learn the rich information of the nodes.
[0004] For sequential graphs, when new data continually arrives, on the one hand, the continuously growing volume of sequential graph data makes it impossible to store a large number of graph snapshots in memory; on the other hand, the increase in the amount of graph data raises the overhead of re-updating the model parameters from scratch.




Detailed Description of the Embodiments

[0036] In order to facilitate understanding of the present application, the present application is described more fully below with reference to the relevant drawings. Preferred embodiments of the application are shown in the accompanying drawings. However, the present application can be embodied in many different forms and is not limited to the embodiments described herein. Rather, these embodiments are provided so that the disclosure of the application is more thorough and comprehensive.

[0037] In the node representation method and incremental learning method based on the sequential graph neural network provided by the present invention, the sequential graph is processed as shown in Figure 1. First, preprocessing operations are performed on each sequential graph snapshot, including feature dimensionality reduction and graph pooling. Next, each GCN model with a dual attention mechanism in the dual attention netw...
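Because the paragraph above is cut off before it finishes describing the dual attention network, the sketch below is only a rough, assumption-based illustration of the per-snapshot step it names: a symmetrically normalized GCN layer whose output is re-weighted by two learned gates, one over nodes and one over feature channels. The class and helper names (DualAttentionGCNLayer, normalize_adj) and the gate design are hypothetical, and the feature dimensionality reduction and graph pooling preprocessing are assumed to have happened already.

```python
# Minimal sketch only: the patent does not disclose the exact dual attention design,
# so both attention gates below are assumptions made for illustration.
import torch
import torch.nn as nn


def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric GCN normalization: A_hat = D^{-1/2} (A + I) D^{-1/2}."""
    adj = adj + torch.eye(adj.size(0))
    deg_inv_sqrt = adj.sum(dim=1).pow(-0.5)
    return deg_inv_sqrt.unsqueeze(1) * adj * deg_inv_sqrt.unsqueeze(0)


class DualAttentionGCNLayer(nn.Module):
    """One GCN layer whose output is re-weighted by two learned gates (assumed design)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.node_attn = nn.Linear(out_dim, 1)        # gate over nodes (assumption)
        self.feat_attn = nn.Linear(out_dim, out_dim)  # gate over feature channels (assumption)

    def forward(self, adj_norm: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        h = adj_norm @ self.linear(x)                         # standard GCN propagation
        node_gate = torch.sigmoid(self.node_attn(h))          # shape [N, 1]
        feat_gate = torch.sigmoid(self.feat_attn(h.mean(0)))  # shape [out_dim]
        return torch.relu(h * node_gate * feat_gate)


# One toy snapshot with N nodes; `x` stands in for features that have already been
# reduced in dimension and pooled during preprocessing.
N, in_dim, hidden = 6, 16, 8
adj = (torch.rand(N, N) > 0.7).float()
adj = ((adj + adj.t()) > 0).float()                 # symmetric adjacency
x = torch.randn(N, in_dim)

layer = DualAttentionGCNLayer(in_dim, hidden)
structure_emb = layer(normalize_adj(adj), x)        # [N, hidden] structural embeddings
print(structure_emb.shape)
```

One such layer would be applied to every snapshot in the sequence, producing per-moment structural embeddings that the temporal stage then consumes.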



Abstract

The invention discloses a node representation method based on a sequential graph neural network and an incremental learning method, belonging to the technical field of graph representation learning. After a preprocessing operation, a GCN model with a dual attention mechanism is used to separately process the sequential graph snapshots at different moments and perform the graph convolution computation, obtaining the structure embedding representation of any node at any moment. The structure embedding representations of a node at each moment are then input as a sequence into a t-GRU temporal network for serial computation, yielding the final embedding representation of the node at any moment. For new data at moment T, the intermediate results from before moment T are stored, and only one GCN model with a dual attention mechanism is used to process the incremental graph data at moment T. The intermediate results and the moment-T result are combined into a sequence and input into the t-GRU temporal network for serial computation to obtain the embedding representation of any node at moment T. The method is suitable for various sequential graph scenarios, the node representation information is richer and more accurate, and the model converges quickly.
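To make the temporal and incremental steps of the abstract easier to picture, here is a minimal sketch under stated assumptions: the patented t-GRU is stood in for by a standard torch.nn.GRU, the "intermediate result before moment T" is assumed to be the GRU hidden state, and random tensors stand in for the structural embeddings produced by the dual-attention GCN.

```python
# Hedged sketch of the temporal stage: a plain torch.nn.GRU approximates the t-GRU,
# and the stored "intermediate result before moment T" is assumed to be its hidden state.
import torch
import torch.nn as nn

hidden = 8
t_gru = nn.GRU(input_size=hidden, hidden_size=hidden, batch_first=True)

# Structural embeddings of one node at moments 1..T-1 (placeholders for the
# dual-attention GCN outputs sketched above).
past_struct_emb = torch.randn(1, 4, hidden)          # [batch=1, time=T-1, dim]

# Full pass over the history: the node's embedding at moment T-1, plus the hidden
# state that is kept as the intermediate result before moment T.
out, h_state = t_gru(past_struct_emb)
node_emb_before_T = out[:, -1, :]

# Incremental step at moment T: only the new snapshot goes through the GCN, and its
# single structural embedding continues the sequence from the stored state.
new_struct_emb = torch.randn(1, 1, hidden)           # node's structural embedding at T
out_T, h_state = t_gru(new_struct_emb, h_state)
node_emb_at_T = out_T[:, -1, :]                      # final embedding at moment T
print(node_emb_at_T.shape)
```

The point of the incremental step, as the abstract describes it, is that the historical snapshots never need to be replayed from scratch: only the stored intermediate result and the moment-T data are processed.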

Description

Technical Field

[0001] The invention relates to the technical fields of graph neural networks and graph representation learning, and in particular to a node representation method and an incremental learning method based on a sequential graph neural network.

Background

[0002] Although traditional deep learning methods have achieved great success in extracting features from Euclidean data, the data in many practical application scenarios are generated in non-Euclidean spaces, where the performance of traditional deep learning methods is hardly satisfactory. Moreover, due to the complexity of graphs and the irregularity of their structure, it is difficult for existing deep learning methods to capture the interdependence between graph nodes. In recent years, as interest in extending deep learning methods to graphs has grown, driven by the success of many factors, resea...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04; G06N3/08
Inventors: 谷峪, 魏頔, 宋振, 于戈
Owner: NORTHEASTERN UNIV