
Method and system for expanding stand-alone graph neural network training to distributed training, and medium

A method and system in the field of neural network training, applied to expanding stand-alone graph neural network training to distributed training. It addresses problems such as complex, error-prone back-propagation logic and the difficulty users face in verifying the correctness of that logic, and achieves ease of use, low learning cost, and guaranteed training accuracy and parameter convergence speed.

Active Publication Date: 2020-12-29
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

Although this approach can use the same computation logic as a single server to train large-scale graph data, thereby preserving model training accuracy and parameter convergence speed, it requires users to write both the forward-propagation and back-propagation logic of graph computing operations themselves. Most current deep learning frameworks and stand-alone graph neural network frameworks instead provide automatic back propagation, so users never write back-propagation steps. Because back-propagation logic is more complicated than forward-propagation logic, and becomes more complicated still in distributed scenarios, user-implemented back propagation is prone to errors and its correctness is difficult to verify. This solution therefore lacks flexibility and is inconvenient for quickly developing and implementing graph neural network models.



Examples


Embodiment Construction

[0041] The present invention will be further described below in conjunction with the accompanying drawings.

[0042] Figure 1 shows the specific flow of the method of the present invention for expanding stand-alone graph neural network training to distributed training. The disclosed embodiments are described in detail below in conjunction with Figure 2:

[0043] The original graph in Figure 2 contains five nodes, namely A, B, C, D and E. The connection relationships of the edges between the nodes are shown in the figure. Each node contains a multi-dimensional initial feature tensor and may contain a label. In this embodiment, a total of two servers are used for training.
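When a graph is partitioned across two servers, each server must know which remote ("halo") nodes its local computation depends on, since those are the nodes whose features require synchronization. The following is a minimal sketch of that bookkeeping; the edge list and the node-to-server assignment below are assumptions for illustration, as the patent's figure defines the actual topology.

```python
# Hypothetical edges for the five-node example graph (the real edges
# are defined by the patent's Figure 2).
edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D"), ("D", "E")]

# Assumed assignment of nodes to the two servers.
partition = {"A": 0, "B": 0, "C": 1, "D": 1, "E": 1}

def halo_nodes(edges, partition, server):
    """Remote neighbours whose features must be synchronized to `server`."""
    halo = set()
    for u, v in edges:
        for local, remote in ((u, v), (v, u)):
            if partition[local] == server and partition[remote] != server:
                halo.add(remote)
    return halo

print(sorted(halo_nodes(edges, partition, 0)))  # ['C', 'D']
print(sorted(halo_nodes(edges, partition, 1)))  # ['A', 'B']
```

Each server then only stores its own partition plus this small halo set, which is what keeps the distributed computation equivalent to the single-machine one.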

[0044] In step 1, the data synchronization operation is registered as an operator of the stand-alone graph neural network framework.
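The key idea of step 1 is that if data synchronization is registered as an operator with both a forward and a backward rule, the framework's automatic differentiation handles the distributed backward pass, so the user never writes it by hand. The following runnable toy sketch is an assumption, not the patent's code: the two "servers" are plain dictionaries, and `SyncOp` stands in for an operator in the style of a `torch.autograd.Function`.

```python
# Toy feature store per server: node -> scalar feature.
store = [{"A": 1.0, "B": 2.0}, {"C": 3.0, "D": 4.0, "E": 5.0}]
# Gradient accumulators mirroring the stores.
grads = [dict.fromkeys(s, 0.0) for s in store]

def owner(node):
    """Which server owns a node's features."""
    return 0 if node in store[0] else 1

class SyncOp:
    """Data synchronization as an operator: forward fetches remote
    features; backward scatters gradients back to the owning server."""
    def __init__(self, nodes):
        self.nodes = nodes

    def forward(self):
        # In a real system this would be an RPC or collective call.
        return [store[owner(n)][n] for n in self.nodes]

    def backward(self, upstream):
        # Automatic differentiation would invoke this, routing each
        # gradient home to the server that owns the feature.
        for n, g in zip(self.nodes, upstream):
            grads[owner(n)][n] += g

# Server 0 fetches its halo nodes C and D from server 1.
op = SyncOp(["C", "D"])
feats = op.forward()       # [3.0, 4.0]
op.backward([0.1, 0.2])    # gradients flow back to server 1
print(feats, grads[1])
```

Because the backward rule is attached to the operator itself, the user-facing model code stays identical to the single-machine version.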

[0045] In step 2, the code of the stand-alone graph neural network model is modified by adding a line of code that calls the data synchronization operator defined in step 1 bef...


Abstract

The invention discloses a method for expanding stand-alone graph neural network training to distributed training. By providing graph-partitioning and data-synchronization extensions, using a stand-alone graph neural network framework with automatic back propagation, adding only a small amount of data-synchronization code to the original stand-alone model code, and then distributing a large graph across multiple servers, the method realizes distributed large-graph training on multiple servers that is equivalent to stand-alone graph neural network training, without modifying the stand-alone graph neural network framework or the computation logic of the original stand-alone model. The invention also discloses a system for expanding stand-alone graph neural network training to distributed training, and a computer-readable storage medium storing a corresponding computer program.

Description

Technical Field

[0001] The present invention relates to the field of deep learning and graph neural networks, in particular to a method for expanding stand-alone graph neural network training to distributed training.

Background

[0002] Graph-structured data can represent correlations between data and can describe many problems in real life. Graph neural networks (such as GCN, GAT and GraphSAGE), a graph-based deep learning method, can be used to predict the types of nodes on a graph, the likelihood of edges between nodes, and so on, and have achieved very good results in many fields.

[0003] Stand-alone graph neural network frameworks (for example, DGL and PyG) provide flexible and convenient programming interfaces and good stand-alone training performance. However, in actual production applications the scale of graphs is already very large, with points and edges reaching hundreds of millions or even bill...


Application Information

IPC(8): G06N3/04; G06N3/08; G06F9/50
CPC: G06N3/084; G06F9/5066; G06N3/045
Inventor: 陈榕, 杨健邦, 陈海波, 臧斌宇
Owner: SHANGHAI JIAO TONG UNIV