
Distributed multi-level graph network training method and system

A network training and multi-level technology, applied to neural learning methods, biological neural network models, instruments, etc. It addresses the excessive parameter-server load of centralized training methods and the high algorithmic complexity and low efficiency of decentralized training methods, achieving reduced consistency complexity, shorter training time, and improved training efficiency.

Pending Publication Date: 2021-11-12
ZHENGZHOU SEANET TECH CO LTD


Problems solved by technology

[0006] Large-scale graph network training methods in the prior art suffer from low efficiency: in particular, the parameter-server load of centralized training methods is too high, and the algorithmic complexity of decentralized training methods is high.



Examples


Embodiment 1

[0040] A distributed multi-level graph network training method, comprising:

[0041] Step 1): using a graph partitioning algorithm, dividing the nodes of the graph into a multi-level network according to the strength of their connection relationships, with each upper-level network managing its lower-level networks;

[0042] Step 2): performing parallel graph network calculation on the lowest-level networks of the multi-level network obtained in step 1), and passing each result matrix to the upper-level management network (an illustrative sketch of steps 1) and 2) follows this list);

[0043] Step 3): applying an improved Paxos consensus algorithm to the upper-level management network obtained in step 2) to perform node data sharing;

[0044] Step 4): performing information fusion on the shared result matrices within each sub-network of the layer on which the upper-level management network of step 3) resides, so that each sub-network of that layer obtains its own fused result matrix;

[0045] Step 5): uploading each fused result matrix from step 4) to ...
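The partitioning and bottom-layer computation of steps 1) and 2) can be illustrated with a minimal Python sketch. The patent does not specify a particular graph segmentation algorithm or graph network operator, so the helpers below (partition_by_strength, subnetwork_forward) are hypothetical stand-ins: a greedy strength-based grouping and a single mean-aggregation step.

```python
import numpy as np

def partition_by_strength(adj, num_subnets):
    """Stand-in for the patent's graph segmentation algorithm: group nodes
    into sub-networks by descending total connection strength (round-robin)."""
    order = np.argsort(-adj.sum(axis=1))  # strongest-connected nodes first
    return [order[i::num_subnets] for i in range(num_subnets)]

def subnetwork_forward(adj, feats, nodes):
    """One round of 'graph network calculation' on a bottom-layer sub-network:
    a single mean-aggregation step over the sub-network's induced subgraph."""
    sub_adj = adj[np.ix_(nodes, nodes)]
    deg = sub_adj.sum(axis=1, keepdims=True) + 1e-9  # avoid division by zero
    return (sub_adj @ feats[nodes]) / deg            # result matrix to upload

# Toy graph: 8 nodes, symmetric weighted adjacency, 4-dimensional features.
rng = np.random.default_rng(0)
adj = rng.random((8, 8)); adj = (adj + adj.T) / 2
feats = rng.random((8, 4))

subnets = partition_by_strength(adj, num_subnets=2)             # step 1
results = [subnetwork_forward(adj, feats, s) for s in subnets]  # step 2
# Each entry of `results` is a sub-network's result matrix, to be
# uploaded to its management node on the layer above.
```

Note that the per-sub-network computations are independent, which is what permits the parallel calculation on the bottom layer that step 2) describes.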

Embodiment 2

[0080] This embodiment provides a distributed multi-level graph network training system, including: a graph network training module and a graph network model generation module; wherein,

[0081] The graph network training module is configured to use a graph partitioning algorithm to divide the graph data into a multi-level network based on the strength of the connection relationships between the nodes of the graph data, in which the bottom layer holds the original data. Each layer comprises multiple sub-networks, and the management node of each sub-network is located on the layer immediately above that sub-network. Graph network calculation is performed on the bottom layer of the multi-level network, and the result matrix of each bottom-layer sub-network is uploaded to that sub-network's management node. The management nodes holding these result matrices then carry out the improved Paxos consensus algorithm, so that the relevant nodes in the sub-network to which the ...
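How the management nodes share and fuse the uploaded result matrices (steps 3) and 4) of Embodiment 1) can be sketched as follows. The patent's improved Paxos consensus algorithm is not detailed in this extract, so the sketch substitutes a plain majority-acknowledgement quorum, and mean fusion is likewise an illustrative choice; ManagementNode, share_with_quorum, and fuse are hypothetical names.

```python
import numpy as np

class ManagementNode:
    """A management node on an upper layer: holds its own sub-network's
    result matrix plus the matrices shared by peer management nodes."""

    def __init__(self, node_id, result):
        self.node_id = node_id
        self.result = result
        self.shared = {node_id: result}

    def receive(self, sender_id, matrix):
        self.shared[sender_id] = matrix
        return True  # acknowledgement counted toward the quorum

def share_with_quorum(nodes):
    """Placeholder for the improved Paxos consensus: a proposal counts as
    accepted once a majority of management nodes acknowledge it."""
    quorum = len(nodes) // 2 + 1
    for proposer in nodes:
        acks = sum(peer.receive(proposer.node_id, proposer.result)
                   for peer in nodes)
        assert acks >= quorum, "proposal not accepted by a majority"

def fuse(node):
    """Information fusion over the shared result matrices; mean fusion is
    an illustrative choice, as the extract does not fix one."""
    return np.mean(list(node.shared.values()), axis=0)

mgmt = [ManagementNode(i, np.random.rand(4, 4)) for i in range(3)]
share_with_quorum(mgmt)            # step 3: consensus-based data sharing
fused = [fuse(n) for n in mgmt]    # step 4: each node's fused result matrix
# Each fused matrix is then uploaded toward the root node (step 5).
```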



Abstract

The invention discloses a distributed multi-level graph network training method and system. The method comprises the steps of: dividing graph data into a multi-level network by employing a graph partitioning algorithm; carrying out graph network calculation on each bottom-layer sub-network to obtain a result matrix and uploading it to a management node on the layer above, and carrying out improved Paxos consensus processing among the management nodes to realize sharing of that layer's result matrices; performing information fusion on the shared result matrices to obtain fused result matrices, and uploading them layer by layer until the top-layer root node obtains its fused result matrix; issuing the fused top-layer result matrix layer by layer from the root node down to the bottom layer, thereby completing one round of graph network training; inputting the fused result matrix of the root node into the bottom layer as the feature matrix of the next round, and repeating the above steps until a preset number of training rounds is reached, so as to obtain a trained multi-level network; and connecting the trained multi-level network with an output layer to obtain a trained graph network model.
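The overall iteration the abstract describes (compute at the bottom, fuse up to the root, issue back down, feed the root's fused matrix in as the next round's features, then attach an output layer) reduces to a short outer loop. The sketch below shows only that control flow under assumed placeholder callables for each stage, not the patent's actual operators.

```python
import numpy as np

def training_round(feats, bottom_compute, fuse_upward, broadcast_down):
    """One round: bottom-layer computation, layer-by-layer fusion up to the
    root, then layer-by-layer issuance back down to the bottom layer."""
    results = bottom_compute(feats)       # bottom-layer result matrices
    root_matrix = fuse_upward(results)    # share + fuse up to the root node
    return broadcast_down(root_matrix)    # issue the root matrix back down

def train(feats, rounds, stages, output_layer):
    """Repeat for a preset number of rounds, feeding the fused root matrix
    back in as the next round's feature matrix; then attach an output layer."""
    for _ in range(rounds):
        feats = training_round(feats, *stages)
    return output_layer(feats)

# Illustrative stand-ins: trivial stages and a linear output layer.
stages = (lambda f: [f], lambda rs: np.mean(rs, axis=0), lambda m: m)
W = np.random.rand(4, 2)
model_out = train(np.random.rand(8, 4), rounds=3,
                  stages=stages, output_layer=lambda f: f @ W)
```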

Description

Technical field

[0001] The invention relates to a distributed machine learning method, and in particular to a distributed multi-level graph network training method and system.

Background technique

[0002] With the growing application of Internet and big data research, deep learning methods have achieved considerable results in fields such as speech recognition, image recognition, and machine translation, and the corresponding computing technology is relatively mature. However, in the data analysis and processing of graph structures, owing to the complex non-Euclidean geometric nature of the graph structure itself, together with strong data correlation, difficult segmentation, and difficult parallel computing, the training of large-scale graph neural networks faces huge challenges.

[0003] Existing distributed neural network training methods mainly use parameter servers to upload and distribute training parameters. Since data such as images and voices have the characte...

Claims


Application Information

IPC(8): G06N3/04; G06N3/08; G06N20/00
CPC: G06N3/08; G06N20/00; G06N3/045
Inventors: 盛益强, 李昊天, 党寿江
Owner: ZHENGZHOU SEANET TECH CO LTD