
Large-scale graph deep learning computing framework based on a layered optimization paradigm

A deep learning computing-framework technology, applied in the field of deep learning, addressing problems such as existing systems not supporting GNNs well, being unable to express and support graph structures, and lacking efficient implementations of graph propagation operators.

Pending Publication Date: 2020-09-11
厦门渊亭信息科技有限公司
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0003] However, none of the existing solutions support GNNs well. Existing graph processing engines usually provide a Gather-Apply-Scatter (GAS) vertex-program model, which cannot express or support the neural network computations attached to graph structures. Deep learning frameworks such as TensorFlow, PyTorch, MXNet, and CNTK are designed around dataflow graphs to express neural networks, but they cannot directly support graph propagation models. Furthermore, none of them provide the scalability needed to handle large-scale graphs, nor do they support efficient GPU-based implementations of graph propagation operators. This lack of support severely limits the ability to fully exploit the potential of large-scale GNNs, and poses enormous challenges for combining DNNs with large graph structures at the system level.
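The gap described above can be illustrated with a minimal GCN-style layer (a hypothetical sketch, not from the patent): the propagation step `A_hat @ H` is what a GAS-style graph engine expresses, while the dense transform and nonlinearity are what a tensor dataflow framework expresses; a GNN layer needs both in one pipeline.

```python
import numpy as np

# Hypothetical illustration: one GCN-style layer H' = ReLU(A_hat @ H @ W)
# couples graph propagation (A_hat @ H) with dense per-vertex DNN
# computation (@ W, ReLU). Names (gcn_layer, A_hat) are illustrative.

def gcn_layer(adj, features, weight):
    """adj: (N, N) normalized adjacency; features: (N, F); weight: (F, F')."""
    propagated = adj @ features           # graph propagation (sparse in practice)
    transformed = propagated @ weight     # per-vertex neural computation
    return np.maximum(transformed, 0.0)   # ReLU

rng = np.random.default_rng(0)
adj = np.eye(4) + np.eye(4, k=1) + np.eye(4, k=-1)  # toy chain graph + self-loops
adj = adj / adj.sum(axis=1, keepdims=True)          # row-normalize
h = rng.standard_normal((4, 8))
w = rng.standard_normal((8, 3))
out = gcn_layer(adj, h, w)
print(out.shape)  # (4, 3)
```

On large graphs the adjacency is sparse and may exceed GPU memory, which motivates the block-based processing described in the embodiment below.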

Method used



Examples


Embodiment

[0021] Referring to Figure 1 and Figure 2, the present invention discloses a large-scale graph deep learning computing framework based on a layered optimization paradigm, comprising: block-based dataflow conversion, dataflow graph optimization, kernel propagation on the GPU, and multi-GPU parallel processing. Large-scale graph deep learning is implemented as follows:
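The block-based idea can be sketched in a few lines (names such as `partition_edges` and `block_size`, and the chunk keying, are assumptions for illustration, not the patent's API): vertex IDs are split into fixed-size intervals, and each edge is routed to a (source-block, target-block) chunk, so each chunk touches only two vertex intervals and can be streamed through GPU memory independently.

```python
# Hypothetical sketch of block partitioning for graphs larger than GPU memory.

def partition_edges(edges, block_size):
    """Group (src, dst) edges into chunks keyed by (src_block, dst_block)."""
    chunks = {}
    for src, dst in edges:
        key = (src // block_size, dst // block_size)
        chunks.setdefault(key, []).append((src, dst))
    return chunks

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
chunks = partition_edges(edges, block_size=2)
for key in sorted(chunks):
    print(key, chunks[key])
# (0, 0) [(0, 1)]
# (0, 1) [(1, 2), (0, 2)]
# (1, 0) [(3, 0)]
# (1, 1) [(2, 3)]
```

Because each chunk references only two vertex intervals, the scheduler can load those intervals, process the chunk's edges, and evict them, which is the scalability mechanism the paragraph below elaborates.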

[0022] 1) Block-based dataflow conversion: To achieve scalability beyond the physical memory limits of the GPU, the graph's vertex and edge data are divided into blocks, and the GNN algorithm expressed in the model is converted into a dataflow graph with block-granularity operators, enabling block-based parallel stream processing on one or more GPUs. First, in the Separation stage, the tensor data of each vertex is passed to its adjacent edges to form edge data, including source- and target-vertex data. The subsequent UDF_Edge stage calls the parallel computing fu...
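The Separation and UDF_Edge stages described above can be sketched as follows (a hedged illustration assuming NumPy-style gather semantics; the edge function here is a hypothetical placeholder, since the patent text does not specify one):

```python
import numpy as np

# Illustrative sketch of the Separation stage: each edge gathers the
# tensor data of its source and target vertices, then a user-defined
# edge function (UDF_Edge) runs in parallel over all edges.

vertex_data = np.arange(8.0).reshape(4, 2)  # (num_vertices, feat_dim)
src = np.array([0, 1, 2])                   # edge source vertex IDs
dst = np.array([1, 2, 3])                   # edge target vertex IDs

# Separation: vertex tensors -> per-edge source and target tensors
edge_src_data = vertex_data[src]            # (num_edges, feat_dim)
edge_dst_data = vertex_data[dst]

# UDF_Edge: a hypothetical edge message combining both endpoints
edge_data = edge_src_data + edge_dst_data
print(edge_data[0])  # [2. 4.]
```

On a GPU, the gather and the edge function are both embarrassingly parallel over edges, which is why the stages map naturally to block-granularity kernels.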



Abstract

The invention provides a large-scale graph deep learning computing framework based on a layered optimization paradigm, and relates to the technical field of deep learning. Block-based dataflow conversion divides the vertex and edge data of a graph into small blocks; dataflow graph optimization generates a scheduling strategy; kernel propagation on the GPU performs efficient propagation operations; and multi-GPU parallel processing enables parallel computing across multiple GPUs in one server. The framework supports large-scale graph neural networks (GNNs), allows models to be expressed simply, and provides a scalable, efficient GPU parallel processing engine. To represent the recursive computation of each GNN layer, which includes graph propagation and deep neural network (DNN) computation, a vertex-program abstraction with separation, edge-application, aggregation, and vertex-application stages (Separation-UDF_Edge-Aggregation-UDF_Vertex with Neural Networks, SUAU-NN) is adopted.
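The four-stage vertex program named in the abstract can be sketched end to end (a minimal illustration assuming NumPy semantics; `suau_nn_layer` and the two placeholder UDFs are hypothetical names, and the patent's actual operators and kernels are not specified in this text):

```python
import numpy as np

# Sketch of the SUAU-NN abstraction:
# Separation -> UDF_Edge -> Aggregation -> UDF_Vertex.

def suau_nn_layer(vertex_data, src, dst, udf_edge, udf_vertex):
    # 1) Separation: push vertex tensors onto adjacent edges
    edge_src, edge_dst = vertex_data[src], vertex_data[dst]
    # 2) UDF_Edge: per-edge neural computation (user-defined)
    messages = udf_edge(edge_src, edge_dst)          # (num_edges, dim)
    # 3) Aggregation: sum incoming messages per target vertex
    accum = np.zeros_like(vertex_data)
    np.add.at(accum, dst, messages)                  # unbuffered scatter-add
    # 4) UDF_Vertex: per-vertex neural computation on aggregated data
    return udf_vertex(vertex_data, accum)

h = np.ones((3, 2))
src, dst = np.array([0, 1]), np.array([1, 2])
out = suau_nn_layer(h, src, dst,
                    udf_edge=lambda s, t: s,             # forward source data
                    udf_vertex=lambda v, agg: v + agg)   # residual update
print(out)  # [[1. 1.] [2. 2.] [2. 2.]]
```

In the framework described here, each of these stages would operate on block-granularity operators rather than whole-graph tensors, so the same program scales past single-GPU memory.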

Description

Technical field

[0001] The invention belongs to the technical field of deep learning, and in particular relates to a large-scale graph deep learning computing framework based on a layered optimization paradigm.

Background technique

[0002] Deep learning in the form of deep neural networks has been successfully applied in many fields, such as speech, vision, and natural language processing. In these fields, the underlying data usually has a regular grid structure, which is conducive to hardware acceleration (such as on GPUs) when large amounts of data are processed in parallel. Driven by the importance of graph data in social networks, knowledge graphs, bioinformatics, and neuroscience, applying deep learning models to data with irregular graph structures is an emerging trend, producing state-of-the-art prediction results in target applications such as classification, embedding, and question-answering systems. These graph-based neural networks typically apply neural network mo...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/02; G06N3/08
CPC: G06N3/02; G06N3/08
Inventors: 王彬, 洪万福, 钱智毅
Owner: 厦门渊亭信息科技有限公司