Training method and device of graph neural network

A graph neural network training method and device in the field of computer technology, addressing problems such as the difficulty of guaranteeing training convergence and the embedding quality of the graph neural network, and the large consumption of computing resources, with the effects of reducing computation time, reducing the number of calculations, and achieving good representation performance.

Active Publication Date: 2021-05-07
ALIPAY (HANGZHOU) INFORMATION TECH CO LTD
Cites: 9 · Cited by: 14

AI Technical Summary

Problems solved by technology

However, in the training phase of a graph neural network, millions of parameters must be trained and the objective function is highly non-convex; as a result, conventional training consumes a large amount of computing resources, and it is difficult to guarantee both the convergence of training and the embedding quality of the graph neural network that the training produces.




Embodiment Construction

[0033] Multiple embodiments disclosed in this specification will be described below in conjunction with the accompanying drawings.

[0034] As mentioned earlier, the training of graph neural networks currently encounters a bottleneck. Specifically, a graph neural network is a deep learning architecture that can operate on social networks and other graph-structured topological data; it is a generalization of neural networks to graph topologies. A graph neural network generally uses the underlying relational network graph as its computation graph, and learns neural network primitives that generate per-node embedding vectors by passing, transforming and aggregating node feature information across the graph. The generated node embedding vectors can be used as input to a prediction layer for node classification or for predicting connections between nodes, and the complete model can be trained in an end-to-end manner.
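As a concrete illustration of the passing, transforming and aggregating computation described above, the following is a minimal sketch that is not taken from the patent: a single message-passing layer with mean aggregation over an adjacency list. The function name, weight shapes and the choice of a tanh activation are illustrative assumptions.

```python
import numpy as np

def gnn_layer(features, adjacency, W_self, W_neigh):
    """One hypothetical message-passing layer with mean neighbor aggregation.

    features:  (N, d) node feature matrix
    adjacency: dict mapping node index -> list of neighbor indices
    W_self, W_neigh: (d, k) weight matrices learned during training
    """
    N, d = features.shape
    out = np.zeros((N, W_self.shape[1]))
    for v in range(N):
        neigh = adjacency.get(v, [])
        # Aggregate neighbor features (here: a simple mean).
        agg = features[neigh].mean(axis=0) if neigh else np.zeros(d)
        # Transform the node's own feature together with the aggregated one.
        out[v] = np.tanh(features[v] @ W_self + agg @ W_neigh)
    return out  # (N, k) embedding vectors, usable as input to a prediction layer
```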

[0035] Due to the highly non-convex nature of the ...



Abstract

An embodiment of the invention provides a training method for a graph neural network. The method comprises the following steps: first, obtaining a relational network graph, wherein the relational network graph comprises a plurality of object nodes corresponding to a plurality of business objects; next, for each object node, fusing the node features of that object node with the node features of its neighbor nodes to obtain a fusion feature of the object node, the fusion features corresponding to the plurality of object nodes forming a fusion feature matrix; using the graph neural network, which comprises an activation function, to perform graph embedding on the relational network graph to obtain a plurality of embedding vectors corresponding to the plurality of object nodes, and determining a plurality of prediction results based on the embedding vectors; determining a product matrix of the fusion feature matrix before and after it is processed by the activation function; and determining a training gradient for the parameters of the graph neural network based on the product matrix, the prediction results and the business labels, and then updating the parameters of the graph neural network based on the training gradient.
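For orientation, the following sketch (not taken from the patent) walks through a conventional single-layer version of the pipeline in the abstract: neighbor fusion, embedding through an activation function, prediction, and a gradient update. The sigmoid activation, linear prediction head and squared-error loss are assumptions, and the patent's specific construction of the gradient from the product matrix of the fusion feature matrix before and after activation is not reproduced here; the sketch only shows where the pre-activation and post-activation quantities appear in a standard chain-rule gradient.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(X, A_norm, W, w_out, labels, lr=0.01):
    """One assumed training step of a single-layer graph neural network.

    X:      (N, d) node feature matrix
    A_norm: (N, N) normalized adjacency (with self-loops) used for fusion
    W:      (d, k) graph neural network parameters
    w_out:  (k,)   parameters of a toy prediction head
    labels: (N,)   business labels of the object nodes
    """
    # Fuse each node's features with those of its neighbors.
    F = A_norm @ X                          # fusion feature matrix, (N, d)

    # Graph embedding through the activation function, then predictions.
    H = sigmoid(F @ W)                      # embedding vectors, (N, k)
    preds = H @ w_out                       # one prediction per node, (N,)

    # Standard chain-rule gradient of a squared-error loss; note that both the
    # pre-activation quantity F and the post-activation quantity H appear here.
    delta = (preds - labels) / len(labels)  # (N,)
    grad_w_out = H.T @ delta                # (k,)
    grad_H = np.outer(delta, w_out)         # (N, k)
    grad_pre = grad_H * H * (1.0 - H)       # sigmoid derivative, (N, k)
    grad_W = F.T @ grad_pre                 # (d, k)

    # Parameter update.
    return W - lr * grad_W, w_out - lr * grad_w_out
```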

Description

Technical field

[0001] The embodiments of this specification relate to the field of computer technology, and in particular to a training method and device for a graph neural network.

Background technique

[0002] A relational network graph is a description of the relationships between entities in the real world and is currently widely used in various kinds of computer information processing. Generally, a relational network graph comprises a set of nodes and a set of edges: the nodes represent real-world entities, and the edges represent the connections between those entities. For example, in a social network, people are the entities and the relationships or links between people are the edges.

[0003] In many cases, it is desirable to analyze the topological characteristics of the nodes, edges and other elements of a relational network graph and to extract useful information from them. The class of computing methods that realizes this kind of processing is called graph computing. Typically, it is desired to repres...
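As a concrete illustration of the node-and-edge structure described in paragraph [0002] (the social-network data and helper below are invented for illustration and are not part of the patent):

```python
# A toy relational network graph: nodes are people (real-world entities) and
# edges are the relationships or links between them.
social_graph = {
    "nodes": ["alice", "bob", "carol"],
    "edges": [("alice", "bob"), ("bob", "carol")],
}

def neighbors(graph, node):
    """Read a node's neighbors off the edge set (a basic graph-computing step)."""
    return [v for u, v in graph["edges"] if u == node] + \
           [u for u, v in graph["edges"] if v == node]

print(neighbors(social_graph, "bob"))  # ['carol', 'alice']
```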

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06N3/08G06N3/04
CPCG06N3/08G06N3/04
Inventor 李群伟
Owner ALIPAY (HANGZHOU) INFORMATION TECH CO LTD