
Secure decentralized graph federated learning method

A decentralized learning technology, applied in the fields of neural learning methods, computer security devices, and biological neural network models. It addresses problems such as the difficulty of guaranteeing a trusted third party, time-consuming synchronization, and inadequate protection of model parameters, with the effects of protecting data privacy and security, reducing communication time, and alleviating the communication bottleneck.

Active Publication Date: 2021-12-10
蓝象智联(杭州)科技有限公司

AI Technical Summary

Problems solved by technology

The above solution does not protect the local model parameters sent to the central server during the global model aggregation stage, leaving them open to possible information leakage. Secondly, the central server responsible for aggregating model information must be a trusted neutral third party, which is difficult to guarantee when modeling across institutions. Finally, this centralized architecture places high demands on the I/O capability of the central server: every client must wait until all clients have successfully uploaded their model parameters to the central server, and then wait for the central server to distribute the updated global model parameters, before it can proceed to the next training round, which is undoubtedly very time-consuming.

Method used



Examples


Embodiment

[0041] Embodiment: A secure decentralized graph federated learning method of this embodiment, as shown in Figure 1, includes the following steps:

[0042] S1: Number all n clients participating in graph federated learning as 1, 2, 3, ..., n in sequence. One of the clients serves as the training initiator: it initializes the parameters of the graph neural network model and the ring communication topology, and sends them to the other clients;

[0043] The ring communication topology is represented by a matrix A, with entries A_ij for 1 ≤ i ≤ n, 1 ≤ j ≤ n;

[0046] when i = j, A_ij ≠ 0.

[0047] Here A_ij denotes the weight coefficient between the client numbered i and the client numbered j: A_ij ≠ 0 means that client i can communicate with client j, and A_ij = 0 means that it cannot. The matrix A is symmetric, and A_ii denotes the weight coefficient of the client numbered i. ...
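Based on the constraints stated above (symmetry, nonzero diagonal, nonzero entries only between communicating ring neighbors), the topology matrix can be sketched as below. The uniform 1/3 weights and the NumPy representation are illustrative assumptions; the patent text specifies only which entries must be nonzero, not their values.

```python
import numpy as np

def ring_topology_matrix(n: int) -> np.ndarray:
    """Build a symmetric ring communication matrix A for n clients.

    A[i][j] != 0 means the client numbered i+1 can communicate with the
    client numbered j+1; the diagonal A[i][i] is nonzero (each client
    keeps a weight for itself). Uniform 1/3 weights are an illustrative
    choice, not taken from the patent.
    """
    if n < 3:
        raise ValueError("a ring with distinct neighbors needs n >= 3")
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 1.0 / 3.0            # self weight (A_ii != 0)
        A[i, (i + 1) % n] = 1.0 / 3.0  # next neighbor on the ring
        A[i, (i - 1) % n] = 1.0 / 3.0  # previous neighbor on the ring
    return A

A = ring_topology_matrix(5)
assert np.allclose(A, A.T)              # A is symmetric, as required
assert np.all(np.diag(A) != 0)          # A_ii != 0
assert np.allclose(A.sum(axis=1), 1.0)  # rows sum to 1 (a common gossip choice)
```

Making each row sum to 1 is a standard convention in decentralized (gossip) averaging, so that repeated neighbor averaging converges to a consensus model; the source does not state this constraint explicitly.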



Abstract

The invention discloses a secure decentralized graph federated learning method. The method comprises the following steps: S1, sequentially numbering all clients, initializing the graph neural network model parameters, and sending a ring communication topology to all clients; S2, having each client determine its first-level and second-level neighbor clients according to the ring communication topology, and perform key negotiation with each corresponding second-level neighbor client to generate a shared key; S3, having each client train its local graph neural network model and update the local model parameters; S4, having each client receive the graph neural network model parameters sent by its first-level neighbor clients to update its local model; and S5, repeating steps S3 to S4 until the graph neural network model converges. The invention protects the data privacy and security of each client, alleviates the communication bottleneck, and shortens the communication time.
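The abstract does not spell out how the shared keys from step S2 protect the parameters exchanged in step S4, but a standard construction consistent with it is pairwise masking: two second-level neighbors derive opposite-sign masks from their negotiated key, so the first-level neighbor sitting between them on the ring sees only the masked sum. The sketch below illustrates that cancellation for one three-client slice of the ring; the mask derivation, the sign convention, and all names (`mask_from_key`, `k_ab`, etc.) are hypothetical.

```python
import hashlib
import numpy as np

def mask_from_key(shared_key: bytes, dim: int) -> np.ndarray:
    # Derive a deterministic pseudo-random mask from a shared key --
    # a stand-in for the PRG both parties would seed after the key
    # negotiation of step S2.
    seed = int.from_bytes(hashlib.sha256(shared_key).digest()[:8], "big")
    return np.random.default_rng(seed).standard_normal(dim)

# Hypothetical slice of the ring: clients a and b are second-level
# neighbors of each other through client c, which sits between them.
dim = 4
theta_a = np.full(dim, 2.0)   # client a's local model parameters (toy values)
theta_b = np.full(dim, 4.0)   # client b's local model parameters (toy values)
k_ab = b"key negotiated between a and b in step S2"

# Illustrative sign convention: the lower-numbered client adds the
# mask and the higher-numbered one subtracts it.
sent_a = theta_a + mask_from_key(k_ab, dim)
sent_b = theta_b - mask_from_key(k_ab, dim)

# Client c sums what its two first-level neighbors sent (step S4):
# the pairwise masks cancel, so c learns only the sum, never the
# individual parameters theta_a or theta_b.
aggregate = sent_a + sent_b
assert np.allclose(aggregate, theta_a + theta_b)
```

Each individual message (`sent_a`, `sent_b`) looks like noise to client c, which is what keeps the per-client parameters private even though no trusted central server is involved.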

Description

Technical field

[0001] The invention relates to the technical field of graph federated learning, in particular to a secure decentralized graph federated learning method.

Background technique

[0002] In the past few years, the rise and application of neural networks have successfully promoted research in pattern recognition and data mining. Traditional deep learning methods have achieved great success in extracting features from Euclidean-space data, but much of the data in real scenarios is generated in non-Euclidean spaces, where the performance of deep learning methods is unsatisfactory. As shown in the figure, the number of neighbor nodes of each node in the network is not fixed, so some important operations (such as convolution) that are easy to compute on images are no longer suitable for direct use on graphs. Moreover, deep learning is based on the assumption that the training data are independent and identically distributed, ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F21/60, G06F21/62, G06N3/04, G06N3/08
CPC: G06F21/602, G06F21/6245, G06N3/04, G06N3/08
Inventor: 裴阳, 刘洋, 毛仁歆, 徐时峰, 朱振超
Owner: 蓝象智联(杭州)科技有限公司