
Model training method based on federated learning

A model training technology based on federated learning, applied in the field of information technology, addressing problems such as the server's inability to obtain the gradient uploaded by a single node.

Active Publication Date: 2020-04-21
ALIPAY (HANGZHOU) INFORMATION TECH CO LTD
Cites: 7 · Cited by: 33

AI Technical Summary

Problems solved by technology

It is worth emphasizing that the server is restricted by the SA protocol and cannot obtain the gradient uploaded by a single node



Specific embodiments

[0055] 1. The j-th target-type node performs gradient calculation based on the model parameter set and its local training samples to obtain the gradient w_ij*; then, through a differential privacy protection operation, it adds a data interference term k_ij to w_ij* to obtain w_ij (see the sketch after these three options).

[0056] 2. The j-th target-type node adds interference to the model parameter set through a differential privacy protection operation, and performs gradient calculation based on the perturbed model parameter set and its local training samples to obtain w_ij.

[0057] 3. The j-th target-type node adds interference to its local training samples through a differential privacy protection operation, and performs gradient calculation based on the perturbed training samples and the model parameter set to obtain w_ij.
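Option 1 can be illustrated with a minimal sketch. The choice of Laplace noise, the sensitivity and epsilon parameters, and the function name below are assumptions for illustration only; the text above says a data interference term k_ij is added but does not fix the noise distribution.

```python
import numpy as np

def perturb_gradient(grad, sensitivity=1.0, epsilon=1.0, rng=None):
    """Toy version of option 1: add an interference term k_ij to the
    locally computed gradient w_ij* so that the uploaded gradient w_ij
    is noise-added. Laplace noise is an illustrative assumption."""
    rng = rng or np.random.default_rng()
    k_ij = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=grad.shape)
    return grad + k_ij

# Example: the j-th node perturbs its locally computed gradient.
w_ij_star = np.array([0.12, -0.05, 0.33])  # gradient from local samples
w_ij = perturb_gradient(w_ij_star)         # noisy gradient to be uploaded
```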

[0058] S104: The server acquires the sum of the gradients uploaded by the target-type nodes.

[0059] In the embodiments of this specification, the server can obtain the sum of the gradients w_ij uploaded by the target-type nodes. Moreover, due to the restriction of the SA protocol, the server cannot obtain the gradient w_ij uploaded by any single node.

Specific embodiments (continued)

[0066] 2.1. The server collects, from the Q_i target-type nodes, the sub-private key sets of at least T_i target-type nodes, assembles the collected sub-private key sets into a private key, and uses it to decrypt the aggregated result, obtaining the sum of the gradients (see the sketch after option 2.2).

[0067] 2.2. The server issues the aggregated ciphertext to at least T_i target-type nodes; each of these at least T_i target-type nodes uses its own sub-private key set to decrypt it, obtains a decryption result, and uploads the result to the server; the server then combines the decryption results uploaded by the at least T_i target-type nodes to obtain the sum of the gradients.
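Paragraph [0066] does not spell out how the sub-private key sets combine into a private key. The toy sketch below assumes a Shamir-style threshold sharing over a prime field purely for illustration (the concrete scheme is not named here); it only shows why collecting at least T_i sub-key sets is enough to assemble the key in option 2.1.

```python
import random

P = 2**61 - 1  # prime modulus for the toy field (an illustrative assumption)

def share_key(key, n_nodes, threshold, seed=0):
    """Split a decryption key into n_nodes sub-private keys so that any
    `threshold` of them can reconstruct it (Shamir-style sharing; the
    concrete scheme is an assumption, not taken from the source text)."""
    rng = random.Random(seed)
    coeffs = [key] + [rng.randrange(P) for _ in range(threshold - 1)]

    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

    return {j: poly(j) for j in range(1, n_nodes + 1)}

def assemble_key(shares):
    """Option 2.1: assemble at least T_i collected sub-private keys into
    the private key via Lagrange interpolation at x = 0."""
    key = 0
    for j in shares:
        num, den = 1, 1
        for m in shares:
            if m != j:
                num = num * (-m) % P
                den = den * (j - m) % P
        key = (key + shares[j] * num * pow(den, P - 2, P)) % P
    return key

# Toy usage: Q_i = 5 target-type nodes, threshold T_i = 3.
sub_keys = share_key(key=123456789, n_nodes=5, threshold=3)
collected = {j: sub_keys[j] for j in (1, 3, 5)}  # sub-keys from any 3 nodes
assert assemble_key(collected) == 123456789
# The assembled key would then decrypt the aggregated gradient sum.
```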

[0068] S106: The server updates the model parameter set based on the sum of the gradients.

[0069] Assuming that, in the embodiments of this specification, the learning rate specified for the gradient descent method is α, the total number of samples used in the i-th iteration is d, and the model parameter set is denoted θ, then θ can be updated with the following formula: θ ← θ − (α / d) · Σ_j w_ij.
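Given these definitions, paragraph [0069] is the standard gradient descent step. A minimal sketch, assuming the server already holds the aggregated noisy gradient from S104 (variable and function names are illustrative):

```python
import numpy as np

def update_parameters(theta, gradient_sum, alpha, d):
    """S106 as a sketch: theta <- theta - (alpha / d) * sum_j w_ij, where
    gradient_sum is the aggregated noisy gradient obtained in S104,
    alpha is the learning rate and d the total number of samples used
    in this iteration."""
    return theta - (alpha / d) * gradient_sum

theta = np.array([0.5, -1.2, 0.7])         # current model parameter set
gradient_sum = np.array([3.0, -1.5, 0.9])  # sum of w_ij over the nodes
theta = update_parameters(theta, gradient_sum, alpha=0.01, d=300)
```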



Abstract

The invention discloses a model training method based on federated learning. In one training iteration, the nodes can add noise to and obfuscate their gradients through a differential privacy protection operation, and the server obtains the sum of the noise-added, obfuscated gradients to update the model parameters.

Description

Technical field

[0001] The embodiments of this specification relate to the field of information technology, and in particular to a model training method based on federated learning.

Background technique

[0002] Federated learning (Federated machine learning / Federated Learning) refers to a machine learning framework that can effectively help multiple nodes (which may represent individuals or institutions) jointly train a model while meeting the requirements of data privacy protection.

[0003] Under the framework of federated learning, the server sends model parameters to multiple nodes, and each node inputs its local training samples into the model for a round of training. After training, each node calculates a gradient based on the training results. Subsequently, the server can calculate the sum of the gradients of the nodes based on the Secure Aggregation (SA) protocol. It is worth emphasizing that the server is restricted by the SA protocol and cannot obtain the gradient uploaded by any single node.
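The SA restriction described in paragraph [0003] can be made concrete with a toy sketch. The pairwise-mask construction below is a simplified stand-in for the actual SA protocol (it omits key agreement, dropout handling and finite-field arithmetic); the function name and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def masked_gradients(grads, seed=0):
    """Toy Secure Aggregation: each pair of nodes (u, v) agrees on a random
    mask m_uv; node u adds it and node v subtracts it, so all masks cancel
    when the server sums the uploads. Real SA also handles dropouts and
    derives the masks from key agreement; those parts are omitted here."""
    rng = np.random.default_rng(seed)
    n = len(grads)
    uploads = [g.astype(float) for g in grads]
    for u in range(n):
        for v in range(u + 1, n):
            m_uv = rng.normal(size=grads[0].shape)
            uploads[u] += m_uv
            uploads[v] -= m_uv
    return uploads

grads = [np.array([1.0, 2.0]), np.array([0.5, -1.0]), np.array([2.0, 0.0])]
uploads = masked_gradients(grads)
# The server only sees `uploads`; each looks random on its own, but their
# sum equals the true gradient sum.
assert np.allclose(sum(uploads), sum(grads))
```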


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F21/62; G06N20/00
CPC: G06F21/6245; G06N20/00
Inventor: 王力, 陈超超, 周俊
Owner: ALIPAY (HANGZHOU) INFORMATION TECH CO LTD