Federated learning method and device

A federated learning method and device, applied in the field of machine learning, solving problems such as malicious participants and poisoning attacks, and achieving the effect of preventing their adverse impact on the global model

Active Publication Date: 2022-02-25
BEIJING UNIV OF POSTS & TELECOMM


Problems solved by technology

[0004] In view of this, embodiments of the present invention provide a federated learning method and device to eliminate or mitigate one or more defects in the prior art, and to solve the problem of defending against malicious participants and poisoning attacks in the federated learning process.



Examples


Embodiment 1

[0073] Before federated learning starts, the server initializes the global model parameters, the aggregation weights, the loss function, and the numbers of training and verification rounds, then sends the initialized model parameters to each client for a given number of rounds of federated training and verification.

[0074] During training rounds, as shown in figure 2, the execution steps are as follows:

[0075] Step 1: The server selects multiple clients participating in the training in this round, marks them as training clients, and sends the federated global model to the training clients.

[0076] Step 2: Each training client trains the model with its local data, updates the model parameters and gradients, and sends the trained local model parameters (or gradients, i.e., the update parameters) to the server.

[0077] Step 3: The server aggregates the local model parameters (or gradients) uploaded by each training client to obtain the updated federated global model for this round.

[0...
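The training-round flow of steps 1–3 can be sketched as a minimal federated averaging (FedAvg-style) loop. This is an illustration only, not the patent's concrete implementation; the one-parameter linear model, learning rate, and client datasets are all hypothetical.

```python
def local_train(weights, data, lr=0.1, epochs=1):
    """Client side: one pass of SGD on local (x, y) pairs for a linear model y ~ w*x."""
    w = weights
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x   # d/dw of the squared error (w*x - y)^2
            w -= lr * grad
    return w

def fedavg(updates, sizes):
    """Server side: aggregate local models weighted by local dataset size."""
    total = sum(sizes)
    return sum(w * n for w, n in zip(updates, sizes)) / total

# Step 1: the server selects training clients and dispatches the global model.
global_w = 0.0
clients = [[(1.0, 2.0), (2.0, 4.0)],          # each client holds local (x, y) samples
           [(1.0, 2.1), (3.0, 6.3)],
           [(2.0, 3.8)]]

for _round in range(20):
    # Step 2: each selected client trains locally and uploads its update.
    updates = [local_train(global_w, data) for data in clients]
    # Step 3: the server aggregates the updates into the new global model.
    global_w = fedavg(updates, [len(d) for d in clients])

print(round(global_w, 2))  # → 2.04 (near the shared slope ~2 of the clients' data)
```

The size-weighted average in `fedavg` mirrors the usual choice of aggregation weights proportional to each client's local data volume.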

Embodiment 2

[0084] The application scenario of this embodiment may be as follows: in a federated learning setting there are several participants' clients and one server, and the server maintains a global model by aggregating the model parameters (or gradients) provided by the participants. In actual scenarios it cannot be guaranteed that the models uploaded by all participants are reliable, so a node credibility verification method based on client-side interactive verification is proposed.

[0085] Specifically, the method can be used when there is one server and K clients, where the k-th client locally stores data samples whose i-th element is the i-th training sample of the k-th client. The server initializes the model parameters and defines the loss function. Federated learning is performed for R communication rounds until the model converges, among which there are T training rounds and E verification rounds, and the distribution initi...
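The interactive credibility verification described in this embodiment can be sketched as follows. All names, thresholds, and the simple loss-evaluation step are illustrative assumptions, not the patent's exact formulation: each auxiliary client measures the suspect client's update against its own local data, and the suspect is flagged as abnormal when the proportion of auxiliary clients whose loss deviation exceeds a threshold is larger than a set ratio.

```python
def loss(w, data):
    """Mean squared error of a linear model y ~ w*x on a client's local data."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def verify_client(suspect_update, aux_clients, prev_losses,
                  dev_threshold=0.5, flag_ratio=0.5):
    """Server-coordinated verification round (illustrative).

    suspect_update: model parameters last uploaded by the client under test.
    aux_clients:    local datasets of the auxiliary clients.
    prev_losses:    each auxiliary client's loss at the end of the previous round.
    Returns (is_abnormal, deviations)."""
    deviations = []
    for data, prev in zip(aux_clients, prev_losses):
        # Each auxiliary client evaluates the suspect's model on its own data and
        # reports how far the resulting loss deviates from its previous-round loss.
        deviations.append(abs(loss(suspect_update, data) - prev))
    flagged = sum(d > dev_threshold for d in deviations)
    return flagged / len(deviations) > flag_ratio, deviations

# Two auxiliary clients whose data follows y = 2x; previous-round losses were small.
aux = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
prev = [0.01, 0.02]

honest_flagged, _ = verify_client(2.0, aux, prev)    # honest update: w = 2 fits the data
poisoned_flagged, _ = verify_client(-5.0, aux, prev) # poisoned update: loss blows up
print(honest_flagged, poisoned_flagged)              # → False True
```

In the abstract's terms, the recorded deviations can also feed back into the aggregation weights so that a flagged client's influence on the global model is reduced.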

Embodiment 3

[0121] In the privacy-enhancement framework based on homomorphic encryption, the participants' clients and the server jointly establish a public key pk and a private key sk under a homomorphic encryption scheme. The private key sk is kept secret from the cloud server but is known to all learning participants. Each pair of parties establishes a separate TLS/SSL secure channel to communicate and to protect the integrity of the homomorphic ciphertexts; below, ciphertexts refer to data encrypted with the public key pk.

[0122] In the training round, the specific execution steps are as follows:

[0123] Step 1: The server dispatches the model. In the current round r, the server sends this round's global model to the training clients selected for this round of training.

[0124] Step 2: The client trains, updates, encrypts, and uploads the local model. Each training client decrypts the received ciphertext with the private key sk to obtain the global model, and then uses its local data tra...
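The patent does not specify which homomorphic scheme is used; a textbook additively homomorphic Paillier sketch (toy parameters, Python 3.9+) illustrates why the server can aggregate client updates without ever seeing a plaintext: multiplying ciphertexts adds the underlying messages.

```python
import math, random

# Textbook Paillier with toy parameters -- for illustration only.
# Real deployments use keys of 2048 bits or more.
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
L = lambda u: (u - 1) // n
mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse via 3-arg pow (Python 3.8+)

def encrypt(m):
    """Encrypt integer m < n under the public key (n, g)."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(c):
    """Decrypt with the private key (lam, mu)."""
    return L(pow(c, lam, n2)) * mu % n

# Clients encrypt their (quantized) local updates; the server multiplies the
# ciphertexts, which homomorphically sums the plaintext updates.
updates = [17, 42, 8]
aggregate_ct = 1
for m in updates:
    aggregate_ct = aggregate_ct * encrypt(m) % n2
print(decrypt(aggregate_ct))  # → 67 = 17 + 42 + 8
```

Only the participants, who hold sk, can decrypt the aggregate; the server sees nothing but ciphertexts, matching the trust model in paragraph [0121].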



Abstract

The invention provides a federated learning method and device. The method adds a verification round to the federated learning process: the server sends the model update parameters last returned by a client to be verified to auxiliary clients, which train on their own local data; the deviation between each auxiliary client's loss value at the end of training in the verification round and that of the previous round is then calculated, and if the proportion of auxiliary clients whose deviation exceeds a set threshold is greater than a set ratio, the client to be verified is marked as an abnormal client. The method can quickly and effectively identify abnormal clients without notifying any client; furthermore, the weights used in model aggregation are adjusted according to the abnormal client's deviations in each verification round, preventing the abnormal client from adversely affecting global model updates.

Description

technical field

[0001] The present invention relates to the technical field of machine learning, and in particular to a federated learning method and device.

Background technique

[0002] As artificial intelligence technologies such as machine learning and deep learning are applied to many big data scenarios, training machine learning and deep learning models requires a large amount of high-quality data in practical applications. In big data scenarios, data collection and use are limited due to various factors such as legal regulations, industry competition, and user privacy-protection awareness. These limitations leave data scattered across different enterprises and organizations, whose data cannot be directly shared or exchanged, making a centralized machine learning approach impossible. To protect privacy while improving the effect of artificial intelligence models, federated learning technology is adopted. On the premise that t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F30/27; G06N20/00; G06F111/08
CPC: G06F30/27; G06N20/00; G06F2111/08
Inventors: 郭三川, 张熙, 陈宗毅
Owner: BEIJING UNIV OF POSTS & TELECOMM