
Lightweight federated learning privacy protection method based on decentralized security aggregation

A secure aggregation and decentralization technology, applied in machine learning, digital data protection, computer security devices, etc. It addresses problems such as participants being unable to afford the cost of secret-sharing computation and frequent communication, and privacy leakage of the global model, achieving the effects of reduced computing overhead, high availability, and avoidance of data privacy leakage.

Pending Publication Date: 2021-12-17
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0013] However, it is difficult for the parties to afford the cost of secret-sharing computation and frequent communication, and the global model still faces the risk of privacy leakage.
[0014] In summary, the privacy protection of federated learning still faces many challenges.


Examples


Embodiment 1

[0062] This embodiment establishes a cooperation model based on the lightweight privacy-preserving federated learning method with decentralized security aggregation of the present invention, as shown in Figure 1.

[0063] Figure 1 depicts the following decentralized secure aggregation scenario: each user holds a local dataset and updates a local model during the FL process. Each user randomly connects to multiple edge nodes, divides its model parameters, generates carefully constructed global random numbers, and divides the global random numbers in the same way as the parameters. The user then sends the divided parameters and global random numbers to the connected nodes. The edge nodes provide users with secure decentralized partial model aggregation: they receive the divided partial models, perform partial aggregation, and upload the partial aggregation results to the blockchain ledger for global aggregation, producing a global model covered by the global random numbers. The blockchain ledger serves as a...
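For clarity, the following is a minimal sketch of the aggregation flow described above, using additive secret sharing over plain NumPy vectors. It is an illustration only, not the patent's exact construction: here every user's mask is derived from a seed assumed to be shared among users (a hypothetical simplification), so any user can recompute the total mask and strip the perturbation from the aggregate; names such as `split_additively` and `SHARED_MASK_SEED` are invented for the example.

```python
# Simplified sketch of decentralized secure aggregation (Figure 1), NOT the
# patent's exact construction. Assumption: all users share SHARED_MASK_SEED,
# so each can recompute the total mask and remove the perturbation.
import numpy as np

rng = np.random.default_rng(0)
NUM_USERS, NUM_EDGES, DIM = 4, 3, 5
SHARED_MASK_SEED = 42  # assumption: agreed on out of band by the users

def split_additively(vec, parts, rng):
    """Split `vec` into `parts` random shares that sum back to `vec`."""
    shares = [rng.normal(size=vec.shape) for _ in range(parts - 1)]
    shares.append(vec - sum(shares))
    return shares

# 1. Each user trains a local model (random stand-in here) and builds a mask.
local_models = [rng.normal(size=DIM) for _ in range(NUM_USERS)]
mask_rng = np.random.default_rng(SHARED_MASK_SEED)
masks = [mask_rng.normal(size=DIM) for _ in range(NUM_USERS)]

# 2. Each user splits (local model + its mask) across its connected edge nodes.
edge_inboxes = [[] for _ in range(NUM_EDGES)]
for w_i, r_i in zip(local_models, masks):
    for node, share in enumerate(split_additively(w_i + r_i, NUM_EDGES, rng)):
        edge_inboxes[node].append(share)

# 3. Each edge node performs partial aggregation over the shares it received;
#    no single node sees any user's full local model.
partial_aggregates = [sum(inbox) for inbox in edge_inboxes]

# 4. The blockchain ledger sums the partial aggregates: the result is the
#    summed global model still covered by the total random mask.
masked_global_sum = sum(partial_aggregates)

# 5. A user replays the shared seed to recompute the total mask and removes
#    the perturbation, recovering the plain (averaged) global model.
mask_rng_replay = np.random.default_rng(SHARED_MASK_SEED)
total_mask = sum(mask_rng_replay.normal(size=DIM) for _ in range(NUM_USERS))
global_model = (masked_global_sum - total_mask) / NUM_USERS

assert np.allclose(global_model, np.mean(local_models, axis=0))
print(global_model)
```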

Embodiment 2

[0084] This embodiment compares the results of the method of the present invention in various scenarios and verifies that the privacy protection method of the present invention achieves high training accuracy and efficiency. The method is compared with existing methods that also aim to preserve data privacy during federated learning: plain Federated Learning has no privacy protection; HEDL uses homomorphic encryption (HE) to protect the privacy of local models in distributed deep learning; DPFed protects the privacy of the common global model, which remains unknown to users, by adding DP noise to the global model; PSA and VerifyNet preserve the privacy of local models by overlaying random perturbations. These existing methods are compared with the present method in terms of accuracy and time cost of the trained models, as shown in Table 4 and Table 5.

[0085] Table 4 Comparison of training accuracy of different methods under different user scales

[0086]

[0087] Table 5 Comparison...

Embodiment 3

[0093] In this embodiment, the results of the method of the present invention are compared in various scenarios, verifying that the privacy protection method of the present invention can resist membership inference attacks. A membership inference attack is mounted against five different methods: federated learning, HEDL, DPFed, VerifyNet, and PSA, using the CIFAR-10 dataset (https://www.cs.toronto.edu/~kriz/cifar.html); the attack comparison results are shown in Table 6.

[0094] Table 6 Comparison results of different methods against membership inference attacks

[0095]

[0096] From Table 6, we can see that traditional FL, VerifyNet and PSA cannot defend against membership inference attacks, because the central server can still expose the global model. In HEDL, the server can only obtain the encrypted local models and global model, so the attack precision is very low. In this method, the attack accura...
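As context for what such an attack looks like, here is a rough sketch of the generic confidence-thresholding membership inference test (predict "member" when the model is highly confident on a sample's true label). This is a common, simplified formulation used only for illustration; it is not the specific attack implementation evaluated in the patent, and the threshold value and model interface are assumptions.

```python
# Generic confidence-thresholding membership inference test (illustrative only).
import numpy as np

def membership_guess(model_confidences, true_labels, threshold=0.9):
    """Guess membership from the model's confidence on each sample's true label."""
    conf_on_true = model_confidences[np.arange(len(true_labels)), true_labels]
    return conf_on_true > threshold  # True = predicted training member

# Toy usage: softmax outputs for 4 samples over 3 classes.
probs = np.array([[0.97, 0.02, 0.01],
                  [0.40, 0.35, 0.25],
                  [0.05, 0.93, 0.02],
                  [0.33, 0.33, 0.34]])
labels = np.array([0, 0, 1, 2])
print(membership_guess(probs, labels))  # [ True False  True False]
```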


Abstract

The invention relates to a lightweight federated learning privacy protection method based on decentralized security aggregation, and belongs to the technical field of data privacy protection. A secure decentralized aggregation platform is constructed on the user side by utilizing edge nodes and a consortium blockchain, and the aggregation process is carried out collaboratively on this platform. Each user divides its local model and sends the parts to the connected edge nodes. Each user also generates a global random number, divides it, and shares the parts with the connected edge nodes. All edge nodes then perform secure decentralized aggregation; each user receives a global model perturbed by its self-defined global random number, the edge nodes participating in aggregation cannot learn the global model, and each user can remove the added perturbation to recover the original global model. The method achieves privacy protection without encryption operations, and is superior to the prior art in computational efficiency, model accuracy, and privacy protection against membership inference attacks.

Description

Technical field
[0001] The invention relates to a lightweight federated learning privacy protection method based on decentralized security aggregation, which aims to realize lightweight training on the user side by means of decentralized security aggregation and to reduce the threat of privacy leakage posed by a traditional central aggregator. It belongs to the technical field of data privacy protection.
Background technique
[0002] In recent years, Federated Learning (FL) has been widely used as a new distributed learning framework.
[0003] Federated learning allows multiple participants to use local data to jointly train a unified machine learning model under the premise of privacy protection. In each round of training, participants obtain local models individually based on their own datasets; these are then aggregated by a central aggregator, which builds a global model and sends it to the participants for the next round of training. Although the user's local training data is not disclos...
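To make the background concrete, below is a minimal sketch of one round of conventional federated learning with a central aggregator, as described in paragraph [0003]. It is plain FedAvg with NumPy vectors standing in for model parameters; the `local_update` function is a hypothetical stand-in for local training. This centralized aggregation step is what the invention replaces with decentralized secure aggregation.

```python
# Minimal sketch of conventional centralized federated learning (FedAvg-style).
import numpy as np

def local_update(global_model, local_data, lr=0.1):
    """Hypothetical local training: one gradient step toward the local data mean."""
    gradient = global_model - local_data.mean(axis=0)
    return global_model - lr * gradient

rng = np.random.default_rng(1)
global_model = np.zeros(4)
# Toy local datasets for three participants (stand-ins for private data).
local_datasets = [rng.normal(loc=i, size=(10, 4)) for i in range(3)]

for _round in range(5):
    # Each participant trains locally on its own data.
    local_models = [local_update(global_model, data) for data in local_datasets]
    # The central aggregator averages the local models into a new global model
    # and sends it back for the next round (the step the invention decentralizes).
    global_model = np.mean(local_models, axis=0)

print(global_model)
```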


Application Information

IPC(8): G06F21/60, G06F21/62, G06N20/00
CPC: G06F21/602, G06F21/6245, G06N20/00, H04L9/50, H04L9/008, H04L9/085, H04L2209/42, H04L63/04, H04L63/1466, H04L63/1441, H04L63/10
Inventor: 沈蒙, 顾艾婧, 张杰, 王婧
Owner: BEIJING INSTITUTE OF TECHNOLOGY