Federated learning privacy protection method and system based on homomorphic pseudo-random numbers

A pseudo-random-number-based privacy protection technology, applied in the field of privacy protection. It addresses problems such as high computational complexity, inapplicability to large-scale federated learning, and leakage of data information, thereby reducing communication costs, ensuring security, and protecting data privacy.

Active Publication Date: 2020-12-29
SHANDONG UNIV

Problems solved by technology

[0005] Although federated learning does not need to transmit the original data, only the model's updated gradient values, these gradient values are derived from the original data, so they may still leak information about it.



Example Embodiment

[0051] Embodiment 1: this embodiment provides a federated learning privacy protection method based on homomorphic pseudo-random numbers.

[0052] A privacy protection method for federated learning based on homomorphic pseudo-random numbers, including:

[0053] S101: n clients use verifiable secret sharing (VSS) to generate a key s; the key s is divided into n shares, and each client obtains its own secret share si; at least t clients participate in recovering the key s, which is sent to the server; n and t are both positive integers, and si denotes the secret share of the i-th client;
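As an illustration of step S101, here is a minimal t-of-n Shamir secret-sharing sketch. The patent specifies *verifiable* secret sharing (VSS), which additionally publishes commitments so shares can be checked; this sketch shows only the share/recover core, and the field modulus `P` and function names are assumptions for illustration:

```python
import random

# Toy t-of-n Shamir secret sharing over a prime field (illustrative only;
# VSS as in the patent would add public commitments for share verification).
P = 2**127 - 1  # Mersenne prime used as the field modulus (assumption)

def share_secret(s, n, t):
    """Split secret s into n shares; any t of them recover s."""
    coeffs = [s % P] + [random.randrange(P) for _ in range(t - 1)]
    # Share for client i is the degree-(t-1) polynomial evaluated at x = i.
    return [(i, sum(c * pow(i, j, P) for j, c in enumerate(coeffs)) % P)
            for i in range(1, n + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 over any t shares."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total
```

For example, with `shares = share_secret(1234, 5, 3)`, any 3 of the 5 shares suffice to recover 1234, while 2 shares reveal nothing about it.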

[0054] S102: Each client performs federated learning, locally training a machine learning model on its own data to generate updated gradient values;

[0055] S103: Each client uses its secret share si as a seed and a key-homomorphic pseudo-random function to generate a pseudo-random number F(si, x); it then uses F(si, x) to encrypt the updated gradient value, obtaining an updated gradient value ciphertext, which it sends to the server.
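The key-homomorphic property that makes step S103 work is F(k1 + k2, x) = F(k1, x) + F(k2, x): the per-client masks cancel out of the aggregate. The toy sketch below uses a linear map k·H(x) mod P, which satisfies the homomorphism but is *not* a secure PRF (real constructions use e.g. lattice-based key-homomorphic PRFs), and simplifies the threshold sharing to an additive n-of-n sharing; all parameters and names are illustrative assumptions:

```python
import hashlib
import random

P = 2**127 - 1  # field modulus (assumption)

def H(x):
    """Hash the public round tag x into the field."""
    return int.from_bytes(hashlib.sha256(str(x).encode()).digest(), "big") % P

def F(k, x):
    """Toy key-homomorphic PRF: F(k1+k2, x) = F(k1, x) + F(k2, x) (mod P).
    A linear map like this is NOT a secure PRF; illustration only."""
    return (k * H(x)) % P

# --- one illustrative aggregation round ---
n = 3
shares = [random.randrange(P) for _ in range(n)]  # additive shares of s
s = sum(shares) % P                    # key the server ends up holding
gradients = [11, 22, 33]               # toy integer-encoded gradient values
x = "round-1"                          # public round identifier

# Each client masks its gradient with its own pseudo-random number F(si, x).
ciphertexts = [(g + F(si, x)) % P for g, si in zip(gradients, shares)]

# The server sums the ciphertexts and strips the combined mask F(s, x):
# sum of F(si, x) equals F(sum of si, x) by key homomorphism.
masked_sum = sum(ciphertexts) % P
aggregate = (masked_sum - F(s, x)) % P
```

The server learns only the aggregate (here 11 + 22 + 33 = 66), never any individual client's gradient value.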

Example Embodiment

[0111] Embodiment 2: this embodiment provides a federated learning privacy protection system based on homomorphic pseudo-random numbers.

[0112] A federated learning privacy protection system based on homomorphic pseudo-random numbers, including: a server and several clients;

[0113] n clients use verifiable secret sharing (VSS) to generate a key s; the key s is divided into n shares, and each client obtains its own secret share si; at least t clients participate in recovering the key s, which is sent to the server;
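The "verifiable" part of VSS can be sketched with a toy Feldman-style scheme: the dealer publishes commitments to the polynomial coefficients, and each client checks its private share against them. The tiny group parameters and function names below are illustrative assumptions, not the patent's concrete instantiation:

```python
import random

# Toy Feldman-style VSS: Shamir sharing plus public commitments so each
# client can verify its share. Parameters are tiny and illustrative only.
q = 1439        # prime order of the subgroup (shares live in Z_q)
p = 2 * q + 1   # safe prime: p = 2879
g = 4           # generator of the order-q subgroup of Z_p*

def deal(secret, n, t):
    """Deal n shares of a secret with threshold t, plus public commitments."""
    coeffs = [secret % q] + [random.randrange(q) for _ in range(t - 1)]
    commitments = [pow(g, c, p) for c in coeffs]        # published
    shares = [(i, sum(c * i**j for j, c in enumerate(coeffs)) % q)
              for i in range(1, n + 1)]                 # sent privately
    return shares, commitments

def verify(i, share, commitments):
    """Client i checks g^share == prod_j C_j^(i^j) (mod p)."""
    lhs = pow(g, share, p)
    rhs = 1
    for j, C in enumerate(commitments):
        rhs = rhs * pow(C, i**j, p) % p
    return lhs == rhs
```

A valid share passes the check, while a tampered or misdealt share fails, which is what lets the clients run the sharing phase without trusting the dealer blindly.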

[0114] Each client performs federated learning, locally training a machine learning model on its own data to generate updated gradient values;

[0115] Each client takes the secret share si as a seed and uses the key-homomorphic pseudo-random function to generate a random number F(si, x); it then uses F(si, x) to encrypt the updated gradient value to obtain the updated gradient value ciphertext, and sends the updated gradient value ciphertext to the server;

Example Embodiment

[0117] Embodiment 3: this embodiment also provides a client.

[0118] A client configured to:

[0119] n clients use verifiable secret sharing (VSS) to generate a key s; the key s is divided into n shares, and each client obtains its own secret share si; at least t clients participate in recovering the key s, which is sent to the server;

[0120] Each client performs federated learning, locally training a machine learning model on its own data to generate updated gradient values;

[0121] Each client takes the secret share si as a seed and uses the key-homomorphic pseudo-random function to generate a random number F(si, x); it then uses F(si, x) to encrypt the updated gradient value to obtain the updated gradient value ciphertext, and sends the updated gradient value ciphertext to the server;

[0122] The client receives the updated model fed back from the server.



Abstract

The invention discloses a federated learning privacy protection method and system based on homomorphic pseudo-random numbers. The method comprises the following steps: n clients generate a secret key s by employing verifiable secret sharing (VSS); the secret key s is divided into n parts, and each client obtains a secret share si; at least t clients participate in recovering the key s and send the key s to a server, wherein n and t are both positive integers and si represents the secret share of the i-th client; each client performs federated learning, locally using its own data for machine learning model training to generate an updated gradient value; each client generates a random number F(si, x) by taking the secret share si as a seed and using a key-homomorphic pseudo-random function; the updated gradient value is encrypted using the random number F(si, x) to obtain an updated gradient value ciphertext, which is then sent to the server; and the clients receive the updated model fed back by the server.

Description

technical field

[0001] The present application relates to the technical field of privacy protection, and in particular to a method and system for federated learning privacy protection based on homomorphic pseudo-random numbers.

Background technique

[0002] The statements in this section merely provide background related to the present application and do not necessarily constitute prior art.

[0003] The emergence of new technologies such as big data, cloud computing, and deep learning has promoted the vigorous development of artificial intelligence and machine learning, but data security and privacy issues have seriously restricted their practical application. At present, due to concerns about the risk of data leakage, the sharing and utilization of data by governments, companies, and individuals is extremely limited, and a large amount of data cannot be effectively utilized. Due to factors such as the approval process, busin...


Application Information

IPC(8): G06F21/60; G06F21/62; G06N20/00; H04L9/00; H04L9/08
CPC: G06F21/6263; G06F21/602; G06N20/00; H04L9/008; H04L9/0869; H04L9/085
Inventors: 万志国, 葛均易
Owner: SHANDONG UNIV