
A Federated Learning Method Based on Differential Privacy and Chaos Encryption

A federated learning method based on differential privacy and chaotic encryption, applied at the intersection of information security and artificial intelligence. It addresses the problems of data privacy leakage at computing nodes and the high computational cost of privacy protection, and improves the level of privacy protection.

Active Publication Date: 2022-07-19
NANKAI UNIV

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to solve the problems of data privacy leakage at computing nodes and the high computational cost of privacy protection in federated learning systems, and to provide a federated learning method based on differential privacy and chaotic encryption.

Method used



Examples


Embodiment 1

[0046] In a federated learning system, the parameter server first sends an initialized deep learning model to multiple computing nodes. Each computing node then trains a local copy of the model using the sample data in its local database. After one round of training, the node sends the computed model parameter gradients to the parameter server. Upon receiving the gradients from each computing node, the parameter server updates the weight parameters of the global model using stochastic gradient descent and sends the updated weights back to all computing nodes. This training process is repeated many times until the stopping condition is met. In this way, multiple computing nodes can jointly train a model without uploading or sharing their local data.
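The training round described in [0046] can be sketched as follows. Everything in this sketch is illustrative, not from the patent: the toy linear model, the learning rate, and the choice of three nodes are assumptions made only to show the server/node interaction.

```python
import numpy as np

def local_gradient(weights, x, y):
    """Gradient of mean-squared error for a toy linear model,
    standing in for one computing node's local training step."""
    return x.T @ (x @ weights - y) / len(y)

def federated_round(weights, node_data, lr=0.1):
    """One round: each node computes a gradient on its private data,
    the parameter server averages the gradients and applies an SGD step."""
    grads = [local_gradient(weights, x, y) for x, y in node_data]
    return weights - lr * np.mean(grads, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
# Three computing nodes, each holding a private local dataset.
node_data = []
for _ in range(3):
    x = rng.normal(size=(50, 2))
    node_data.append((x, x @ true_w))

w = np.zeros(2)
for _ in range(200):   # repeat rounds until the stopping condition is met
    w = federated_round(w, node_data)
# w converges toward true_w even though no raw data left any node
```

Only gradients cross the network here, which is precisely why the gradient-leakage problem in [0047] matters.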

[0047] However, in some scenarios, the gradient parameters uploaded by computing nodes may reveal local data privacy information.

[0048] The...

Specific embodiments

[0085] 1. The k-th target-type node uses the chaotic system to generate pseudo-random numbers from the chaotic encryption key, and then applies scrambling (permutation) encryption to the updated model parameters according to the pseudo-random numbers, obtaining the ciphertext Enc(w_node,k).

[0086] 2. The k-th target-type node uses the chaotic system to generate pseudo-random numbers from the chaotic encryption key, and then applies addition/subtraction encryption to the updated model parameters according to the pseudo-random numbers, obtaining the ciphertext Enc(w_node,k).

[0087] 3. The k-th target-type node uses the chaotic system to generate pseudo-random numbers from the chaotic encryption key, and then applies a hybrid of scrambling and addition/subtraction encryption to the updated model parameters according to the pseudo-random numbers, obtaining the ciphertext Enc(w_node,k).
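As an illustration of options 1-3 above, here is a hypothetical sketch using a logistic map as the chaotic system. The map parameter r, the burn-in length, and the key value are assumptions, not values specified by the patent; the scheme combines scrambling (a key-dependent permutation) with additive masking, in the spirit of option 3.

```python
import numpy as np

def logistic_keystream(key, n, r=3.99, burn_in=100):
    """Iterate the logistic map x <- r*x*(1-x), seeded by the secret
    key (a value in (0, 1)), to produce n pseudo-random numbers."""
    x = key
    for _ in range(burn_in):        # discard the transient iterations
        x = r * x * (1 - x)
    stream = np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)
        stream[i] = x
    return stream

def encrypt(params, key):
    ks = logistic_keystream(key, len(params))
    perm = np.argsort(ks)           # scrambling: key-dependent permutation
    return params[perm] + ks        # additive masking with the keystream

def decrypt(cipher, key):
    ks = logistic_keystream(key, len(cipher))
    perm = np.argsort(ks)           # same key => same keystream => same perm
    plain = np.empty_like(cipher)
    plain[perm] = cipher - ks       # remove the mask, then unscramble
    return plain

w = np.array([0.5, -1.2, 3.3, 0.0, 2.1])   # flattened model parameters
c = encrypt(w, key=0.3141)
recovered = decrypt(c, key=0.3141)
```

Because a receiver holding the same key regenerates an identical keystream, neither the permutation nor the mask needs to be transmitted.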

[0088] In the embodiment of the present invention, the computing node may adopt a chaotic system...



Abstract

A federated learning method based on differential privacy and chaotic encryption. To protect the local data of each computing node from leakage, in each iteration a node trains its model on local data using a differential-privacy-based optimization algorithm, encrypts the updated local model parameters with a chaotic encryption algorithm, and uploads the resulting ciphertext to the parameter server. The parameter server uses the encrypted model parameters uploaded by multiple computing nodes to update the global model parameters and sends the updated global-parameter ciphertext to each computing node. Each computing node then decrypts the received ciphertext and loads the parameters into its local model for the next iteration of training.
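The "optimization algorithm based on differential privacy" in the abstract is commonly realized by clipping per-example gradients and adding Gaussian noise before anything leaves the node. The following is a minimal sketch of that standard construction, not the patent's exact algorithm; the clip norm and noise multiplier are assumed hyperparameters.

```python
import numpy as np

def dp_noisy_gradient(grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip each per-example gradient to L2 norm <= clip_norm (bounding
    sensitivity), sum, add Gaussian noise, and average.
    grads has shape (n_examples, dim)."""
    rng = rng or np.random.default_rng()
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    clipped = grads * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grads.shape[1])
    return (clipped.sum(axis=0) + noise) / len(grads)

rng = np.random.default_rng(42)
per_example = rng.normal(size=(32, 4)) * 5.0   # toy per-example gradients
g = dp_noisy_gradient(per_example, rng=rng)    # noised update, then encrypted
```

In the pipeline the abstract describes, this noised update would then be chaotically encrypted before being uploaded to the parameter server.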

Description

Technical field

[0001] The invention belongs to the cross-disciplinary field of information security and artificial intelligence, and particularly relates to a training method based on a federated learning model.

Background technique

[0002] Federated learning is a distributed learning approach that can train machine learning models across multiple decentralized databases or servers. The participating devices do not share the data stored in their local databases; instead, they share their locally trained model parameters.

[0003] In a federated learning system, the parameter server sends an initialized deep learning model to multiple computing nodes. Each computing node trains its local model on the data in its local database and, after one round of training, sends the computed model parameter gradients to the parameter server. After receiving the gradients from each computing node, the parameter server uses stochastic gradient descent ...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority Patents(China)
IPC(8): G06F21/62, G06N20/20, G06N3/08, G06F7/58
CPC: G06F21/6245, G06N20/20, G06N3/08, G06F7/582
Inventor: 高铁杠, 张泽辉, 何宁昕
Owner NANKAI UNIV