Training method and device for federated learning model
A method and device for training a federated learning model, applied in the field of financial technology, addressing problems such as low model accuracy and poor training results.
Examples
Example 1
[0086] Initialize non-independent and identically distributed (non-IID) data as the training set for model training, using either of the following two methods.
[0087] 1. Sort the data by label, then divide it into multiple parts, for example ten. Each client holds data with only a few labels, for example two. After dividing the data into ten parts and sorting by label (1–10), the first client trains on parts 1 and 8 and the second client trains on parts 9 and 7, so that no single client's data is representative of the global data distribution.
[0088] 2. Using a benchmark data set, divide the data into ten parts so that the amount of data held by each client varies greatly; again, no single client's data is representative of the global data distribution.
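The label-based partitioning in method 1 above can be sketched as follows. This is a minimal illustration, not the patented procedure itself; the function name `partition_by_label` and the shard-assignment details (random shard permutation, equal-size shards) are assumptions for the sketch.

```python
import numpy as np

def partition_by_label(labels, num_clients=10, labels_per_client=2, seed=0):
    """Sort sample indices by label, split them into shards, and give each
    client a few shards, so each client sees only a few labels (non-IID)."""
    rng = np.random.default_rng(seed)
    order = np.argsort(labels)  # indices grouped by label
    shards = np.array_split(order, num_clients * labels_per_client)
    shard_ids = rng.permutation(len(shards))
    client_indices = []
    for c in range(num_clients):
        own = shard_ids[c * labels_per_client:(c + 1) * labels_per_client]
        client_indices.append(np.concatenate([shards[s] for s in own]))
    return client_indices

# toy data: 100 samples, labels 0..9 (ten samples per label)
labels = np.repeat(np.arange(10), 10)
parts = partition_by_label(labels, num_clients=10, labels_per_client=2)
```

With ten samples per label and twenty shards, each shard falls within a single label, so every client ends up holding at most two label classes.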
[0089] Select 10 clients to use the above data for model training; the 10 clients obtain the (k−1)-th global model par...
Example 2
[0105] Obtain the local model parameters of the k-th iteration sent by the 10 clients, and sum the differences between each client's local model parameters of the k-th iteration, $w_i^k$, and the global model parameters of the (k−1)-th iteration, $w^{k-1}$, to obtain the sum $\Delta^k = \sum_{i=1}^{10} (w_i^k - w^{k-1})$. Then multiply this sum by the learning rate $\eta$ and add it to the global model parameters of the (k−1)-th iteration, giving the global model parameters of the k-th iteration: $w^k = w^{k-1} + \eta \Delta^k$.
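The server-side aggregation described in [0105] can be sketched as below. The function name `server_update` and the representation of model parameters as NumPy arrays are assumptions for the sketch, not part of the patent text.

```python
import numpy as np

def server_update(global_prev, client_params, lr=1.0):
    """Sum each client's delta from the (k-1)-th global parameters,
    scale by the server learning rate, and add back to the global model."""
    delta = sum(w - global_prev for w in client_params)  # sum of differences
    return global_prev + lr * delta

# toy usage: 10 clients each moved 0.1 away from a zero global model
w_new = server_update(np.zeros(3), [np.full(3, 0.1)] * 10, lr=0.1)
```

Note that with `lr=1.0` and a single averaging step this reduces to adding the raw sum of client deltas; a server learning rate below 1 damps the update.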
[0106] Step 303: the server broadcasts the global model parameters of the k-th iteration to the multiple clients, so that the multiple clients can perform the (k+1)-th iteration of training.
[0107] In this embodiment of the present invention, the server broadcasts the global model parameters of the k-th iteration to the multiple clients; the clients do not need to reset the regularization constraints and directly proceed to the (k+1)-th iteration of training, obtaining the local model parameters of the (k+1)-th iter...
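Putting the broadcast and aggregation steps together, one communication round can be sketched as a single function. This is a toy sketch under stated assumptions: `federated_round`, `toy_local_train`, and the scalar-parameter model are all hypothetical names introduced for illustration, and client training is reduced to one deterministic step.

```python
import numpy as np

def federated_round(global_params, client_datasets, local_train, lr=1.0):
    """One round: broadcast global parameters, let every client train locally
    from them, then aggregate the summed deltas scaled by the learning rate."""
    # broadcast: each client starts local training from the current global model
    local_params = [local_train(global_params, d) for d in client_datasets]
    delta = sum(w - global_params for w in local_params)
    return global_params + lr * delta

def toy_local_train(w, data):
    # toy local step: move halfway toward this client's data mean
    return w + 0.5 * (np.mean(data) - w)

w = np.array(0.0)
datasets = [np.array([float(i)]) for i in range(10)]  # client means 0..9
for _ in range(5):
    w = federated_round(w, datasets, toy_local_train, lr=0.1)
```

In this toy setup the global parameter is pulled toward the mean of the client data means (4.5), illustrating how the summed deltas steer the global model even though each client's data is non-IID.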