Precision feedback federated learning method for privacy protection

A privacy-protection and machine-learning technology, applied in the field of privacy-protection-oriented precision feedback federated learning. It addresses problems in the prior art such as shared-data strategies that violate the privacy guarantees federated learning is meant to maintain, difficulty in adjusting the training weights of different clients, and client data leakage, with effects including a reduced impact of non-IID client data, privacy protection, and stable convergence.

Pending Publication Date: 2021-12-07
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

In the process of implementing this strategy, the prior art has at least the following problems: (1) privacy must be emphasized when applying federated learning methods, that is, the private data of each client must be protected. However, this strategy proposes extracting part of the data from each client to form a shared data set, which ...



Examples


Embodiment Construction

[0040] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0041] Figure 1 is a system structure diagram of the present invention, comprising a central server and N clients. The data is distributed across the N clients, and the clients and the server exchange only model parameters, never raw data; the server maintains a global model and each client maintains a local model. To obtain a global model with better performance, federated learning is used for model training.
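The parameter-only exchange between the server and the N clients described in [0041] can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the class names, the toy "training" rule, and the fixed weights are all assumptions introduced for clarity.

```python
# Minimal sketch of the system structure in Figure 1: one central server and
# N clients. Raw data stays on the client; only parameters are transmitted.

class Client:
    def __init__(self, data):
        self._data = data          # private local data set: never uploaded
        self.params = None         # local model parameters

    def download(self, global_params):
        self.params = list(global_params)

    def local_train(self):
        # Toy stand-in for training: nudge each parameter by the local mean.
        mean = sum(self._data) / len(self._data)
        self.params = [p + 0.1 * mean for p in self.params]
        return self.params         # only parameters are returned/uploaded

class Server:
    def __init__(self, dim):
        self.global_params = [0.0] * dim   # global model

    def broadcast(self, clients):
        for c in clients:
            c.download(self.global_params)

    def aggregate(self, uploads, weights):
        dim = len(self.global_params)
        self.global_params = [
            sum(w * u[i] for w, u in zip(weights, uploads))
            for i in range(dim)
        ]

clients = [Client([1.0, 2.0]), Client([3.0, 5.0])]
server = Server(dim=2)
server.broadcast(clients)
uploads = [c.local_train() for c in clients]
server.aggregate(uploads, weights=[0.5, 0.5])
```

Note that `Server.aggregate` takes the weights as an argument; in the patented method these weights are not fixed but are recomputed each round from the tested precision of each local model.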

[0042] Figure 2 is the flow chart of the privacy-preserving precision feedback federated learning method. At the start, each client uses a GAN to augment its local data set and uploads the generated data to the server to form a shared data set. The server initializes the global parameters and broadcasts them to the clients. The N clients then train their local models on their local data sets using the downloaded global parameters. After local tr...



Abstract

The invention discloses a precision feedback federated learning method for privacy protection. The method comprises the following steps:
1. Each client performs data augmentation on its local data set using a GAN and uploads the generated data to the server to form a shared data set.
2. The server initializes the model parameters and broadcasts them to each client.
3. Each client trains its local model using the downloaded global parameters and uploads the trained parameters to the server.
4. The server tests each local model to obtain its precision and then generates new aggregation weights.
5. The server aggregates the local models using the aggregation weights.
6. The server performs global training on the aggregated parameters using the shared data set and, after obtaining the global model, broadcasts its parameters to each client.
7. Steps 3-6 are repeated until the model performance meets the requirement.
On the premise of protecting client data, the method mitigates the impact that the non-independent and identically distributed (non-IID) client data and the clients' training weights have on global model performance.
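The "precision feedback" of steps 4-5 turns the per-client test precisions into aggregation weights. The abstract does not disclose the exact weighting formula, so the proportional normalization below is an assumption chosen for illustration.

```python
def precision_feedback_weights(precisions):
    """Map per-client test precisions to normalized aggregation weights.

    Simple proportional normalization -- an assumed formula, since the
    patent text shown here does not specify how weights are generated.
    """
    total = sum(precisions)
    if total == 0:
        # Fall back to uniform weights if every model scored zero.
        return [1.0 / len(precisions)] * len(precisions)
    return [p / total for p in precisions]

weights = precision_feedback_weights([0.9, 0.6, 0.5])
# weights sum to 1; the most precise client receives the largest weight
```

Under this rule a client whose uploaded model tests poorly on the shared data set contributes less to the aggregated global model, which is how the method counteracts non-IID local data.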

Description

technical field

[0001] The invention belongs to the field of federated learning, and in particular relates to a privacy-protection-oriented precision feedback federated learning method dedicated to reducing the impact of the non-independent and identical distribution of each client's local data on the performance of the global model.

Background technique

[0002] Data is the foundation of machine learning. As the main direction of artificial intelligence, machine learning needs data to train models. In most industries, owing to issues such as industry competition, privacy and security concerns, and complex administrative procedures, data often exists in isolated islands, and models trained only on the data within a single island often cannot meet task requirements. The federated learning framework arose to address this dilemma of data islands and data privacy.

[0003] Under the ...


Application Information

IPC(8): G06N20/00; G06N3/04; G06N3/08; G06F21/62
CPC: G06N20/00; G06N3/08; G06F21/6245; G06N3/045
Inventor: 李文玲, 李钰浩, 白君香, 刘杨
Owner BEIHANG UNIV