
Differential privacy federated learning method for resisting membership inference attacks

A differential-privacy federated learning technology, applied in the field of machine learning, which addresses the problems of degraded global network model performance, poor federated learning performance, and obstructed federated learning training, so as to mitigate insufficient training and improve model performance.

Pending Publication Date: 2022-07-22
NANJING UNIV OF SCI & TECH
Cites: 0 · Cited by: 2

AI Technical Summary

Problems solved by technology

[0003] However, under resource-constrained conditions, such as small training data sets, federated learning often performs poorly, and its training is hindered. Moreover, when some clients hold insufficient data, the model updates contributed by these clients degrade the overall global network model once they are aggregated on the server side.
[0004] To defend against membership inference attacks, existing federated learning methods generally add noise to the parameters of the global network model while clients train it. Although this approach can resist membership inference attacks, it tends to significantly degrade the performance of the global network model. Adding noise to model parameters during training therefore buys resistance to membership inference attacks at the cost of substantial model performance; this defect is obvious and needs to be improved.

Method used



Examples


Embodiment

[0039] This embodiment takes centralized federated learning as the basic architecture and trains a classifier network (the global network model) on the MNIST data set to illustrate the concrete implementation of the method:

[0040] On the client side, a generative adversarial network is trained on the local data (MNIST samples), and fake data (synthetic MNIST samples) is then generated by feeding random noise to the trained network. During federated training of the MNIST classifier network, the clients participating in the current communication round may either mix a certain amount of fake data into their real data, or use fake data exclusively, when training the classifier network issued by the server. This replaces the unimproved federated learning approach in which clients train on real data alone. The a...
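The mixing step above can be sketched as follows. This is a minimal illustration under stated assumptions: the patent does not give an API, so the function name `make_local_dataset` and the `fake_fraction` parameter are hypothetical; `fake_fraction=1.0` corresponds to the all-fake variant, smaller values to the mixed variant.

```python
import numpy as np

def make_local_dataset(real_x, real_y, fake_x, fake_y, fake_fraction=1.0, rng=None):
    """Build a client's local training set by replacing a fraction of the
    real samples with GAN-generated fake samples (illustrative sketch,
    not the patent's exact procedure)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(real_x)
    n_fake = int(round(fake_fraction * n))
    # Sample fake points, then fill the remainder with real points.
    fake_idx = rng.choice(len(fake_x), size=n_fake, replace=False)
    real_idx = rng.choice(n, size=n - n_fake, replace=False)
    x = np.concatenate([fake_x[fake_idx], real_x[real_idx]])
    y = np.concatenate([fake_y[fake_idx], real_y[real_idx]])
    perm = rng.permutation(len(x))  # shuffle so fake/real are interleaved
    return x[perm], y[perm]
```

A client would then train the server-issued classifier on the returned `(x, y)` instead of its raw data, which is what limits what a membership inference attacker can learn about the real records.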



Abstract

The invention discloses a differential-privacy federated learning method for resisting membership inference attacks. The method comprises the following steps: each client trains a generative adversarial network model on its local data and uses it to generate fake data; in each round of federated learning communication, the server randomly selects the clients participating in that round and distributes the global network model parameters together with the loss function and optimizer used in training; each selected client trains the global network model on the fake data and sends the trained model parameters back to the server; the server updates the global network model parameters by federated average aggregation; and the server decides whether to continue with the next round of communication: if so, it continues to distribute the global network model parameters; otherwise, communication ends and the global network model parameters are saved. The method further protects client data privacy under the original data-island condition and can resist membership inference attacks.
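The server-side update in the abstract ("federated average aggregation") can be sketched as standard FedAvg, in which each client's returned parameters are weighted by its local dataset size. The abstract does not specify the weighting, so treating it as dataset-size-weighted FedAvg is an assumption:

```python
import numpy as np

def fedavg(client_params, client_sizes):
    """Federated averaging sketch: combine the parameter vectors returned
    by the selected clients into new global parameters, weighting each
    client by its local dataset size."""
    total = float(sum(client_sizes))
    agg = np.zeros_like(np.asarray(client_params[0], dtype=float))
    for params, n in zip(client_params, client_sizes):
        agg += (n / total) * np.asarray(params, dtype=float)
    return agg

# Example: two clients holding 1 and 3 samples respectively.
new_global = fedavg([[0.0, 0.0], [4.0, 4.0]], [1, 3])  # → [3.0, 3.0]
```

After aggregation, the server either redistributes `new_global` for the next communication round or saves it as the final model, as the abstract describes.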

Description

technical field
[0001] The invention relates to the technical field of machine learning, in particular to a differential-privacy federated learning method for resisting membership inference attacks.
Background technique
[0002] Federated learning is a distributed machine learning framework with privacy-preserving technology, which aims to train machine learning models collaboratively across decentralized clients without leaking data. Under the federated learning architecture, different data owners can collaborate without exchanging raw data. Since data is never transferred, user privacy can be effectively protected and data-compliance requirements respected.
[0003] However, under resource-constrained conditions such as small training data sets, federated learning often performs poorly, and under such conditions its training is hindered. At the same time, when some clients hold insufficient data, the global ...

Claims


Application Information

IPC(8): H04L9/40; H04L41/14; G06N3/04; G06N3/08; G06N20/00
CPC: H04L63/1441; H04L41/145; G06N3/08; G06N20/00; G06N3/045
Inventor: 陈隆, 马川, 韦康, 李骏
Owner: NANJING UNIV OF SCI & TECH