
Federated learning-oriented adversarial sample poisoning attack method

A technology relating to adversarial samples and federated learning, applied in the fields of computing, equipment, and platform integrity maintenance, to achieve the effect of weakening the attack effect and weakening the toxicity

Pending Publication Date: 2022-07-19
DALIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

Currently, there are relatively few studies applying such attacks to the training phase of federated learning.




Detailed Description of the Embodiments

[0023] An adversarial sample poisoning attack method for federated learning, including the following steps:

[0024] S1. The attacker adds adversarial perturbations that are imperceptible to the human eye to its local private training samples to generate "toxic" adversarial samples, and trains locally on these samples (a code sketch covering steps S1-S3 follows step S3);

[0025] S2. In order to dominate the training process of the global model, the attacker increases the learning rate during local training to accelerate the generation of malicious model parameters;

[0026] S3. The attacker uploads its local model parameters to the server to participate in aggregation and thereby influence the global model.
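
The excerpt does not name a concrete perturbation method, learning-rate value, or deep-learning framework, so the following PyTorch-flavoured sketch assumes an FGSM-style perturbation for step S1 and an illustrative learning-rate multiplier for step S2. The names `poison_batch`, `attacker_local_update`, `epsilon`, and `lr_boost` are hypothetical and not taken from the patent.

```python
import torch
import torch.nn.functional as F


def poison_batch(model, x, y, epsilon=8 / 255):
    """Step S1 (sketch): perturb the attacker's private samples with a small,
    visually imperceptible FGSM-style step. FGSM and epsilon are assumptions;
    the patent only specifies an imperceptible adversarial perturbation."""
    x = x.clone().detach().requires_grad_(True)
    F.cross_entropy(model(x), y).backward()
    # One signed-gradient step, clipped back to the valid pixel range.
    return (x + epsilon * x.grad.sign()).clamp(0.0, 1.0).detach(), y


def attacker_local_update(model, loader, base_lr=0.01, lr_boost=10.0, epochs=1):
    """Steps S2-S3 (sketch): train locally on the poisoned samples with an
    amplified learning rate so the resulting parameters can dominate the
    aggregated global model, then return them as the attacker's upload."""
    opt = torch.optim.SGD(model.parameters(), lr=base_lr * lr_boost)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            x_adv, y_adv = poison_batch(model, x, y)
            opt.zero_grad()  # clear gradients left over from poison_batch
            F.cross_entropy(model(x_adv), y_adv).backward()
            opt.step()
    # Step S3: this state dict is what the attacker uploads for aggregation.
    return {name: p.detach().clone() for name, p in model.state_dict().items()}
```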

[0027] Specifically, in step S1 the following scenario is defined: assume that m participants take part in the training, with m >= 2, and that the k-th participant is the attacker. In the federated learning system, the local training of each participant is regarded as a traditiona...
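
The excerpt does not state the aggregation rule, so the sketch below assumes FedAvg-style uniform parameter averaging for the scenario of paragraph [0027]. The helper names `fedavg_aggregate`, `run_round`, and `benign_update` are hypothetical; `attacker_update` is meant to be the `attacker_local_update` routine from the previous sketch, used by the k-th (malicious) participant.

```python
import copy

import torch


def fedavg_aggregate(client_states):
    """FedAvg-style aggregation (assumption): uniform average of the m uploaded
    parameter sets; the excerpt only says the uploads participate in aggregation."""
    m = len(client_states)
    return {
        key: sum(state[key].float() for state in client_states) / m
        for key in client_states[0]
    }


def run_round(global_model, client_loaders, attacker_index, benign_update, attacker_update):
    """One round in the scenario of paragraph [0027]: m >= 2 participants,
    with the attacker at position `attacker_index` (the k-th participant)."""
    uploads = []
    for i, loader in enumerate(client_loaders):
        local_model = copy.deepcopy(global_model)  # each client starts from the global model
        update_fn = attacker_update if i == attacker_index else benign_update
        uploads.append(update_fn(local_model, loader))
    # The attacker's malicious parameters enter the average and skew the global model.
    global_model.load_state_dict(fedavg_aggregate(uploads))
    return global_model
```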



Abstract

A federated learning-oriented adversarial sample poisoning attack method comprises the following steps. The following scenario is defined: assume that m participants take part in the training, with m >= 2, and that the k-th participant is the attacker; the attack goal is to make the performance of the federated learning global model on the test set as poor as possible after the attacker's local model parameters participate in aggregation. First, the attacker adds adversarial perturbations that are imperceptible to the human eye to its local private training samples to generate toxic adversarial samples, and carries out local training on these samples. Second, in order to dominate the training process of the global model, the attacker increases the learning rate during local training so as to accelerate the generation of malicious model parameters. Finally, the attacker uploads its local model parameters to the server to participate in aggregation and thereby influence the global model. Under this attack, the performance of the federated global model is significantly reduced, and the attack method shows good generalization.
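
As a usage illustration only, the toy driver below wires together the hypothetical sketches above for a single simulated round with three participants, the second of which plays the attacker. The model, data, and all sizes are synthetic stand-ins and not from the patent.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset


def benign_update(local_model, loader, lr=0.01):
    """Ordinary local SGD for the honest participants (illustrative)."""
    opt = torch.optim.SGD(local_model.parameters(), lr=lr)
    local_model.train()
    for x, y in loader:
        opt.zero_grad()
        nn.functional.cross_entropy(local_model(x), y).backward()
        opt.step()
    return {name: p.detach().clone() for name, p in local_model.state_dict().items()}


torch.manual_seed(0)
global_model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
# Three clients with random stand-in data; client index 1 is the attacker (the k-th participant).
client_loaders = [
    DataLoader(TensorDataset(torch.rand(64, 1, 28, 28), torch.randint(0, 10, (64,))), batch_size=32)
    for _ in range(3)
]
global_model = run_round(global_model, client_loaders, attacker_index=1,
                         benign_update=benign_update,
                         attacker_update=attacker_local_update)
```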

Description

Technical Field

[0001] The invention belongs to the field of federated learning security technology, and in particular relates to a federated learning-oriented adversarial sample poisoning attack method.

Background Technique

[0002] Although machine learning technology has been widely used in various fields, data silos and data privacy are still two major challenges that hinder its development. For example, in medical applications, training a machine learning model with good performance requires various medical institutions or departments to provide rich data describing patients' symptoms, yet medical data is often highly private and sensitive. Similarly, a city's emergency, logistics, and security information departments generate large amounts of heterogeneous data that exist in the form of data silos and cannot be integrated and utilized. To solve the above problems, federated learning technology came into being. Different from the machine ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F21/55; G06K9/62
CPC: G06F21/55; G06F18/214
Inventors: 代晓蕊, 王波
Owner: DALIAN UNIV OF TECH