Security defense method for data manipulation attacks in federated learning

A security defense and data technology, applied to neural learning methods, machine learning, digital data protection, and related fields. It addresses problems such as attack methods that are difficult to detect and troubleshoot, hidden security hazards, and the lack of analysis of insider threats, and achieves broad application prospects and research value, low computational overhead, and protection of both user data and the neural network model.

Active Publication Date: 2020-07-28
NANJING UNIV

AI Technical Summary

Problems solved by technology

However, existing research work lacks an analysis of how to defend against threats posed by the users participating in federated learning themselves.
Since data manipulation attacks originate from inside the group of users participating in federated learning, malicious users can hide among normal users, making their attacks stealthier and harder to detect and trace. Existing defense techniques for security issues in federated learning struggle to resist these two kinds of data manipulation attacks effectively, which creates serious security risks for the users participating in federated learning.


Detailed Description of the Embodiments

[0042] The present invention will be further explained below in conjunction with the accompanying drawings and specific embodiments.

[0043] This embodiment of the invention discloses a method for implementing a security training framework that defends against data manipulation attacks in federated learning.

[0044] The federated learning data manipulation attack considered by this embodiment of the present invention, as shown in Figure 1, involves three kinds of executing parties: several normal users, a malicious user, and a central server. The normal users, the malicious user, and the central server jointly perform federated learning to complete a specified image classification task. Each normal user holds some normal training data for training a normal local model, while the malicious user holds some normal training data together with some malicious training data, used for a data manipulation attack, to train a malicious local model. A user's training data is private and is not disclosed. ...
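To make this setup concrete, the following minimal Python sketch simulates the threat model with synthetic data. The user count, data shapes, poisoning rate, and the choice of label flipping as the manipulation are all illustrative assumptions; the patent text does not fix any of them.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_NORMAL_USERS = 5        # "several normal users" (count is illustrative)
NUM_CLASSES = 10            # e.g., a 10-class image classification task
SAMPLES_PER_USER = 1000

def make_normal_data():
    """Synthetic stand-in for one user's private training set."""
    x = rng.normal(size=(SAMPLES_PER_USER, 784))    # flattened 28x28 images
    y = rng.integers(0, NUM_CLASSES, SAMPLES_PER_USER)
    return x, y

# Normal users each hold only clean training data.
normal_users = [make_normal_data() for _ in range(NUM_NORMAL_USERS)]

# The malicious user holds clean data plus a manipulated portion.
# Label flipping is just one assumed form of data manipulation here.
mal_x, mal_y = make_normal_data()
poison = rng.random(SAMPLES_PER_USER) < 0.3        # poison ~30% of samples
mal_y[poison] = (mal_y[poison] + 1) % NUM_CLASSES  # flip poisoned labels
malicious_user = (mal_x, mal_y)
```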


Abstract

The invention discloses a security defense method for data manipulation attacks in federated learning. The method comprises the following steps:
1. The central server receives the local model parameters uploaded by the users and calculates, for each user, the similarity of the uploaded local model parameters and a corresponding fusion coefficient, where the local model parameters are those obtained after a user has trained for one round on its private training data.
2. After the central server has received the local model parameters of the local users for one period, it calculates a weighted average of each user's local model parameters according to the fusion coefficients to obtain the global model parameters, where a period is a preset number of local model parameter update rounds.
3. The global model parameters are issued to the corresponding users, and each user updates its local model parameters upon receiving them.
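The Python sketch below illustrates steps 1 and 2 on flattened parameter vectors. The abstract does not specify the similarity measure or how fusion coefficients are derived from it; this sketch assumes cosine similarity against the mean of all uploads, softmax-normalized so the coefficients sum to 1, purely as one plausible instantiation.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two parameter vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def aggregate(local_params):
    """One aggregation period, following steps 1-2 of the abstract.

    local_params: list of flattened parameter vectors, one per user.
    Scoring each upload against the plain mean of all uploads and
    converting scores to coefficients via softmax are assumptions.
    """
    stacked = np.stack(local_params)
    reference = stacked.mean(axis=0)
    sims = np.array([cosine(p, reference) for p in stacked])
    coeffs = np.exp(sims) / np.exp(sims).sum()   # fusion coefficients
    # Step 2: weighted average of local parameters -> global parameters.
    return coeffs @ stacked, coeffs

# Example: a dissimilar (e.g., manipulated) upload receives less weight.
uploads = [np.ones(4), np.ones(4) * 1.1, -np.ones(4)]
global_params, coeffs = aggregate(uploads)
print(coeffs)   # the third (outlier) user gets the smallest coefficient
```

The intended effect is that uploads far from the consensus, such as those produced by a malicious local model, are down-weighted in the global average rather than excluded outright, keeping the computational overhead of aggregation low.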

Description

Technical Field

[0001] The present invention relates to a method for realizing a security training framework for federated learning, and more specifically to a method for realizing a security training framework that defends against data manipulation attacks in federated learning.

Background

[0002] Federated learning is an emerging deep learning framework. In traditional centralized deep learning, the central server needs to collect a large amount of user data to train the neural network model (the model for short), but because of the heavy network communication overhead of data transmission and concerns over the ownership and privacy of user data, user data for deep learning is often difficult to obtain. Federated learning instead trains the neural network model in another way: in one round of training, each user uses its private data to train a local model and then uploads the parameters of the local model to the central server, an...
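As a rough illustration of the round structure described above (not of the patent's defense itself), here is a minimal Python sketch of a federated training loop with plain averaging on the server, using a linear model as a stand-in for the neural network; the data shapes, learning rate, and round count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def local_train(params, x, y, lr=0.1, epochs=1):
    """One user's local update of a linear model under squared loss,
    a minimal stand-in for the neural network training described above."""
    w = params.copy()
    for _ in range(epochs):
        grad = 2 * x.T @ (x @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

# Each user holds private data that the central server never sees.
users = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]
global_w = np.zeros(3)

for round_idx in range(5):                      # a few federated rounds
    uploads = [local_train(global_w, x, y) for x, y in users]
    global_w = np.mean(uploads, axis=0)         # server-side parameter averaging
```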


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F21/55; G06F21/60; G06K9/62; G06N3/08; G06N20/00
CPC: G06F21/554; G06F21/604; G06N3/08; G06N20/00; G06F18/22
Inventors: 毛云龙, 袁新雨, 赵心阳, 仲盛
Owner: NANJING UNIV