User data reconstruction attack method oriented to deep federated learning

A user data and federated learning technology, applied to neural learning methods, digital data protection, electrical digital data processing, etc.; it can solve problems of existing attacks such as poor concealment, excessive interference with the shared model, and privacy leakage, and achieves the effect of improving concealment and authenticity.

Pending Publication Date: 2019-07-12
WUHAN UNIV

AI Technical Summary

Problems solved by technology

Existing attack methods usually consider a malicious user attacking a target and inferring its private data, but they can typically only infer class-representative data. For a face recognition task, for example, existing attack methods can infer a generic sample of a class, but cannot reconstruct samples of the identity owned by a specific user. Although such a generic sample characterizes the features of the class, it does not actually cause a privacy leak.
This is because a malicious user can only access the updated model distributed by the server, which is aggregated from the last round of updates of all users, and therefore cannot attack a specific target.
In addition, existing attack methods require modifying the structure of


Examples


Embodiment 1

[0074] 1) The malicious server participates in the regular federated learning process. First, the users agree on the goal and model of collaborative learning, and then the following steps are executed iteratively: the server sends the shared model, each user trains the model locally on its private data and uploads its parameter update to the server, and the server aggregates these parameter updates, until the model converges. The update in each round can be expressed as

[0075] M_t = M_{t-1} + (1/K) * Σ_{k=1}^{K} ΔM_t^k

[0076] where M_t denotes the shared model after the round-t update, K is the number of users, and ΔM_t^k denotes the round-t parameter update from user k, computed locally on the previous shared model using user k's private data.
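
The round update in step 1 can be illustrated with a short sketch (a minimal Python/NumPy illustration under the definitions above; function and variable names such as aggregate_round are illustrative and not taken from the patent):

    # One federated-learning round as described in step 1: the server sends the
    # shared model, users compute local updates, and the server averages them.
    import numpy as np

    def local_update(shared_model, private_data):
        """Stand-in for a user's local training; returns the parameter update ΔM_t^k."""
        # In practice this would be several epochs of SGD on the user's private data.
        return [np.random.randn(*layer.shape) * 0.01 for layer in shared_model]

    def aggregate_round(shared_model, user_updates):
        """Server-side aggregation: M_t = M_{t-1} + (1/K) * sum_k ΔM_t^k."""
        k = len(user_updates)
        return [layer + sum(u[i] for u in user_updates) / k
                for i, layer in enumerate(shared_model)]

    # Example round with 5 users and a toy two-layer model.
    shared = [np.random.randn(4, 4), np.random.randn(4)]
    updates = [local_update(shared, private_data=None) for _ in range(5)]
    shared = aggregate_round(shared, updates)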

[0077] 2) The malicious server builds a multi-task generative adversarial network model locally, which includes a generative model G and a discriminative model D, where D simultaneously performs the task of discriminating the authenticity, category, and user identity of the input sample. The structure of the model can be expressed as

[0078] D_real...
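
A minimal PyTorch sketch of the multi-task model described in step 2 is given below: a generator G and a discriminator D whose shared trunk feeds three output heads, one each for authenticity, category, and user identity. Layer sizes, n_classes, and n_users are illustrative assumptions, not values from the patent.

    # Multi-task GAN: one generator and a discriminator with three heads.
    import torch
    import torch.nn as nn

    class Generator(nn.Module):
        def __init__(self, z_dim=100, img_dim=28 * 28):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(z_dim, 256), nn.ReLU(),
                nn.Linear(256, img_dim), nn.Tanh(),
            )

        def forward(self, z):
            return self.net(z)

    class MultiTaskDiscriminator(nn.Module):
        def __init__(self, img_dim=28 * 28, n_classes=10, n_users=5):
            super().__init__()
            self.trunk = nn.Sequential(nn.Linear(img_dim, 256), nn.LeakyReLU(0.2))
            self.real_head = nn.Linear(256, 1)            # authenticity: real vs. generated
            self.class_head = nn.Linear(256, n_classes)   # task category of the sample
            self.id_head = nn.Linear(256, n_users)        # which user the sample belongs to

        def forward(self, x):
            h = self.trunk(x)
            return self.real_head(h), self.class_head(h), self.id_head(h)

    G, D = Generator(), MultiTaskDiscriminator()
    fake = G(torch.randn(8, 100))
    real_logit, class_logits, id_logits = D(fake)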



Abstract

The invention discloses a user data reconstruction attack method oriented to deep federated learning, which can reconstruct the private data of a specific user rather than only the class-representative data recovered by conventional attack methods, and which considers the attack to be carried out by a malicious server, so that no negative effects are introduced into the original shared model. Furthermore, the method introduces a multi-task generative adversarial model to simulate the distribution of user data; the model is trained to discriminate the authenticity and category of an input sample as well as the identity of the user to which it belongs, which improves the quality of the generated samples. To better distinguish different users, the method introduces an optimized calculation of user data representatives to describe the characteristics of the users participating in federated learning, and these representatives are used to supervise the training of the generative adversarial model. For existing federated learning architectures concerned with privacy protection, the data reconstruction attack based on the multi-task generative adversarial model provided by the invention can cause privacy leakage.
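
The three supervision signals mentioned in the abstract (authenticity, category, and user identity of an input sample) can be combined into generator and discriminator objectives roughly as sketched below. This is a minimal PyTorch sketch assuming standard cross-entropy losses and equal weighting, which the patent text does not fix; on the attacking server, the identity labels of the supervising samples would come from the user data representatives described above.

    # Sketch of the three-part losses: adversarial (real/fake), class, and identity.
    import torch
    import torch.nn.functional as F

    def discriminator_loss(real_logit_real, real_logit_fake,
                           class_logits, class_labels, id_logits, id_labels):
        adv = (F.binary_cross_entropy_with_logits(real_logit_real,
                                                  torch.ones_like(real_logit_real))
               + F.binary_cross_entropy_with_logits(real_logit_fake,
                                                    torch.zeros_like(real_logit_fake)))
        cls = F.cross_entropy(class_logits, class_labels)   # category of the supervising samples
        idn = F.cross_entropy(id_logits, id_labels)         # which user the samples represent
        return adv + cls + idn

    def generator_loss(real_logit_fake, class_logits_fake, target_class,
                       id_logits_fake, target_user):
        # Push generated samples to look real, match the wanted category, and match
        # the identity of the specific victim user being reconstructed.
        adv = F.binary_cross_entropy_with_logits(real_logit_fake,
                                                 torch.ones_like(real_logit_fake))
        cls = F.cross_entropy(class_logits_fake, target_class)
        idn = F.cross_entropy(id_logits_fake, target_user)
        return adv + cls + idn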

Description

technical field

[0001] The invention relates to a user data reconstruction attack method oriented to deep federated learning, and belongs to the field of artificial intelligence security.

Background technique

[0002] In recent years, deep learning techniques have been increasingly applied in the network field, for example in learning tasks combined with crowdsensing. Traditional centralized training requires the crowdsourced data to be stored centrally, which usually brings problems such as large-volume data transmission, high computing requirements, and privacy leakage. Therefore, the collaborative learning framework, as a mobile edge computing framework for deep learning, has received extensive attention and research; it enables multiple data sources to benefit from a shared model trained on all the data without uploading the data to central storage.

[0003] The federated learning framework is one of the current mainstream collaborative learning frameworks:...


Application Information

IPC(8): G06F21/55, G06F21/62, G06N3/04, G06N3/08
CPC: G06F21/552, G06F21/6245, G06N3/08, G06N3/048, G06N3/045
Inventor: 王志波, 宋梦凯, 郑思言, 王骞
Owner WUHAN UNIV