
Federated learning data privacy protection method and system based on gradient perturbation

A data privacy protection and gradient perturbation technology, applied in the fields of digital data protection, ensemble learning, and electronic digital data processing. It addresses problems such as the reduced model availability caused by noise-perturbation defense schemes, and achieves the effects of preserving classification accuracy, keeping participants' private data from being leaked, and reducing the degradation of the model's prediction performance.

Active Publication Date: 2021-07-09
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Benefits of technology

This patented technology protects the privacy of data participants in federated learning by perturbing, rather than directly exposing, the prediction outputs of each locally trained model. The original prediction probability vector of a sample is replaced with a perturbed vector that keeps the same predicted label but whose prediction-loss gradient deviates as much as possible, in angle, from the original gradient; each local model is then retrained so that its outputs approach the perturbed vectors, and the retrained models are aggregated into the global model. Because the predicted labels are unchanged, classification accuracy is preserved, while the prediction outputs and gradients observable during and after federated training no longer reveal the participants' private data, avoiding the sharp performance degradation associated with noise-based defenses. A minimal sketch of the retraining objective follows.
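The sketch below illustrates the retraining step only. The closeness measure (KL divergence) and the PyTorch framework are assumptions of this illustration; the patent text only requires that the difference between the local model's prediction and the perturbed prediction probability vector be minimized.

```python
# Minimal sketch of the retraining objective: fine-tune the local model so its
# output approaches the perturbed prediction vector. KL divergence and PyTorch
# are illustrative choices, not taken from the patent.
import torch
import torch.nn.functional as F

def retraining_loss(model, x, p_tilde):
    """KL divergence between the model's current prediction and the perturbed target."""
    log_q = F.log_softmax(model(x), dim=-1)      # model's predicted log-probabilities
    return F.kl_div(log_q, p_tilde, reduction="batchmean")

# Toy usage: nudge a small stand-in classifier toward the perturbed targets.
model = torch.nn.Linear(4, 3)                    # stand-in for a trained local model
x = torch.randn(2, 4)                            # two local samples
p_tilde = torch.tensor([[0.40, 0.35, 0.25],      # perturbed prediction vectors
                        [0.20, 0.50, 0.30]])     # (same predicted labels as the originals)
loss = retraining_loss(model, x, p_tilde)
loss.backward()                                  # back-propagate as in ordinary training
```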

Problems solved by technology

The technical problem addressed by this patent concerns privacy leakage in federated learning: the prediction outputs of the jointly trained model and the gradients shared during training can be exploited by attackers to infer or reconstruct sensitive details of the participants' private datasets. Existing defense schemes that add noise perturbation can mitigate such leakage, but they significantly reduce the availability (prediction performance) of the model.




Embodiment Construction

[0028] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not constitute a conflict with each other.

[0029] In the present invention, the terms "first", "second" and the like (if any) in the present invention and drawings are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence.

[0030] Figure 1 is a flowchart of a gradient perturbation-based federated learning data privacy protection method provided by an embodiment of the present invention...
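The final step of the flow in Figure 1 aggregates the retrained local models into a global model. The sketch below assumes a FedAvg-style, sample-count-weighted average of the local model parameters; the patent text only states that the retrained local models are aggregated, so the exact averaging rule here is an assumption.

```python
# Hedged sketch of the aggregation step: a sample-count-weighted average of the
# retrained local model parameters (FedAvg-style; the weighting rule is assumed).
import numpy as np

def aggregate(local_weights, sample_counts):
    """Weighted element-wise average of per-participant parameter lists."""
    coeffs = np.asarray(sample_counts, dtype=float)
    coeffs /= coeffs.sum()
    global_weights = []
    for layers in zip(*local_weights):           # iterate layer by layer
        stacked = np.stack(layers)               # shape: (num_participants, *layer_shape)
        global_weights.append(np.tensordot(coeffs, stacked, axes=1))
    return global_weights

# Toy usage: two participants, each holding one 2x2 weight matrix.
w_a = [np.ones((2, 2))]
w_b = [np.zeros((2, 2))]
print(aggregate([w_a, w_b], sample_counts=[3, 1]))   # -> 0.75 in every entry
```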



Abstract

The invention discloses a federated learning data privacy protection method and system based on gradient perturbation, belonging to the field of data privacy protection. The method comprises the steps of: performing class prediction on a sample at a data participant with the local model obtained from federated learning training, and obtaining an original prediction probability vector; perturbing the original prediction probability vector to obtain a perturbed prediction probability vector, wherein the predicted label of the perturbed prediction probability vector is the same as that of the original prediction probability vector, and the gradient of its prediction loss function has the maximum angular deviation from the gradient of the prediction loss function of the original prediction probability vector; retraining each local model with the objective of minimizing the difference between its original prediction probability vector and the perturbed prediction probability vector; and aggregating the retrained local models to obtain a global model. The protected federated learning global model can effectively reduce the risk that the model's prediction outputs and gradients leak the privacy of the participating users, while maintaining the availability of the model.
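The perturbation step described in the abstract can be illustrated with a small sketch: among candidate probability vectors that keep the original predicted label, choose the one whose prediction-loss gradient deviates most in angle from the gradient of the original prediction. The random (Dirichlet) candidate search and the softmax/cross-entropy gradient form below are assumptions of this illustration, not the patent's actual optimization procedure.

```python
# Hedged sketch of the perturbation criterion: keep the predicted label, maximize
# the angular deviation between the loss gradients of the original and perturbed
# prediction vectors. Random search and the softmax-logit gradient are assumed.
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(prob, label):
    """Cross-entropy gradient w.r.t. the logits of a softmax classifier."""
    one_hot = np.zeros_like(prob)
    one_hot[label] = 1.0
    return prob - one_hot

def perturb(p_orig, n_candidates=5000):
    label = int(np.argmax(p_orig))
    g_orig = loss_grad(p_orig, label)
    best, best_angle = p_orig, 0.0
    for _ in range(n_candidates):
        cand = rng.dirichlet(np.ones_like(p_orig))     # random probability vector
        if int(np.argmax(cand)) != label:              # keep the original predicted label
            continue
        g = loss_grad(cand, label)
        cos = np.dot(g_orig, g) / (np.linalg.norm(g_orig) * np.linalg.norm(g) + 1e-12)
        angle = np.arccos(np.clip(cos, -1.0, 1.0))
        if angle > best_angle:
            best, best_angle = cand, angle
    return best, best_angle

p = np.array([0.6, 0.3, 0.1])
p_tilde, theta = perturb(p)
print(p_tilde, theta)   # perturbed vector with the same argmax, largest angle found
```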



Application Information

Owner: HUAZHONG UNIV OF SCI & TECH