
A gradient perturbation-based federated learning data privacy protection method and system

A data-privacy and gradient-perturbation technology, applied to digital data protection, ensemble learning, and electronic digital data processing. It addresses the problem that existing defense schemes inject excessive noise perturbation and reduce model availability, with the effect of preserving classification accuracy, preventing privacy leakage, and maintaining model usability.

Active Publication Date: 2021-08-13
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, under stringent privacy-protection requirements, these defense schemes introduce excessive noise perturbation, which greatly reduces the usability of the model.



Embodiment Construction

[0028] To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to illustrate the invention, not to limit it. Furthermore, the technical features of the various embodiments described below may be combined with one another as long as they do not conflict.

[0029] In the present invention, terms such as "first" and "second" (if present) in the description and drawings are used to distinguish similar objects and do not imply any particular order or sequence.

[0030] Figure 1 is a flow chart of a gradient perturbation-based federated learning data privacy protection method provided by an embodiment of the present invention. Referring to Figure 1 in combination with Figure 2, furth...
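The core perturbation step of the flow above can be illustrated with a minimal sketch. Assuming a softmax classifier with cross-entropy loss (so the loss gradient with respect to the logits is p − onehot(y)), the code below searches for a probability vector that keeps the original predicted label while maximizing the angular deviation of the loss gradient. The random search and the `perturb_prediction` helper are illustrative assumptions, not the patent's exact optimisation procedure.

```python
import numpy as np

def loss_grad(p, y):
    """Gradient of cross-entropy w.r.t. the logits for softmax output p
    and label y: dL/dz = p - onehot(y)."""
    g = p.copy()
    g[y] -= 1.0
    return g

def angle(u, v):
    """Angle in radians between two gradient vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def perturb_prediction(p, n_trials=5000, rng=None):
    """Random-search sketch: find a probability vector with the same argmax
    as p whose loss gradient deviates maximally in angle from the original
    gradient (hypothetical helper; the patent's optimisation is not shown)."""
    rng = rng or np.random.default_rng(0)
    y = int(np.argmax(p))
    g0 = loss_grad(p, y)
    best, best_angle = p, 0.0
    for _ in range(n_trials):
        q = rng.dirichlet(np.ones_like(p))   # candidate on the simplex
        if np.argmax(q) != y:                # keep the predicted label fixed
            continue
        a = angle(loss_grad(q, y), g0)
        if a > best_angle:
            best, best_angle = q, a
    return best, best_angle
```

For example, `perturb_prediction(np.array([0.7, 0.2, 0.1]))` returns a valid probability vector whose argmax is still class 0 but whose loss gradient points in a markedly different direction.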



Abstract

The invention discloses a federated learning data privacy protection method and system based on gradient perturbation, belonging to the field of data privacy protection. The method includes: using a local model trained by federated learning to perform category prediction on samples held by data participants, obtaining an original prediction probability vector; perturbing the original prediction probability vector to obtain a perturbed prediction probability vector whose predicted label matches that of the original vector and whose prediction-loss gradient has the largest angular deviation from the gradient of the original vector's prediction loss; retraining each local model with the goal of minimizing the difference between its original and perturbed prediction probability vectors; and aggregating the retrained local models to obtain a global model. The protected federated learning global model effectively reduces the risk that model prediction outputs and model gradients leak participants' private data, while maintaining model availability.
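The final aggregation step in the abstract can be sketched as a FedAvg-style weighted average of the retrained local models' parameters. The size-proportional weighting below is a common convention in federated learning, not necessarily the patent's exact aggregation rule, and `aggregate` is a hypothetical helper.

```python
import numpy as np

def aggregate(local_weights, sizes):
    """Weighted average of local model parameters (FedAvg-style sketch).
    local_weights: list of per-participant parameter lists (one array per layer).
    sizes: number of training samples each participant holds."""
    sizes = np.asarray(sizes, dtype=float)
    coeffs = sizes / sizes.sum()             # size-proportional weights
    n_layers = len(local_weights[0])
    return [
        sum(c * w[layer] for c, w in zip(coeffs, local_weights))
        for layer in range(n_layers)
    ]
```

For example, two participants with single-layer weights `[0, 4]` and `[4, 0]` and data sizes 1 and 3 aggregate to `[3.0, 1.0]`.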

Description

Technical field

[0001] The present invention belongs to the field of data privacy protection, and more particularly relates to a federated learning data privacy protection method and system based on gradient perturbation.

Background technique

[0002] Deep learning, as a method of realizing artificial intelligence, has been applied to many fields such as computer vision, data mining, and medical diagnosis. Deep learning requires massive training data as support, which poses challenges for data security and privacy protection. Moreover, out of concern for data privacy, individual data participants may refuse to share their data, so a central server cannot train an efficient and reliable learning model, giving rise to data islands. Increasingly stringent data privacy protection requirements and data islands have seriously restricted the development and application of deep learning. To solve these problems, federated learning came into being. However, for federated learnin...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F21/62; G06K9/62; G06N20/20
CPC: G06F21/6245; G06N20/20; G06F18/2415
Inventor: 王琛, 刘高扬, 伍新奎, 彭凯
Owner HUAZHONG UNIV OF SCI & TECH