
A Deep Differential Privacy Preservation Method Based on Generative Adversarial Networks

A privacy-protection and differential-privacy technology applied in the fields of deep learning and privacy protection. It addresses the problem of leaking sensitive user information, and achieves the effects of speeding up training, realizing both usability and privacy protection, and narrowing the parameter selection range.

Active Publication Date: 2019-06-28
BEIJING TECHNOLOGY AND BUSINESS UNIVERSITY

AI Technical Summary

Problems solved by technology

[0005] The problem solved by the present invention: it overcomes the deficiencies of the existing technology and provides a deep differential privacy protection method based on a generative adversarial network, so that differential privacy techniques can be used to prevent the leakage of users' sensitive information during deep model training and application.
[0006] The technical solution of the present invention is a deep differential privacy protection method based on generative adversarial network feedback. It addresses the problem that an attacker can use autoencoders and similar methods to reconstruct the training set data when a deep learning model is deployed, and applies the deep differential privacy protection method to achieve the goal of protecting the privacy of the training data set.
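To make the feedback idea concrete, the following is a minimal sketch in Python of how such an adversarial feedback loop could be organized. The function names (train_private_model, reconstruct_with_gan, similarity), the list of candidate noise levels, and the similarity threshold are hypothetical placeholders introduced here for illustration; the patent itself only states that the attacker's best reconstruction is compared with the original data and the privacy parameters are adjusted by feedback.

def feedback_tune(train_private_model, reconstruct_with_gan, similarity,
                  train_data, noise_levels, max_similarity=0.3):
    """Return the smallest noise level whose simulated GAN attack cannot
    reconstruct the training data beyond a chosen similarity threshold.

    The caller supplies the three callables:
      train_private_model(data, noise_multiplier) -> trained model
      reconstruct_with_gan(model)                 -> attacker's best reconstruction
      similarity(reconstruction, data)            -> score in [0, 1]
    """
    fallback = None
    for sigma in sorted(noise_levels):            # smaller sigma = better utility
        model = train_private_model(train_data, noise_multiplier=sigma)
        attack = reconstruct_with_gan(model)      # simulated attacker output
        if similarity(attack, train_data) <= max_similarity:
            return sigma, model                   # privacy goal met with least noise
        fallback = (sigma, model)
    return fallback                               # largest noise tried, if none sufficed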




Detailed Description of the Embodiments

[0028] The present invention will be described below in conjunction with the accompanying drawings and specific embodiments. Figure 1 illustrates the process of the deep differential privacy protection method based on a generative adversarial network.

[0029] As shown in Figure 1, the concrete implementation steps of the present invention are as follows:

[0030] (1) Calculate the upper bound of the privacy budget based on the size of the input data set, the sensitivity of the query, and the maximum probability with which the attacker can succeed. The upper bound of the privacy budget is calculated as:

[0031]

[0032] where ε is the privacy budget; n is the number of potential data sets of the input data set (the potential data sets are the possible neighboring data sets: assuming the input data set is D, its neighboring data set D' can take n possible values, where D and D' differ in exactly one record); △q is the sensitivity of the query function q with respect to the data sets D and D'; △v is the maximum d...
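The formula referenced in paragraph [0031] is not reproduced in this extract. A hedged reconstruction that is consistent with the symbol definitions above, and with the well-known bound of Lee and Clifton for choosing ε so that an attacker's success probability stays below a limit p, would be (an assumption of this note, not the patent's verbatim formula):

\varepsilon \;\le\; \frac{\Delta q}{\Delta v}\,\ln\!\left(\frac{(n-1)\,p}{1-p}\right)

that is, ε ≤ (△q / △v) · ln( (n−1)p / (1−p) ), where p denotes the maximum probability with which the attacker is permitted to succeed.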


Abstract

The invention provides a deep differential privacy protection method based on a generative adversarial network. The method addresses the problem that attackers can use autoencoders and similar methods to reconstruct training set data while a deep learning model is in use; by adopting the deep differential privacy protection method, the privacy of the users represented in the training data set is protected. The method includes the following steps. First, an upper bound on the privacy budget is calculated from the potential data set scale of the input training data set, the query sensitivity, and the maximum attack probability of the attacker. Second, the idea of differential privacy is integrated into the optimization of the deep network parameters: noise is added, the privacy budget of each layer of the deep network is computed based on the composability of differential privacy with Gaussian-distributed noise, and Gaussian noise is added during the stochastic gradient descent computation so that the overall privacy budget is minimized. Third, the generative adversarial network is used to generate the best results an attacker could obtain; by comparing these attack results with the original data, the parameters of the deep differential privacy model are adjusted through feedback, achieving a balance between data set availability and the degree of privacy protection.
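The noisy gradient computation described above corresponds in spirit to differentially private stochastic gradient descent with the Gaussian mechanism. The following is a minimal NumPy sketch of one such step, given here for illustration only; the per-example clipping, the clipping norm, the noise multiplier, and all names are assumptions of this sketch rather than details taken from the patent text.

import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.1, rng=np.random.default_rng(0)):
    """One SGD step with per-example gradient clipping and Gaussian noise."""
    # 1. Clip every per-example gradient so that a single record can change
    #    the summed gradient by at most clip_norm (bounded sensitivity).
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads * np.minimum(1.0, clip_norm / (norms + 1e-12))

    # 2. Sum the clipped gradients and add Gaussian noise calibrated to the
    #    sensitivity (clip_norm) and the chosen noise multiplier.
    noisy_sum = clipped.sum(axis=0) + rng.normal(
        0.0, noise_multiplier * clip_norm, size=params.shape)

    # 3. Average over the batch and take the descent step.
    return params - lr * noisy_sum / per_example_grads.shape[0]

# Example usage with hypothetical shapes: a batch of 8 examples, 5 parameters.
w = np.zeros(5)
g = np.random.default_rng(1).normal(size=(8, 5))
w = dp_sgd_step(w, g)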

Description

Technical field

[0001] The invention relates to the fields of deep learning and privacy protection, in particular to a deep differential privacy protection method based on a generative adversarial network.

Background technique

[0002] In recent years, deep learning has achieved remarkable results in object detection and computer vision, natural language processing, speech recognition, and semantic analysis, and has attracted more and more researchers' attention. Through the hierarchical processing of neural networks, deep learning combines low-level features into more abstract high-level representations in order to discover distributed feature representations of the data. Model performance is closely related to the scale and quality of the training data sets, while training data sets usually contain sensitive information. Training data sets are widely used in many fields, including face recognition in the security field, pornogra...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F21/62
CPC: G06F21/6245
Inventors: 毛典辉, 李子沁, 蔡强, 李海生, 曹健
Owner: BEIJING TECHNOLOGY AND BUSINESS UNIVERSITY