Deep differential privacy protection method based on generative adversarial network

A privacy protection technology based on differential privacy, applied in the fields of deep learning and privacy protection, which solves the problem of leaking sensitive user information and achieves the effects of speeding up training, narrowing the selection range, and optimizing accuracy.

Active Publication Date: 2017-11-21
BEIJING TECHNOLOGY AND BUSINESS UNIVERSITY

AI Technical Summary

Problems solved by technology

[0005] The technology of the present invention solves this problem: it overcomes the deficiencies of the existing technology and provides a deep differential privacy protection method based on a generative adversarial network, using differential privacy techniques to solve the problem of leaking sensitive user information during deep model training and application.
[0006] The technical solution of the p

Examples

Example Embodiment

[0028] The present invention is described below with reference to the accompanying drawings and specific embodiments. Figure 1 depicts the processing flow of the deep differential privacy protection method based on the generative adversarial network.

[0029] As shown in Figure 1, the specific implementation steps of the present invention are as follows:

[0030] (1) Calculate the upper bound of the privacy budget based on the size of the input data set, the query sensitivity, and the maximum attack probability of the attacker. The upper bound of the privacy budget is calculated as follows:

[0031]

[0032] where ε is the privacy budget; n is the number of potential data sets of the input data set (that is, assuming the input data set is D, there are n possible neighboring data sets D' of D, where D and D' differ in only one record); Δq is the sensitivity of the query function q over data sets D and D'; and Δv is the maximum difference between the...
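
The expression referenced in [0031] does not appear in this excerpt. Given the quantities defined above (n, Δq, Δv, and the attacker probability p), it plausibly takes the form of the Lee–Clifton bound for choosing ε; the LaTeX below is a hedged reconstruction under that assumption, not necessarily the patent's verbatim formula.

```latex
% Hedged reconstruction of the bound in [0031], assuming a Lee–Clifton style
% bound that matches the variables defined in [0032].
\[
  \varepsilon \;\le\; \frac{\Delta q}{\Delta v}\,
  \ln\!\frac{(n-1)\,p}{1-p}
\]
% \varepsilon : upper bound on the privacy budget
% n           : number of potential (neighboring) data sets of D
% \Delta q    : sensitivity of the query function q over D and D'
% \Delta v    : maximum difference between query results on the potential data sets
% p           : maximum probability with which the attacker identifies a record
```

Read this way, the admissible budget grows with the ratio Δq/Δv and with the number of potential data sets n, and shrinks as the tolerated attacker probability p is lowered.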

Abstract

The invention provides a deep differential privacy protection method based on a generative adversarial network. The method addresses the problem that, when a deep learning model is in use, attackers can restore training set data using auto-encoders and other methods, and thereby achieves the purpose of protecting users' privacy in the training data set. The method includes the following steps. An upper bound on the privacy budget is calculated from the potential data set scale of the input training data set, the query sensitivity, and the maximum attack probability of the attacker. During the optimization of the deep network parameters, the idea of differential privacy is integrated and noise is added: based on the property that differential privacy can be combined with the Gaussian distribution, the privacy budget of each layer of the deep network is calculated, and Gaussian noise is added during stochastic gradient descent so that the overall privacy budget is minimized. A generative adversarial network is used to generate the best results an attacker could obtain; by comparing the attack results with the original data, the parameters of the deep differential privacy model are adjusted through feedback, achieving a balance between data set availability and the degree of privacy protection.
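
To make the noise-addition step concrete, the sketch below shows one differentially private stochastic gradient descent update of the kind the abstract describes: per-example gradients are clipped and Gaussian noise is added before the parameter update. It is a minimal illustrative sketch, not the patent's implementation; the linear model, the function name dp_sgd_step, and the values of the clipping norm and noise multiplier are assumptions chosen for the example.

```python
import numpy as np

def dp_sgd_step(weights, X_batch, y_batch, lr=0.05, clip_norm=1.0, sigma=1.1):
    """One DP-SGD step for a linear regression model (illustrative sketch).

    Each example's gradient is clipped to `clip_norm`, which bounds the
    sensitivity of the summed gradient; Gaussian noise with standard
    deviation `sigma * clip_norm` is then added before the update.
    """
    grads = []
    for x, y in zip(X_batch, y_batch):
        # Per-example gradient of the squared-error loss 0.5 * (w.x - y)^2.
        g = (weights @ x - y) * x
        # Clip so that each example's contribution has norm at most clip_norm.
        norm = np.linalg.norm(g)
        g = g / max(1.0, norm / clip_norm)
        grads.append(g)

    # Sum the clipped gradients and add Gaussian noise calibrated to the clip norm.
    noisy_sum = np.sum(grads, axis=0) + np.random.normal(
        0.0, sigma * clip_norm, size=weights.shape
    )
    return weights - lr * noisy_sum / len(X_batch)

# Toy usage: a few noisy updates on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + rng.normal(scale=0.1, size=32)

w = np.zeros(5)
for _ in range(100):
    w = dp_sgd_step(w, X, y)
```

In the scheme summarized above, the per-layer noise scale would be chosen so that the composed per-layer privacy budgets stay within the upper bound ε computed in step (1), and the GAN-simulated attacker supplies the feedback signal, against the original data, for tuning these parameters.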

Description

Technical field

[0001] The invention relates to the fields of deep learning and privacy protection, and in particular to a deep differential privacy protection method based on a generative adversarial network.

Background technique

[0002] In recent years, deep learning has achieved remarkable results in object detection, computer vision, natural language processing, speech recognition, and semantic analysis, and has attracted more and more researchers' attention. Through the hierarchical processing of neural networks, deep learning combines low-level features into more abstract high-level representations of attribute categories or features, so as to discover distributed feature representations of data. Model performance is closely related to the scale and quality of the training data sets, yet training data sets usually contain a great deal of sensitive information. Training data sets are widely used in many fields, including face recognition in the security field, pornogra...

Application Information

IPC(8): G06F21/62
CPC: G06F21/6245
Inventor: 毛典辉, 李子沁, 蔡强, 李海生, 曹健
Owner: BEIJING TECHNOLOGY AND BUSINESS UNIVERSITY