
Deep model privacy protection method and device for membership inference attacks, based on parameter sharing

A deep model privacy protection technology, applied in the field of parameter-sharing-based deep model privacy protection, which addresses the problems of reduced prediction ability of the target model, high time complexity, and difficulty in convergence of the target model.

Pending Publication Date: 2021-08-20
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, this defense method has certain limitations. Adversarial training incurs high time complexity and reduces the target model's predictive ability on normal samples, and because the loss function is changed when training the model, the target model has difficulty converging during training.



Examples


Embodiment Construction

[0025] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention and do not limit its protection scope.

[0026] The technical idea of the present invention is as follows: model overfitting is considered the main cause of membership inference attacks. Each training sample can influence the model's predictions, and this influence is reflected in the model parameters, which record information about the training samples; the prediction results are in turn computed from those parameters. By sharing parameters, the present invention reduces the influence of the training set samples on the model parameters, which can effectively alleviate the degree of ...
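The parameter-sharing step this idea leads to (clustering each layer's weights and replacing every weight with the mean of its cluster, as described in the abstract below) can be illustrated with a minimal sketch. This is not the patent's implementation: it assumes PyTorch and scikit-learn, and the function name share_parameters and the cluster count are placeholders chosen for illustration.

```python
# Minimal sketch of per-layer parameter sharing via k-means weight clustering.
# Assumes PyTorch and scikit-learn; n_clusters is an illustrative choice.
import torch
from sklearn.cluster import KMeans

@torch.no_grad()
def share_parameters(model: torch.nn.Module, n_clusters: int = 16) -> None:
    """Cluster each layer's weights and replace every weight with the mean
    (centroid) of the cluster it belongs to, so many weights share one value."""
    for name, param in model.named_parameters():
        if param.dim() < 2:          # skip biases / normalization parameters
            continue
        flat = param.detach().cpu().numpy().reshape(-1, 1)
        k = min(n_clusters, flat.shape[0])
        km = KMeans(n_clusters=k, n_init=10).fit(flat)
        # Each weight is mapped to its cluster centroid (the cluster mean).
        shared = km.cluster_centers_[km.labels_].reshape(param.shape)
        param.copy_(torch.from_numpy(shared).to(param.device, param.dtype))
```

In the pipeline described by the abstract, this transform is applied to the trained target model's layers before the parameters are optimized again; the sketch shows it as a one-shot post-training step.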



Abstract

The invention discloses a parameter-sharing-based deep model privacy protection method and device for membership inference attacks. The method comprises the following steps: constructing a target model and optimizing its network parameters on image samples; after optimization, clustering the network parameters of each layer of the target model, replacing the parameters belonging to the same cluster with the mean value of that cluster, and then optimizing the network parameters again; constructing a shadow model with the same structure as the target model and optimizing its network parameters with the training image samples; constructing new image samples from the shadow model; constructing an attack model and optimizing its model parameters with the new image samples; and obtaining the prediction confidence of an input test image with the parameter-shared, enhanced target model, feeding that confidence into the optimized attack model, computing the attack model's prediction, and judging from the result whether the test image is a member sample of the target model.
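For orientation, the shadow-model and attack-model stages described above can be sketched as follows. This is a hedged illustration rather than the patent's implementation: the helper names (confidences, build_attack_dataset, is_member), the use of PyTorch, and the assumption that the attack model consumes softmax confidence vectors are placeholders chosen for the sketch.

```python
# Sketch of the shadow-model / attack-model pipeline described in the abstract.
# Model objects, data loaders, and helper names are illustrative assumptions.
import torch
from torch.utils.data import TensorDataset

def confidences(model, loader, device="cpu"):
    """Collect softmax prediction-confidence vectors for every sample in a loader."""
    model.eval()
    outs = []
    with torch.no_grad():
        for x, _ in loader:
            outs.append(torch.softmax(model(x.to(device)), dim=1).cpu())
    return torch.cat(outs)

def build_attack_dataset(shadow_model, shadow_train_loader, shadow_holdout_loader):
    """Confidences on the shadow model's own training data are labelled member (1),
    confidences on held-out data are labelled non-member (0); these pairs are the
    'new image samples' used to train the attack model."""
    member = confidences(shadow_model, shadow_train_loader)
    non_member = confidences(shadow_model, shadow_holdout_loader)
    x = torch.cat([member, non_member])
    y = torch.cat([torch.ones(len(member)), torch.zeros(len(non_member))]).long()
    return TensorDataset(x, y)

def is_member(attack_model, target_model, image, device="cpu"):
    """Feed the (parameter-shared) target model's confidence vector for one test
    image into the trained attack model and return its member / non-member verdict."""
    target_model.eval(); attack_model.eval()
    with torch.no_grad():
        conf = torch.softmax(target_model(image.unsqueeze(0).to(device)), dim=1)
        return attack_model(conf).argmax(dim=1).item() == 1
```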

Description

Technical Field
[0001] The invention relates to the fields of computer information security and artificial intelligence security, and in particular to a parameter-sharing-based deep model privacy protection method and device for membership inference attacks.
Background Technique
[0002] Deep Learning (DL) is a branch of machine learning inspired by the way the human brain works when it processes data. Specifically, DL builds a mathematical model from sample data, that is, training data, and gradually extracts higher-level features from the sample data, on the basis of which the model can make decisions without human participation. Due to its good performance, DL is widely used in image classification, object recognition, image segmentation, disease prediction, and other fields.
[0003] While DL is penetrating academia and industry, its explosive growth and huge potential have also attracted cybercriminals, which has brought serious security issues to the DL community. Gene...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/08; G06F21/62; G06N3/04
CPC: G06N3/08; G06F21/6245; G06N3/045; G06F18/23213; G06F18/241
Inventors: 陈晋音, 上官文昌, 郑雅羽
Owner: ZHEJIANG UNIV OF TECH