
Method and device for protecting safety of neural network model

A neural network model protection technology, applicable to biological neural network models, neural learning methods, neural architectures, etc. It addresses the problem that attackers or gray-market actors can attack a directly exposed model and steal the sensitive information it carries, with the effects of reducing resource consumption and preserving the model's normal operating performance and efficiency.

Active Publication Date: 2021-06-11
ALIPAY (HANGZHOU) INFORMATION TECH CO LTD
Cites: 9 · Cited by: 0

AI Technical Summary

Problems solved by technology

However, when the training data is sensitive or private, such as users' personal information, the trained neural network carries a large amount of sensitive, private information. If the model is directly exposed, an attacker or gray-market actor can easily attack through the model and steal the sensitive information it carries.



Detailed Description of the Embodiments

[0021] Multiple embodiments disclosed in this specification will be described below in conjunction with the accompanying drawings.

[0022] As mentioned above, if all the trained neural network models are exposed, it is easy for attackers or gray products to attack through the model and steal sensitive information in the model. For example, after obtaining the neural network model, the attacker can infer the statistical characteristics remembered in the network layer by visualizing it. For example, suppose the neural network model is used to decide whether to provide a certain service to the user, where The feature remembered by a certain network layer may be: if the user is older than 52 years old, no loan service will be provided. At this time, the attacker can modify the user's age (such as changing 54 years old to 48 years old), so that illegal users can use loan service. For another example, after obtaining the neural network model, the attacker can observe the data dist...



Abstract

An embodiment of the invention provides a method for protecting the security of a neural network model. The method comprises: obtaining a neural network model comprising a plurality of network layers trained on training data; for any first network layer, with the parameters of the other network layers fixed, performing a first parameter adjustment on the first network layer using the training data to obtain a first fine-tuned model; determining a first index value of a preset performance index for the first fine-tuned model, where the index value of the preset performance index depends on the relative sizes of the corresponding model's test loss on test data and training loss on the training data; similarly, performing a second parameter adjustment on the first network layer using the training data together with the test data to obtain a second fine-tuned model, and determining a second index value; and, based on the relative sizes of the first index value and the second index value, determining the information sensitivity of the first network layer, and performing security processing on the first network layer when its information sensitivity exceeds a predetermined threshold.
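A minimal NumPy sketch of this layer-wise procedure, using a toy two-layer network. It assumes, as one plausible reading of the abstract, that the preset performance index is the generalization gap (test loss minus training loss); all function names, the concrete index formula, and the threshold are illustrative assumptions, not the patent's exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data standing in for sensitive training data and held-out test data.
X_train, y_train = rng.normal(size=(64, 8)), rng.normal(size=(64, 1))
X_test, y_test = rng.normal(size=(16, 8)), rng.normal(size=(16, 1))

def init_layers():
    """A toy two-layer network: tanh hidden layer, linear output."""
    return [rng.normal(scale=0.1, size=(8, 8)), rng.normal(scale=0.1, size=(8, 1))]

def forward(layers, X):
    return np.tanh(X @ layers[0]) @ layers[1]

def loss(layers, X, y):
    return float(np.mean((forward(layers, X) - y) ** 2))

def finetune_layer(layers, idx, X, y, steps=50, lr=0.05):
    """Gradient-descend on layer `idx` only, with all other layers frozen."""
    layers = [w.copy() for w in layers]
    for _ in range(steps):
        h = np.tanh(X @ layers[0])
        dout = 2.0 * (h @ layers[1] - y) / len(X)  # d(MSE)/d(output)
        if idx == 1:
            grad = h.T @ dout
        else:
            grad = X.T @ ((dout @ layers[1].T) * (1.0 - h ** 2))
        layers[idx] -= lr * grad
    return layers

def layer_sensitivity(base_layers, idx):
    # First adjustment: fine-tune the layer on the training data alone.
    m1 = finetune_layer(base_layers, idx, X_train, y_train)
    index1 = loss(m1, X_test, y_test) - loss(m1, X_train, y_train)
    # Second adjustment: fine-tune the same layer on training plus test data.
    X_all, y_all = np.vstack([X_train, X_test]), np.vstack([y_train, y_test])
    m2 = finetune_layer(base_layers, idx, X_all, y_all)
    index2 = loss(m2, X_test, y_test) - loss(m2, X_train, y_train)
    # A layer whose gap shrinks sharply once it sees the test data has
    # memorized train-specific information; flag it for security processing.
    return index1 - index2

base = init_layers()
sensitivities = [layer_sensitivity(base, i) for i in (0, 1)]
THRESHOLD = 0.01  # illustrative; the patent leaves the threshold unspecified
flagged = [i for i, s in enumerate(sensitivities) if s > THRESHOLD]
print(sensitivities, flagged)
```

The design mirrors the abstract's structure: two fine-tunings per layer with everything else frozen, two index values, and a per-layer comparison, so only layers whose index shifts materially when test data enters training are marked for security processing.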

Description

[0001] This application is a divisional application of the patent application filed on November 16, 2020, entitled "Method and Device for Protecting Neural Network Model Security", application number 202011280172.0.

Technical field

[0002] The embodiments of this specification relate to the field of data security, and in particular to a method and device for protecting the security of a neural network model.

Background

[0003] It is now standard industry practice to train a neural network on a large amount of data so that it makes good predictions. The neural network memorizes characteristics of the data in order to predict accurately. However, when the training data is sensitive or private, such as users' personal information, the trained neural network carries a large amount of sensitive, private information. If the model is directly exposed, it is ea...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F21/57, G06Q10/06, G06N3/04, G06N3/08
CPC: G06F21/57, G06Q10/06393, G06N3/08, G06N3/045
Inventors: 王力 (Wang Li), 周俊 (Zhou Jun)
Owner: ALIPAY (HANGZHOU) INFORMATION TECH CO LTD