
Privacy model training method and device based on small amount of public data

A technology relating to public data and model training, applied in the Internet field. It addresses the problem that neural network models trained on sensitive data can have their private training information stolen, and achieves the effects of easy deployment, strong controllability, and low privacy-protection overhead.

Pending Publication Date: 2021-01-08
INST OF INFORMATION ENG CAS
Cites: 0 · Cited by: 8

AI Technical Summary

Problems solved by technology

[0009] The present invention provides a privacy model training method and device based on a small amount of public data, to solve the problem that a neural network model trained on sensitive data can have its private training information stolen by attackers.




Embodiment Construction

[0041] To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art from these embodiments without creative work shall fall within the protection scope of the present invention.

[0042]The privacy model training method of the present invention includes the following steps:

[0043] 1) Partition the sensitive data into N parts, each part containing different data. Use the N parts of data to train N neural network teacher models, obtaining the teacher set model {f1...fn};

[0044] Further, the N neural network teacher models {f1...fn} can use the sam...
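Step 1 above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `train_teacher` is a hypothetical stand-in for training one neural-network teacher, and the disjoint partitioning uses a random permutation split.

```python
import numpy as np

def train_teacher(features, labels):
    """Hypothetical stand-in for training one neural-network teacher.
    Here it simply memorizes the majority label of its partition."""
    majority = int(np.bincount(labels).argmax())
    return lambda x: np.full(len(x), majority)

def train_teacher_ensemble(X, y, n_teachers):
    """Split the sensitive data into n_teachers disjoint parts and train
    one teacher f_i per part, yielding the teacher set {f1...fn}."""
    parts = np.array_split(np.random.permutation(len(X)), n_teachers)
    return [train_teacher(X[idx], y[idx]) for idx in parts]

# Toy "sensitive" dataset: 100 samples, 5 features, binary labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = rng.integers(0, 2, size=100)
teachers = train_teacher_ensemble(X, y, n_teachers=10)
print(len(teachers))  # 10
```

Because each teacher only ever sees its own partition, no single model depends on all of the sensitive data, which is the precondition for the noisy vote aggregation in the later steps.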



Abstract

The invention provides a privacy model training method based on a small amount of public data, and an electronic device. The privacy model training method comprises the steps of: obtaining N neural network teacher models through multiple trainings; respectively inputting a small amount of public data xi into the N neural network teacher models to obtain a statistical voting result for each public data xi on each label k; adding noise to all statistical voting results to obtain public data xi and corresponding labels that satisfy the differential privacy principle; optimizing a generative adversarial network with a large number of random noise vectors and a pre-trained discriminant neural network to generate a large amount of unlabeled data; and jointly training a student model on pre-trained auto-encoders using the public data xi that satisfy the differential privacy principle, the corresponding labels, and the large amount of unlabeled data, to obtain a private student model. The method needs only a small amount of public data to train a private student model, realizes physical and network isolation of the sensitive data, and solves the problem of low accuracy of the private student model.
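The noisy aggregation step in the abstract can be illustrated with a small sketch. This is an assumption-laden illustration in the style of noisy-max vote aggregation (Laplace noise on per-class vote counts), not the patent's exact mechanism; the noise scale 1/eps and the class count are chosen for the example.

```python
import numpy as np

def noisy_vote(teacher_preds, n_classes, eps):
    """Aggregate the N teachers' votes for one public sample xi:
    count the votes per label k, add Laplace noise (scale 1/eps) to
    each count, and release only the noisy argmax label."""
    counts = np.bincount(teacher_preds, minlength=n_classes).astype(float)
    counts += np.random.laplace(loc=0.0, scale=1.0 / eps, size=n_classes)
    return int(np.argmax(counts))

# Votes of 10 hypothetical teachers on one public sample, 3 classes.
preds = np.array([0, 0, 1, 0, 2, 0, 0, 1, 0, 0])
label = noisy_vote(preds, n_classes=3, eps=2.0)
print(0 <= label < 3)  # True
```

Only the noisy label leaves the teacher ensemble, so the student model never touches the sensitive data directly, which is what lets the released labels satisfy a differential-privacy guarantee.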

Description

Technical field

[0001] The invention belongs to the Internet field, and specifically relates to a neural network model privacy protection training method and device based on differential privacy, semi-supervised learning, and teacher-student knowledge aggregation.

Background technique

[0002] In recent years, deep learning technology has made remarkable breakthroughs in many fields, such as computer vision, natural language processing, and reinforcement learning. At the same time, the development of deep learning technology is inseparable from large amounts of training data. At present, many extremely powerful applications use large amounts of sensitive data to train models, for example, using large numbers of patients' medical records to train medical diagnosis systems, or using large amounts of user financial data for financial risk control.

[0003] Although deep learning tools can greatly facilitate industrial production and life, recent studies have shown that deep learning models are v...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F21/62, G06N3/04, G06N3/08, G06N20/00
CPC: G06F21/6245, G06N3/049, G06N3/08, G06N20/00, G06N3/045
Inventor: 葛仕明, 刘浩林, 刘博超, 王伟平
Owner: INST OF INFORMATION ENG CAS