Privacy protection method for knowledge migration in distributed machine learning

A machine-learning privacy-protection technology, applied in the field of machine learning. It addresses problems such as large time overhead, information leakage, and unprotected aggregated data, and achieves secure and efficient size comparison with low time and communication overhead.

Pending Publication Date: 2022-03-29
NANJING UNIV OF SCI & TECH
Cites: 0 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0003] However, the data-aggregation process itself is unprotected, so information about the data can still leak during aggregation. Existing methods use homomorphic encryption to protect the data during aggregation; although this does protect the data, the time overhead is too large.

Method used




Embodiment Construction

[0017] The present invention proposes a privacy protection method for knowledge transfer in distributed machine learning. Each client first trains its own teacher model, and the service requester provides unlabeled public data to the teacher models for prediction. The cloud server securely aggregates the teacher models' voting results and assigns the corresponding labels through a secure comparison algorithm. The service requester then trains on the labeled samples to obtain its own student model. Throughout the training process, the cloud server never directly accesses any user data.
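The vote-and-label flow described in [0017] can be sketched in a toy form. This is an illustrative PATE-style noisy-vote aggregation, not the patent's actual protocol: the secure comparison algorithm is replaced here by a plain `argmax`, and the noise scale `epsilon` and the simulated teacher predictions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_TEACHERS, NUM_CLASSES, NUM_SAMPLES = 10, 3, 5

# Simulated teacher predictions on unlabeled public samples; in the real
# scheme each client would train its own model on private data first.
teacher_preds = rng.integers(0, NUM_CLASSES, size=(NUM_TEACHERS, NUM_SAMPLES))

def noisy_aggregate(preds, num_classes, epsilon=1.0):
    """Per-sample vote histogram + Laplace noise, then argmax (noisy-max)."""
    labels = []
    for sample_votes in preds.T:  # one column of votes per public sample
        counts = np.bincount(sample_votes, minlength=num_classes).astype(float)
        counts += rng.laplace(scale=1.0 / epsilon, size=num_classes)
        labels.append(int(np.argmax(counts)))
    return labels

student_labels = noisy_aggregate(teacher_preds, NUM_CLASSES)
# student_labels are the noisy majority labels used to train the student.
```

The Laplace noise on the vote counts is what limits how much any single teacher's prediction (and hence any single private data set) can influence the released label.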

[0018] With reference to Figure 1, the privacy protection method for knowledge transfer in distributed machine learning comprises the following specific steps:

[0019] Step 1. Teacher-model training: each client uses its local data set to train its own teacher model;

[0020] Step 2. Label aggregation: the service requester provides unlabeled public data to the clients. Each teach...
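The source describes the cloud server aggregating votes "securely" without directly accessing user data. One standard way to achieve this is additive secret sharing; the following is a minimal two-server sketch under that assumption, and the patent's actual secure aggregation and comparison protocols may differ.

```python
import random

PRIME = 2**31 - 1  # all arithmetic is done modulo a prime field

def share(value, n=2):
    """Split an integer into n additive shares mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

# Each teacher secret-shares its one-hot vote between two non-colluding
# servers; neither server alone learns any individual vote.
votes = [[1, 0, 0], [0, 1, 0], [0, 1, 0]]  # 3 teachers, 3 classes
server0, server1 = [0, 0, 0], [0, 0, 0]
for vote in votes:
    for cls, v in enumerate(vote):
        s0, s1 = share(v)
        server0[cls] = (server0[cls] + s0) % PRIME
        server1[cls] = (server1[cls] + s1) % PRIME

# Only the combined shares reveal the tally: class 1 received 2 votes.
tally = [reconstruct([server0[c], server1[c]]) for c in range(3)]
print(tally)  # [1, 2, 0]
```

Because the shares are uniformly random on their own, each server sees only noise; the tally appears only when the shares are recombined, which matches the stated goal that the aggregator never directly accesses user data.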


PUM

No PUM

Abstract

The invention discloses a privacy protection method for knowledge migration in distributed machine learning, comprising the following steps: each client trains its own model as a teacher model; a service requester provides unlabeled public data for the clients to predict on with their teacher models; a cloud server aggregates the teacher models' prediction results; corresponding labels are assigned through a secure comparison algorithm; and the service requester trains on the assigned labels to obtain a student model. Throughout the training process, the aggregating cloud server never directly accesses any user data, and no information is leaked other than the highest-voted label after differential-privacy noise has been added.

Description

technical field
[0001] The invention relates to machine learning technology, and in particular to a privacy protection method for knowledge transfer in distributed machine learning.
Background technique
[0002] In recent years, machine learning has developed rapidly and is widely used in fields such as natural language understanding, non-monotonic reasoning, machine vision, and pattern recognition. However, its wide application has increased the privacy risk that models pose to the sensitive data they are trained on. To address this problem, the teacher-student model was proposed: each user trains a "teacher" model locally on its own private data set; the service requester provides unlabeled public data to the teacher models for prediction, and uses the labels chosen by the majority of teacher models to train its own student model. Throughout the training process, the aggregator never directly accesses any user data. ...
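The unprotected baseline criticised in [0003] is a plain majority vote over the teachers' predictions for one public sample, which can be stated in a few lines (the vote values below are illustrative):

```python
from collections import Counter

def majority_label(teacher_votes):
    """Return the class predicted by the most teachers (no noise,
    no secure computation -- the unprotected aggregation step)."""
    return Counter(teacher_votes).most_common(1)[0][0]

print(majority_label([2, 0, 2, 2, 1]))  # 2
```

The invention replaces this bare tally with noise addition and a secure comparison so that individual teacher votes, and hence private training data, are not exposed to the aggregator.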

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F21/62, G06N3/08, G06N20/00
CPC: G06F21/6245, G06N3/08, G06N20/00
Inventors: 高艳松, 李群, 邱虎鸣, 郑宜峰
Owner: NANJING UNIV OF SCI & TECH