
A Federated Learning Method Based on Trusted Execution Environment

A technology relating to trusted execution environments and learning methods, applied in the field of data security. It addresses problems such as confidential data being inferable from access patterns, the unverified authenticity of training results, and the misreporting of training data volume, achieving good performance, privacy protection, and a wide range of application scenarios.

Active Publication Date: 2022-08-09
GUANGZHOU UNIVERSITY

AI Technical Summary

Problems solved by technology

In prior work, the trusted execution environment executes the machine-learning code as a data-oblivious program, i.e., a program whose memory access patterns, disk access patterns, and network access patterns reveal nothing, so that confidential data cannot be inferred from those patterns.
Similarly, that system protects private data well, but it does not solve the problems of verifying the authenticity of training results or of users misreporting their data volume.
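To illustrate the data-oblivious idea mentioned above, here is a minimal sketch of an array lookup whose memory access pattern does not depend on the secret index: every element is touched on every call. The function name and structure are illustrative, not taken from the patent.

```python
def oblivious_lookup(table, secret_index):
    """Return table[secret_index] while touching every element,
    so an observer of memory accesses learns nothing about the index.
    Illustrative sketch only: production data-oblivious code must also
    avoid secret-dependent branching at the hardware level."""
    result = 0
    for i, value in enumerate(table):
        # Branchless-style select: mask is 1 only at the secret index.
        mask = 1 if i == secret_index else 0
        result = result * (1 - mask) + value * mask
    return result
```

Because the loop always scans the full table, the sequence of memory accesses is identical for every possible `secret_index`, which is the property the data-oblivious requirement demands.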

Method used



Examples


Embodiment

[0022] Training deep learning models on large-scale data is a common approach. The present invention addresses two key problems: how to protect user privacy from being leaked while still utilizing the data, and how to prevent users from falsifying data. For example, if a user wants a model for handwritten digit recognition, the user can train in a local secure enclave and then send the resulting model to the cloud; other users perform the same operation, and the cloud aggregates the uploads to obtain the final model. Some basic concepts related to the present invention are:
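The train-locally-then-aggregate flow described in this embodiment can be sketched as a FedSGD-style round: each client computes a gradient on its own data and the cloud averages the uploads. The linear least-squares model and all names here are illustrative stand-ins for the in-enclave training the patent describes.

```python
import numpy as np

def local_update(weights, X, y):
    """Client side: one gradient computation on local data
    (mean-squared-error gradient of a linear model, as a stand-in
    for the in-enclave deep-learning training step)."""
    preds = X @ weights
    return X.T @ (preds - y) / len(y)

def federated_round(global_weights, clients, lr=0.5):
    """Cloud side: average the gradients uploaded by all clients
    and apply one update to the global model."""
    grads = [local_update(global_weights, X, y) for X, y in clients]
    return global_weights - lr * np.mean(grads, axis=0)
```

In the patent's scheme each `local_update` would run inside a trusted enclave and its output would be signed before upload; this sketch shows only the aggregation arithmetic.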

[0023] (1) Deep learning: Deep learning focuses on extracting features from high-dimensional data and using them to build a model that maps inputs to outputs. The multilayer perceptron is the most common neural network model. In a multilayer perceptron, the input of each hidden-layer node is the output of the previous layer (plus a bias); each hidden-layer node computes a weighted sum of its inputs, and the output of the hid...
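The multilayer perceptron just described can be sketched as a forward pass in which each layer applies weights plus a bias to the previous layer's output, followed by a nonlinearity. This is a generic sketch, not the patent's exact network; ReLU is assumed as the activation.

```python
import numpy as np

def relu(x):
    """Common hidden-layer nonlinearity (assumed here, not specified
    by the patent)."""
    return np.maximum(0.0, x)

def mlp_forward(x, layers):
    """Forward pass of a multilayer perceptron: each hidden layer
    receives the previous layer's output, applies weights plus bias,
    then the activation; the final layer is left linear."""
    for W, b in layers[:-1]:
        x = relu(W @ x + b)
    W, b = layers[-1]
    return W @ x + b
```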



Abstract

The invention belongs to the field of data security and provides a federated learning method based on a trusted execution environment. Local data and the initialized model parameters returned by the cloud are loaded into a secure enclave; training yields the gradients of the model parameters, and a digital signature is generated. The signed update, together with the local user's identity credential, is uploaded to the cloud. The cloud verifies the local user's identity; after successful verification, the uploaded model-parameter gradients and the model-integration algorithm are placed in the cloud's secure enclave, the models are integrated, and the model-parameter gradients are updated. By using the trusted execution environment to generate the secure enclave, a user cannot bypass the training process and directly submit a fabricated training result, thereby achieving training integrity and user privacy protection.
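The sign-then-verify step in the abstract can be sketched as follows. An HMAC over the serialized gradients stands in for the patent's digital signature scheme (the patent's CPC codes point to signature algorithms, which need a public/private key pair rather than a shared key); all names are illustrative.

```python
import hashlib
import hmac
import pickle

def sign_gradients(grads, key):
    """Client side: authenticate the gradient update before upload.
    HMAC-SHA256 stands in for the patent's digital signature."""
    payload = pickle.dumps(grads)
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return payload, tag

def verify_and_accept(payload, tag, key):
    """Cloud side: verify the tag before integrating the update;
    reject tampered or unauthenticated uploads."""
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("signature check failed; update rejected")
    return pickle.loads(payload)
```

`hmac.compare_digest` is used so the comparison takes constant time, which prevents an attacker from recovering the tag byte-by-byte through timing differences.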

Description

Technical field

[0001] The invention belongs to the field of data security, and in particular relates to a federated learning method based on a trusted execution environment.

Background technique

[0002] Machine learning based on private data has achieved good results in practical applications. Many companies, such as Google, Facebook, and Apple, collect massive training data from users and apply powerful GPU computing power to deploy deep learning algorithms. To obtain deeper models, many companies are willing to collect complementary data and train collaboratively. However, training machine learning models by directly pooling multi-user raw data sets conceals major challenges: private-data security, data poisoning, misreporting of data volume, and other issues.

[0003] The invention patent with publication number CN108717514A, published on October 30, 2018, discloses a data privacy protection method and system in machine learning, which first selects the encry...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority Patents(China)
IPC (IPC8): G06F21/62, G06F21/64, G06N3/04, H04L9/32
CPC: G06F21/6245, G06F21/64, G06N3/04, H04L9/3255
Inventor: 李进, 陈煜, 罗芳, 李同
Owner GUANGZHOU UNIVERSITY