
Federated learning method based on trusted execution environment

A federated learning method based on a trusted execution environment, applied in the field of data security. It addresses the limitations of prior schemes that only make confidential data unpredictable from access patterns but do not solve the authenticity of training results or the false reporting of data volume, with the effects of ensuring training integrity and privacy security across a wide range of application scenarios.

Active Publication Date: 2020-06-05
GUANGZHOU UNIVERSITY

AI Technical Summary

Problems solved by technology

In the prior art, the trusted execution environment executes the machine learning code using at least one data-oblivious program, i.e. a program whose memory access patterns, disk access patterns, and network access patterns make it impossible to predict confidential data from the observed pattern.
Similarly, that system protects private data very well, but it still does not solve the problems of the authenticity of training results and the false reporting of data volume.
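
To illustrate the data-oblivious idea referenced above (this is not code from the patent), here is a minimal sketch of a branchless selection: both candidate values are touched on every call, so the sequence of operations does not depend on the secret condition. The function names are hypothetical, and a real data-oblivious routine would be implemented in low-level code inside the enclave rather than in interpreted Python.

```python
# Hypothetical sketch of data-oblivious selection: the same operations run
# regardless of the secret bit, so access patterns reveal nothing about it.
def oblivious_select(secret_bit: int, a: int, b: int) -> int:
    """Return a if secret_bit == 1 else b, without branching on the secret."""
    mask = -secret_bit                 # 1 -> all-ones mask, 0 -> all-zeros mask
    return (a & mask) | (b & ~mask)

def oblivious_max(values):
    """Scan every element exactly once; no early exit reveals where the max is."""
    current = values[0]
    for v in values[1:]:
        current = oblivious_select(int(v > current), v, current)
    return current

print(oblivious_max([3, 9, 4, 7]))     # 9
```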

Method used



Examples


Embodiment

[0022] Training deep learning models on large-scale data is a common solution. The present invention needs to solve two key problems: how to use the data while protecting user privacy from being leaked, and how to prevent users from falsifying data. For example, if a user wants to obtain a model for handwritten digit recognition, the user can train in the local security area and then send the generated model to the cloud; other users perform the same operation, and so on, and in the end a user can obtain the model by downloading it from the cloud. Some basic concepts related to the present invention are:

[0023] (1) Deep learning: Deep learning focuses on extracting features from high-dimensional data and using them to generate a model that maps inputs to outputs. The multilayer perceptron is the most common neural network model. In a multilayer perceptron, the input of each hidden-layer node is the output of the previous layer (plus a bias); each hidden-layer node ...
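
To make the multilayer-perceptron description above concrete, the following NumPy sketch (not taken from the patent) shows a forward pass in which each layer's input is the previous layer's output, transformed by a weight matrix plus a bias. The layer sizes, ReLU activation, and random initialization are illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def mlp_forward(x, layers):
    """Forward pass of a multilayer perceptron: each hidden-layer node's input
    is the previous layer's output times a weight, plus a bias."""
    h = x
    for i, (W, b) in enumerate(layers):
        z = W @ h + b
        h = z if i == len(layers) - 1 else relu(z)   # no activation on the output layer
    return h

# Illustrative sizes: 784-dimensional input (a 28x28 handwritten digit) -> 10 classes.
rng = np.random.default_rng(0)
layers = [
    (0.01 * rng.standard_normal((128, 784)), np.zeros(128)),
    (0.01 * rng.standard_normal((10, 128)), np.zeros(10)),
]
logits = mlp_forward(rng.standard_normal(784), layers)
print(logits.shape)   # (10,)
```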



Abstract

The invention belongs to the field of data security and discloses a federated learning method based on a trusted execution environment, comprising the following steps: generating a security area based on the trusted execution environment; downloading the initialized model parameters from the cloud by a local user; loading the training algorithm, the training data set, the number of training data examples, and the initialized model parameters returned by the cloud into the security area; obtaining the trained model parameter gradient and generating a digital signature, performing local user identity authentication through a group signature algorithm, and uploading the trained model parameter gradient, the model integration algorithm, and the local user identity authentication to the cloud; and enabling the cloud to verify the local user identity authentication, obtain the uploaded model parameter gradient and model integration algorithm after successful verification, place them in a cloud security area, integrate the model, and update the model parameter gradient. According to the invention, the trusted execution environment is used to generate the security area, the user cannot bypass the training process to directly give a training result, and training integrity and user privacy protection are realized.
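
The abstract walks through a sequence of steps: local training inside a TEE-generated security area, signing and group-signature authentication of the resulting gradient, cloud-side verification, and model integration. As a minimal sketch of that flow (not the patent's implementation), the simulation below uses an HMAC as a stand-in for the digital/group signature and simple gradient averaging as a stand-in for the model integration algorithm; there is no real enclave, and all names and parameters are illustrative.

```python
import hmac, hashlib
import numpy as np

GROUP_KEY = b"shared-group-key"   # placeholder for the group-signature key material

def sign(key: bytes, gradient: np.ndarray) -> str:
    return hmac.new(key, gradient.tobytes(), hashlib.sha256).hexdigest()

def verify(key: bytes, gradient: np.ndarray, tag: str) -> bool:
    return hmac.compare_digest(sign(key, gradient), tag)

def local_round(params: np.ndarray, X: np.ndarray, y: np.ndarray):
    """Stands in for training inside the local security area: one least-squares
    gradient step on the user's private data, followed by signing the gradient."""
    gradient = X.T @ (X @ params - y) / len(X)
    return gradient, sign(GROUP_KEY, gradient)

def cloud_round(params: np.ndarray, submissions, lr: float = 0.1) -> np.ndarray:
    """Cloud verifies each signature, then integrates only verified gradients."""
    verified = [g for g, tag in submissions if verify(GROUP_KEY, g, tag)]
    return params - lr * np.mean(verified, axis=0)

# Three users, each with a private data set drawn from the same linear model.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
users = []
for _ in range(3):
    X = rng.standard_normal((20, 3))
    users.append((X, X @ true_w))

params = np.zeros(3)
for _ in range(50):
    submissions = [local_round(params, X, y) for X, y in users]
    params = cloud_round(params, submissions)
print(params.round(2))   # approaches [ 1. -2.  0.5]
```

In the actual scheme, the training and signing steps would run inside the hardware-backed security area, so a user could not skip training and submit a fabricated gradient, which is the integrity property the abstract claims.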

Description

Technical Field
[0001] The invention belongs to the field of data security, and in particular relates to a federated learning method based on a trusted execution environment.
Background
[0002] Machine learning based on private data has achieved good results in practical applications. Many companies, such as Google, Facebook, and Apple, have collected massive training data from users and applied powerful GPU computing power to deploy deep learning algorithms. In order to obtain deeper models, many companies are willing to collect complementary data and collaborate in training. However, training machine learning models by directly gathering raw data sets from multiple users hides huge challenges: issues such as private data security, data poisoning, and false reporting of data volume.
[0003] The invention patent with publication number CN108717514A, published on October 30, 2018, discloses a data privacy protection method and system in machine learning, wh...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F21/62, G06F21/64, G06N3/04, H04L9/32
CPC: G06F21/6245, G06F21/64, G06N3/04, H04L9/3255
Inventors: 李进, 陈煜, 罗芳, 李同
Owner: GUANGZHOU UNIVERSITY