
Training method of personalized model of distillation-based semi-supervised federated learning

A training method and semi-supervised technology, applied in the field of personalized model training, which solves problems such as uneven data quality across clients, the differing importance of the knowledge each client provides, and the lack of an effective aggregation method, with the effect of reducing the weight of low-quality knowledge and improving model performance.

Active Publication Date: 2021-08-03
GUANGXI NORMAL UNIV

AI Technical Summary

Problems solved by technology

[0004] However, to apply knowledge distillation technology to federated learning, it is necessary to ensure that distillation is performed on the same dataset, while in federated learning the local data of each client is different; how to construct the same dataset on every client so that distillation can be carried out is therefore a problem.
In federated learning, because data quality is uneven across clients, the knowledge provided by the model outputs of different clients has different degrees of importance, so simply averaging those outputs is not an effective aggregation method.
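To make the aggregation issue concrete, here is a minimal sketch contrasting simple averaging with importance-weighted aggregation of client predictions on a shared dataset. The weights and predictions below are synthetic placeholders, not the patent's actual weighting rule, which derives importance dynamically from each client's knowledge:

```python
import numpy as np

rng = np.random.default_rng(0)

# Soft predictions of K=3 clients on the same shared (public) dataset:
# shape (K, N_p, num_classes); each row is a per-class probability vector.
client_preds = rng.dirichlet(np.ones(10), size=(3, 1000))

# Naive aggregation: a simple average treats every client's knowledge equally,
# even though clients with low-quality data contribute noisier predictions.
avg_pred = client_preds.mean(axis=0)

# Importance-weighted aggregation: higher-quality clients contribute more.
# These weights are illustrative assumptions only.
weights = np.array([0.5, 0.3, 0.2])                          # sums to 1
weighted_pred = np.tensordot(weights, client_preds, axes=1)  # (N_p, num_classes)
```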




Embodiment Construction

[0023] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with specific examples.

[0024] We define K clients, where each client k ∈ K owns a local dataset D_k, and D_k consists of a labeled local dataset D_k^l (with N_l samples) and an unlabeled local dataset D_k^u (with N_u samples). The local data of each client k tends toward a different distribution, and N_u >> N_l. In order for the client models to make observations on the same dataset, we additionally share the same unlabeled public dataset D_p (with N_p samples) on each client, where N_p >> N_l.
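A minimal synthetic sketch of this data setup; the helper make_client_data, the feature dimension, and all sizes are illustrative assumptions, not values from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

N_l, N_u, N_p = 100, 5000, 20000   # N_u >> N_l and N_p >> N_l, as in [0024]

def make_client_data(rng, n_labeled, n_unlabeled, dim=32, num_classes=2):
    """Synthetic stand-in for one client's local dataset D_k = D_k^l ∪ D_k^u."""
    labeled = {
        "x": rng.normal(size=(n_labeled, dim)),
        "y": rng.integers(0, num_classes, size=n_labeled),
    }
    unlabeled = {"x": rng.normal(size=(n_unlabeled, dim))}
    return labeled, unlabeled

# Each of K clients holds its own (differently distributed) local data ...
clients = [make_client_data(rng, N_l, N_u) for _ in range(3)]

# ... while every client also receives the same unlabeled public dataset D_p,
# so that all client models can be distilled on identical inputs.
shared_public = {"x": rng.normal(size=(N_p, 32))}
```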

[0025] Taking the medical scene as an example, the clients participating in the federated learning training are hospitals in different regions, and the local datasets are medical imaging datasets, such as neuroimaging data for Alzheimer's disease. The label of each data item indicates whether the patient is sick or not.

[0026] See Figure 1, a training method of a personalized model of distillation-based semi-supervised...



Abstract

The invention discloses a training method of a personalized model of distillation-based semi-supervised federated learning. By adopting knowledge distillation technology, each client uploads model predictions rather than model parameters, so a client can choose its own self-designed model architecture and its private information about the model is well protected; the shared data and the client's local data are used for training together, which greatly improves the generalization ability of the model. In addition, the aggregation scheme can aggregate dynamically according to the importance of the knowledge provided by each client, so that the aggregated model prediction better fuses the clients' model knowledge. Finally, after the server finishes aggregation, what is transmitted for the public data is pseudo-label information rather than the full model prediction distribution, which further improves communication transmission efficiency.
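As an illustration of the client-side distillation step implied by the abstract, here is a minimal PyTorch-style sketch. The temperature, optimizer, and KL formulation are common distillation choices assumed here, not details confirmed by the patent:

```python
import torch
import torch.nn.functional as F

def distill_on_public_data(model, public_x, aggregated_pred, T=2.0, lr=1e-3):
    """One distillation pass: fit the local model to the server-aggregated
    soft predictions on the shared public data (knowledge distillation)."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    logits = model(public_x)
    # KL divergence between the student's tempered distribution and the
    # aggregated teacher distribution; T*T rescales the gradient magnitude.
    # Per the abstract, the server may instead send compact pseudo labels
    # (argmax of the aggregated distribution) for communication efficiency,
    # in which case F.cross_entropy on those labels replaces this KL term.
    loss = F.kl_div(
        F.log_softmax(logits / T, dim=1),
        aggregated_pred,          # already a probability distribution
        reduction="batchmean",
    ) * (T * T)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

Because only predictions on the shared data cross the network, each client remains free to choose its own model architecture, which is what enables the personalization the abstract describes.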

Description

Technical field

[0001] The invention relates to the technical field of federated learning, in particular to a training method of a personalized model of distillation-based semi-supervised federated learning.

Background technique

[0002] Federated learning collaboratively trains a global model on the premise that a group of clients do not upload their local datasets, and each participant can only access its own data, thereby protecting the privacy of the users participating in training. Because of these advantages, federated learning has broad application prospects in industries such as medicine, finance, and artificial intelligence, and has become a research hotspot in recent years. However, federated learning focuses on obtaining a high-quality global model by learning from the local data of all participating clients; since the data of each client is heterogeneous in real scenarios, it cannot train a single global model that suits all clients. [00...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G16H10/60; G16H50/50; G06F30/27; G06N3/04; G06N3/08; G06F111/08
CPC: G16H10/60; G16H50/50; G06F30/27; G06N3/04; G06N3/08; G06F2111/08
Inventor: Gong Yanxia, Liang Yuan, Li Xianxian, Ouyang Yang
Owner: GUANGXI NORMAL UNIV