A nested meta-learning method and system based on federated architecture

A meta-learning and federated-learning technology, applied in the field of machine learning, which addresses problems such as the slow aggregation of the global model and excessive communication overhead, achieving improved generalization and performance, good generalization ability, and high flexibility.

Active Publication Date: 2022-06-03
军事科学院系统工程研究院网络信息研究所


Problems solved by technology

Some researchers have proposed a personalized federated averaging algorithm: within the federated averaging algorithm, each user trains a meta-learning model locally and then passes the parameters to the server for aggregation, which improves the generalization ability of the model; however, the performance of the global model and its aggregation speed still need improvement. A similar line of research is the federated meta-learning algorithm, in which a meta-learning algorithm is used within the federated stochastic gradient descent architecture to obtain each user's local gradient and update the model parameters; but, as noted above, federated stochastic gradient descent requires frequent communication between users and the server, resulting in excessive communication overhead for that scheme.

Method used




Embodiment approach

[0084] Nested Meta-Learning Algorithm Flow

[0085] Input: N, the number of users; the global meta-learning algorithm; the local meta-learning algorithm; the global learning rate; the local learning rate; the number of episode rounds for global training; the number of episode rounds for local training.

[0086] Output: the parameters of the global model.

[0087] for each global episode round do

[0088] # perform local meta-learning

[0089] pick a subset of users from the full set of users U;

[0090] for each selected user u do

[0091] assign the global model parameters to the local model;

[0092] for each local episode round do

[0093] user u samples from its own data to construct a local meta-learning task;

[0094] compute the gradient from training on the meta-learning task;

[0095] update the local meta-learning model parameters;

[0096] end for

[0097] end for

[0098] the global model parameters are first set to the average of all local model parameters;

[0099] # perform global meta-learning

[0...
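The two-phase round above (local meta-learning followed by averaging and a global meta-gradient step) can be sketched as follows. Everything in this sketch is an illustrative assumption, not the patent's concrete algorithm: the tasks are toy quadratic objectives, the local learner is a plain gradient-descent (Reptile-style) update, and the client counts and learning rates are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def task_grad(theta, task_optimum):
    # Gradient of a toy quadratic loss 0.5 * ||theta - task_optimum||^2.
    return theta - task_optimum

def local_meta_learning(theta_global, tasks, local_lr, local_episodes):
    # Phase 1 (client side): refine a copy of the global parameters on
    # locally sampled meta-learning tasks.
    theta = theta_global.copy()
    for _ in range(local_episodes):
        task = tasks[rng.integers(len(tasks))]  # sample a local task
        theta = theta - local_lr * task_grad(theta, task)
    return theta

def nested_round(theta, client_tasks, m1, m2, local_lr, global_lr,
                 local_episodes):
    clients = list(client_tasks)
    # Phase 1 (server side): m1 clients train locally from the current
    # global parameters; the server averages their resulting parameters.
    subset1 = rng.choice(clients, size=m1, replace=False)
    local_params = [local_meta_learning(theta, client_tasks[c],
                                        local_lr, local_episodes)
                    for c in subset1]
    theta = np.mean(local_params, axis=0)
    # Phase 2: m2 clients evaluate a global-subtask gradient at the averaged
    # parameters; the server adjusts them with one step of the mean gradient.
    subset2 = rng.choice(clients, size=m2, replace=False)
    grads = [task_grad(theta, client_tasks[c][0]) for c in subset2]
    return theta - global_lr * np.mean(grads, axis=0)

# Usage: 5 clients whose toy task optima cluster around 1.0; the global
# parameter drifts from 0 toward that cluster over the rounds.
client_tasks = {c: 1.0 + 0.1 * rng.standard_normal(3) for c in range(5)}
theta = np.zeros(1)
for _ in range(30):
    theta = nested_round(theta, client_tasks, m1=3, m2=2,
                         local_lr=0.1, global_lr=0.5, local_episodes=5)
```

Only model parameters and subtask gradients cross the network in this scheme, never raw client data, which is what keeps the federated privacy property while the outer meta-step improves generalization.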



Abstract

The present invention proposes a nested meta-learning method and system based on a federated architecture. The method includes: selecting m1 clients from N clients; the m1 clients each train local model parameters based on their local data and the global model parameters in the current state, where N and m1 are positive integers and N ≥ m1; the central server updates the global model parameters according to the m1 received sets of local model parameters; selecting m2 clients from the N clients; the m2 clients each determine a global subtask based on their local data and the updated global model parameters, and compute the parameter gradient generated by the global subtask through the learning objective function, where m2 is a positive integer and N ≥ m2; and the central server adjusts the updated global model parameters according to the m2 received parameter gradients of the clients' global subtasks.
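As a restatement of one round of the method, the two stages can be written as follows. The symbols here are illustrative assumptions, not the patent's notation: S1 and S2 are the sampled client subsets of sizes m1 and m2, D_u is user u's local data, β is the global learning rate, and L_u is the learning objective on user u's global subtask.

```latex
% Phase 1: average the m_1 locally meta-trained parameter sets
\tilde{\theta} = \frac{1}{m_1} \sum_{u \in S_1} \theta_u,
\qquad \theta_u = \mathrm{LocalMetaLearn}(\theta, D_u)

% Phase 2: adjust with the mean gradient of the m_2 global subtasks
\theta \leftarrow \tilde{\theta} - \beta \cdot \frac{1}{m_2}
\sum_{u \in S_2} \nabla_{\tilde{\theta}} \mathcal{L}_u\left(\tilde{\theta}\right)
```

The nesting is visible in the two stages: the inner (local) meta-learning produces the θ_u that are averaged, and the outer (global) meta-learning corrects that average with gradients evaluated at the averaged point.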

Description

technical field [0001] The invention belongs to the field of machine learning, and in particular relates to a nested meta-learning method and system based on a federated architecture. Background technique [0002] Federated learning is a decentralized machine learning architecture designed to learn models from distributed mobile devices; it solves the problem that data centers do not always have access to large-scale training data. Furthermore, since the data on each mobile device is privacy-sensitive, federated learning also has significant privacy-preserving advantages over the centralized machine-learning training process. [0003] Few-shot learning can quickly grasp new target concepts from only a small amount of data. It explores how to use the knowledge and experience summarized from existing samples to solve new problems when the sample size of a new target category is very small. Meta-learning is one of the current mainstream solutions for...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): H04L67/01; G06N3/08; G06N3/04; G06K9/62
CPC: Y02D10/00
Inventor: 张洪广, 杨林, 马琳茹, 杨雄军, 刘錞
Owner: 军事科学院系统工程研究院网络信息研究所