
Lightweight distributed federated learning system and method

A lightweight distributed federated learning system and method, in the field of distributed machine learning. It addresses problems such as the low efficiency of existing federated learning and the inability to reuse existing open-source libraries, and achieves a short development cycle, low development cost, and guaranteed data security.

Pending Publication Date: 2021-01-15
Applicant: 成都数融科技有限公司

AI Technical Summary

Problems solved by technology

[0004] The present invention provides a lightweight distributed federated learning framework and implementation method to solve several problems of existing federated learning: the need to re-implement low-level components, the inability to reuse the large number of existing open-source libraries, the low efficiency of customizing federated learning for models that are not trained by gradient methods, and the low efficiency of customizing federated learning for different businesses.



Examples


Embodiment 1

[0053] As shown in Figures 1 and 2, a lightweight distributed federated learning system includes a main control end node and multiple cooperative end nodes, and further includes:

[0054] Feature processing module: used by the main control end node to schedule each cooperative end node to perform joint feature processing through the feature preprocessing interface;

[0055] Model training module: used by the main control end node to schedule each cooperative end node to perform federated model training through the model training interface;

[0056] Model evaluation module: used by the main control end node to aggregate the prediction results of each cooperative end node and evaluate model performance through the model evaluation interface.

[0057] In this embodiment, the interaction between nodes does not involve specific private data, only irreversible intermediate data, effectively ensuring data security. In the machine learning task life cy...
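The three modules above can be sketched as a minimal master/worker protocol. All class and method names below are hypothetical illustrations, not the patent's actual API, and the aggregation shown is simple sample-weighted parameter averaging standing in for whatever aggregation the patent uses; note that only aggregate (irreversible) intermediates ever cross node boundaries.

```python
import numpy as np

class CooperativeNode:
    """A data-holding participant. Raw data never leaves the node;
    only derived (irreversible) intermediate results are returned."""
    def __init__(self, X, y):
        self.X, self.y = X, y

    # Feature preprocessing interface: return only aggregate statistics.
    def feature_stats(self):
        return self.X.mean(axis=0), len(self.X)

    def apply_centering(self, global_mean):
        self.X = self.X - global_mean

    # Model training interface: fit a local linear model, return parameters.
    def train_local(self):
        w, *_ = np.linalg.lstsq(self.X, self.y, rcond=None)
        return w, len(self.X)

    # Model evaluation interface: return a local error metric, not data.
    def evaluate(self, w):
        return float(np.mean((self.X @ w - self.y) ** 2)), len(self.X)

class MasterNode:
    """Schedules cooperative nodes through the three interfaces and
    aggregates their intermediate results."""
    def __init__(self, workers):
        self.workers = workers

    def joint_feature_processing(self):
        stats = [w.feature_stats() for w in self.workers]
        total = sum(n for _, n in stats)
        global_mean = sum(m * n for m, n in stats) / total
        for w in self.workers:
            w.apply_centering(global_mean)

    def federated_training(self):
        results = [w.train_local() for w in self.workers]
        total = sum(n for _, n in results)
        return sum(w * n for w, n in results) / total  # weighted average

    def model_evaluation(self, w_global):
        results = [w.evaluate(w_global) for w in self.workers]
        total = sum(n for _, n in results)
        return sum(e * n for e, n in results) / total
```

Because every interface returns only counts, means, parameter vectors, or error values, the master can be implemented against any local learning library that can produce these aggregates.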

Embodiment 2

[0079] In this embodiment, joint feature processing includes, but is not limited to, missing-value processing, outlier processing, standardization, normalization, binarization, numericalization, one-hot encoding, polynomial feature construction, and the like. In an optional implementation, if the training parameters include a cross-validation method, then each cooperative end no longer trains only a single model when training locally: it splits its data set with a fixed cross-validation scheme and trains multiple models simultaneously. When communicating with the main control node, it transmits the parameters of all the models at the same time, achieving cross-validation while reducing the number of communications between nodes.
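The cross-validation idea above can be sketched as follows. This is a hypothetical illustration (all function names are mine, and plain least-squares stands in for the patent's unspecified local learner): because the fold split is fixed and seeded, every node partitions identically, each node trains k models locally, and all k parameter sets travel to the master in a single message, so cross-validation adds no extra communication rounds.

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Fixed, seeded K-fold split so every node partitions identically."""
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)

def train_local_cv(X, y, k=3, seed=0):
    """Train k models locally (one per held-out fold) and return all k
    parameter vectors, to be sent to the master in a single message."""
    folds = kfold_indices(len(X), k, seed)
    params = []
    for i in range(k):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        w, *_ = np.linalg.lstsq(X[train_idx], y[train_idx], rcond=None)
        params.append(w)
    return np.stack(params)              # shape (k, n_features)

def master_aggregate_cv(messages):
    """One communication round: average the i-th fold's model across all
    nodes, weighted by node sample counts, for all k folds at once."""
    total = sum(n for _, n in messages)
    return sum(p * n for p, n in messages) / total   # shape (k, n_features)
```

One aggregation call thus yields k candidate global models, whose held-out errors the master can compare before choosing one.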

[0080] In an optional implementation, the main control end can perform model selection. The selection process includes: the main control end node iteratively updates the model hyperparameter com...
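The truncated passage above describes the master iterating over hyperparameter combinations. A minimal sketch of that loop, with all names hypothetical and ridge regression's regularization strength standing in for the patent's unspecified hyperparameters: the master runs one federated round per combination and keeps the combination with the lowest aggregated error.

```python
import numpy as np
from itertools import product

def local_ridge(X, y, lam):
    """Local closed-form ridge fit; only the parameter vector leaves the node."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def federated_model_selection(datasets, grid):
    """The master iterates over hyperparameter combinations, runs one
    federated averaging round per combination, and keeps the combination
    with the lowest error aggregated over each node's local data."""
    best = (None, np.inf)
    for (lam,) in product(grid):
        params = [(local_ridge(X, y, lam), len(X)) for X, y in datasets]
        total = sum(n for _, n in params)
        w = sum(p * n for p, n in params) / total
        err = sum(np.mean((X @ w - y) ** 2) * len(X) for X, y in datasets) / total
        if err < best[1]:
            best = (lam, err)
    return best
```

A real deployment would score each combination on held-out data rather than the training data used here; this sketch only shows the iteration structure.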

Embodiment 3

[0082] In this embodiment, each cooperative end holds its own local sample data. The feature sets overlap substantially, but the sample users overlap little, and federated learning is performed on the overlapping features; this training method is called horizontal federated learning. In an optional implementation, the feature sets of the cooperative ends are not identical, and federated learning is performed on the overlapping sample users; this training method is called vertical federated learning.
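The two data layouts can be made concrete with a toy example (all users, features, and node names below are synthetic illustrations): horizontal federated learning splits one logical table by rows (same columns, different users), vertical federated learning splits it by columns (same users, different features).

```python
features = ["age", "income", "score"]
users = ["u1", "u2", "u3", "u4"]

# Horizontal federated learning: nodes share the SAME feature columns
# but hold (nearly) DISJOINT sets of sample users.
horiz_node1 = {"users": users[:2], "features": features}   # rows u1, u2
horiz_node2 = {"users": users[2:], "features": features}   # rows u3, u4

# Vertical federated learning: nodes share (mostly) the SAME users
# but hold DIFFERENT feature columns.
vert_node1 = {"users": users, "features": features[:2]}    # age, income
vert_node2 = {"users": users, "features": features[2:]}    # score

def overlap(a, b):
    """The set of items two nodes have in common."""
    return sorted(set(a) & set(b))
```

In the horizontal case the training overlap is the shared feature set; in the vertical case it is the shared user set.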

[0083] To aid understanding, here is an example: as shown in Figure 5, the sample dimensions and features of cooperative end node 1 and cooperative end node 2 partially overlap. First, the intersection of the feature data x1 and x2 of cooperative end nodes 1 and 2 is determined, and then horizontal federated learning is performed. As shown in Figure 4, for the common features, in the joint feature processing the two cooperative end nodes perfo...
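The two steps of this example can be sketched as follows (function names are hypothetical, and joint standardization stands in for whatever joint feature processing the truncated text goes on to describe): first intersect the nodes' feature sets, then jointly standardize the common columns, with each node contributing only counts, sums, and sums of squares, which are irreversible aggregates in keeping with paragraph [0057].

```python
import numpy as np

def common_features(cols1, cols2):
    """Step 1: determine the intersection of the two nodes' feature sets."""
    return [c for c in cols1 if c in cols2]   # preserve node-1 ordering

def joint_standardize(tables):
    """Step 2: joint standardization over the common columns. Each node
    contributes only (count, sum, sum of squares) and applies the
    resulting global mean and std locally; no raw rows are exchanged."""
    n = sum(X.shape[0] for X in tables)
    s = sum(X.sum(axis=0) for X in tables)
    ss = sum((X ** 2).sum(axis=0) for X in tables)
    mean = s / n
    std = np.sqrt(ss / n - mean ** 2)
    return [(X - mean) / std for X in tables], mean, std
```

After this step the common columns are on a shared scale across nodes, so horizontal federated training can proceed on them directly.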



Abstract

The invention discloses a lightweight distributed federated learning system and method. The system comprises a main control end node, a plurality of cooperative end nodes, and a feature processing module used by the main control end to schedule all cooperative end nodes for joint feature processing through a feature preprocessing interface; a model training module used by the main control end to schedule each cooperative end node to perform federated model training through a model training interface; and a model evaluation module used by the main control end to aggregate the prediction results of each cooperative end node and evaluate model performance through a model evaluation interface. The beneficial effects of the invention are that it achieves quick integration of various open-source machine learning libraries through the feature processing, model training, and model evaluation modules; the framework can be used whether or not a federated learning model is trained by gradients; the development cycle and cost are low for different services; rapid deployment can be realized; and the data security of each participant is guaranteed.

Description

Technical field

[0001] The invention relates to the field of machine learning, in particular to a lightweight distributed federated learning system and method.

Background technique

[0002] With the development of the big data era, more and more attention is paid to data security, and regulations are constantly improving. Because federated learning technology can guarantee data privacy and security, it too is receiving more and more attention. Federated learning refers to multiple clients performing joint modeling (machine learning or deep learning models) such that, throughout the learning process, no client exposes its local data to the other parties, which ensures data privacy and security.

[0003] In existing federated learning technology, model training is mostly based on gradient values; that is, it relies on models that can be trained by gradients. For models that cannot be trained by gradients, the federated learning process needs to...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N20/00
CPC: G06N20/00
Inventors: 顾见军, 邓旭宏, 周宇峰
Owner: 成都数融科技有限公司