
Federated learning data processing system based on gradient compression

A federated learning and gradient compression technology, applied in the computer field, addressing problems such as the large time cost of server-client communication, long total model training time, and low model training efficiency.

Active Publication Date: 2021-05-18
上海嗨普智能信息科技股份有限公司 +1

AI Technical Summary

Problems solved by technology

[0003] However, federated learning involves frequent communication between the server and the clients during training. Compared with the time spent on model training at the clients, the communication between the server and the clients takes more time, making the total model training time long and the model training efficiency low.


Image

  • Drawings: federated learning data processing system based on gradient compression
Embodiment Construction

[0017] In order to further explain the technical means and effects by which the present invention achieves its intended purpose, a specific implementation of the gradient-compression-based federated learning data processing system proposed by the present invention, together with its effects, is described in detail below with reference to the accompanying drawings and preferred embodiments.

[0018] An embodiment of the present invention provides a federated learning data processing system based on gradient compression. As shown in Figure 1, it includes a server, M clients, a processor, and a memory storing a computer program. A first database and a second database are stored on the server. The fields of the first database include the client id and the latest round of federated aggregation in which that client participated; the fields of the second database include the round of federated aggregation and the global model corresponding to that round...
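As a minimal sketch of the two server-side databases described in this embodiment, the following Python structure maps each client id to the last aggregation round it joined and each round to the corresponding global model. The field names, types, and methods here are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch of the server-side state described in [0018].
# All names and types are hypothetical; the patent does not specify them.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ServerState:
    # First database: client id -> latest round of federated aggregation joined
    last_round_by_client: Dict[str, int] = field(default_factory=dict)
    # Second database: aggregation round -> global model parameters for that round
    global_model_by_round: Dict[int, List[float]] = field(default_factory=dict)

    def record_participation(self, client_id: str, round_num: int) -> None:
        """Update the first database when a client joins an aggregation round."""
        self.last_round_by_client[client_id] = round_num

    def save_global_model(self, round_num: int, model: List[float]) -> None:
        """Update the second database after an aggregation round completes."""
        self.global_model_by_round[round_num] = model


# Both tables are updated dynamically as training proceeds.
state = ServerState()
state.save_global_model(0, [0.0, 0.0, 0.0])
state.record_participation("client-7", 1)
```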



Abstract

The invention relates to a federated learning data processing system based on gradient compression, comprising a server, M clients, a processor, and a memory in which a computer program is stored. The server stores a first database and a second database. The fields of the first database comprise a client id and the round in which the client last participated in federated aggregation; the fields of the second database comprise the round of federated aggregation and the global model corresponding to that round. Both databases are dynamically updated as the federated aggregation model is trained. According to the invention, the number of bytes transmitted between the server and the clients is reduced, so the time consumed by server-client communication decreases and the efficiency of federated aggregation model training improves.
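The abstract claims that the byte count of server-client traffic is reduced, but the text visible here does not specify the exact compression scheme. The sketch below shows one common gradient-compression technique, top-k sparsification, purely to illustrate how the transmitted payload can shrink; the function names and parameters are assumptions, not the patent's method.

```python
# Illustrative top-k gradient sparsification: send only the k largest-magnitude
# entries as (index, value) pairs instead of the full dense gradient.
# This is a generic technique, not necessarily the scheme claimed in the patent.

import numpy as np


def compress_topk(gradient: np.ndarray, k: int):
    """Keep the k largest-magnitude entries; return (indices, values) to transmit."""
    idx = np.argpartition(np.abs(gradient), -k)[-k:]
    return idx.astype(np.int32), gradient[idx].astype(np.float32)


def decompress_topk(idx: np.ndarray, vals: np.ndarray, size: int) -> np.ndarray:
    """Rebuild a dense gradient with zeros everywhere except the kept entries."""
    dense = np.zeros(size, dtype=np.float32)
    dense[idx] = vals
    return dense


grad = np.random.randn(1_000_000).astype(np.float32)   # ~4 MB if sent dense
idx, vals = compress_topk(grad, k=10_000)               # ~80 KB sparse payload
restored = decompress_topk(idx, vals, grad.size)
print(grad.nbytes, idx.nbytes + vals.nbytes)             # bytes: dense vs. compressed
```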

Description

Technical field

[0001] The invention relates to the field of computer technology, and in particular to a federated learning data processing system based on gradient compression.

Background technique

[0002] Federated learning is a machine learning setting in which, on the premise that the training data stays distributed across the clients and never leaves each client's local environment, multiple clients cooperatively train a model under the coordination of a server. Training the whole model is an iterative process involving several rounds of communication between the server and the clients. In each round, the server randomly selects several clients and sends the latest federated averaging model saved on the server to all selected clients. Each client performs model training on its local data, updates the model parameters several times, and uploads its model update together with the total number of samples in its local training set to the server. After receiving the model updates fr...
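As a rough illustration of the federated-averaging round outlined in the background above, the following Python sketch selects clients at random, collects (parameters, local sample count) pairs, and takes a sample-weighted average. The local-training stub and all names are illustrative assumptions, not the patent's implementation.

```python
# Sketch of one federated-averaging round: select clients, train locally,
# upload (update, sample count), aggregate with a sample-weighted average.

import random
from typing import List, Tuple


def local_update(global_model: List[float], client_id: str) -> Tuple[List[float], int]:
    """Placeholder for local training; returns updated parameters and sample count."""
    n_samples = random.randint(50, 500)
    updated = [w + random.gauss(0.0, 0.01) for w in global_model]
    return updated, n_samples


def federated_average(updates: List[Tuple[List[float], int]]) -> List[float]:
    """Weight each client's parameters by its number of local training samples."""
    total = sum(n for _, n in updates)
    dim = len(updates[0][0])
    return [sum(params[i] * n for params, n in updates) / total for i in range(dim)]


def run_round(global_model: List[float], clients: List[str], m: int) -> List[float]:
    selected = random.sample(clients, m)          # server randomly selects m clients
    updates = [local_update(global_model, c) for c in selected]
    return federated_average(updates)             # new global model for this round


new_model = run_round([0.0] * 4, [f"client-{i}" for i in range(10)], m=3)
```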


Application Information

IPC(8): G06F16/21; G06F16/23; G06N20/00
CPC: G06F16/217; G06F16/23; G06N20/00
Inventor: 蔡文渊, 叶田地, 高明, 钱卫宁, 周傲英, 顾海林, 徐林昊, 孙嘉, 袁国玮
Owner: 上海嗨普智能信息科技股份有限公司