A Federated Learning Adaptive Gradient Quantization Method

A quantization method and adaptive technique, applied to ensemble learning, adjustment of transmission modes, network traffic/resource management, and similar fields, which can solve problems such as the large difference in communication time between fast and slow nodes, decreased model accuracy, and a slowed-down training process.

Active Publication Date: 2022-06-03
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

When the required global model is large, network bandwidth limitations and the number of working nodes exacerbate the communication bottleneck of federated learning and slow down the overall training process, while heterogeneous and dynamic networks cause client devices to drop out, giving rise to the straggler problem.
If a gradient quantization algorithm with uniform precision is adopted in this situation, the communication times of fast and slow nodes differ greatly; fast nodes waiting for slow nodes to complete parameter synchronization wastes considerable computing and communication resources and intensifies the straggler problem.
At the same time, if nodes with good link status use the same low-precision quantized gradients as nodes with poor link status, the accuracy of the final trained model also decreases. A rough calculation of the upload-time gap under uniform quantization is sketched below.
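To make the gap concrete, here is a quick back-of-the-envelope calculation in Python; the payload size and the two bandwidth figures are illustrative numbers, not values from the patent.

# Illustrative only: a 10M-parameter model quantized to a uniform 4 bits per
# parameter is the same payload for every node, so upload time scales
# inversely with each node's link bandwidth.
payload_bits = 10_000_000 * 4
for name, mbps in [("fast node", 100), ("slow node", 5)]:
    seconds = payload_bits / (mbps * 1e6)
    print(f"{name}: {seconds:.1f} s per upload")
# fast node: 0.4 s per upload
# slow node: 8.0 s per upload -> the fast node idles while waiting to synchronize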

Method used



Embodiment Construction

[0068]

[0071]

[0072] wherein ⌈·⌉ denotes the round-up (ceiling) operation.
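The formula this remark refers to is not preserved in the excerpt; only the use of a round-up operation survives. As one plausible place a ceiling appears, the sketch below maps a node's measured bandwidth to an integer quantization bit-width; the mapping rule and all parameter names (reference_mbps, min_bits, max_bits) are assumptions for illustration, not the patent's formula.

import math

def select_bits(bandwidth_mbps: float,
                reference_mbps: float = 100.0,
                min_bits: int = 2,
                max_bits: int = 8) -> int:
    # Scale the bit-width with the node's share of a reference bandwidth,
    # round up to the next integer, and clamp to [min_bits, max_bits].
    raw = max_bits * min(bandwidth_mbps, reference_mbps) / reference_mbps
    return max(min_bits, math.ceil(raw))

# Example: a 30 Mbps link -> ceil(2.4) = 3 bits; a 5 Mbps link -> 2 bits.
print(select_bits(30.0), select_bits(5.0))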

[0080]

[0083] Q

[0087]

[0090]

[0094]

[0095] where Q_agg = (1/N) · Σ_{k=1..N} Q(g_k) is the aggregated gradient, N is the number of working nodes, k indexes the working nodes, and Q(g_k) is the quantized local gradient uploaded by working node k.
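The aggregation expressed above amounts to a uniform average of the quantized gradients collected by the parameter server. A minimal sketch of that server-side step follows; the function name and the uniform weighting are assumptions consistent with the surrounding text, not code from the patent.

import numpy as np

def aggregate(quantized_grads: list) -> np.ndarray:
    # Uniform average of the (dequantized) gradients uploaded by the N workers.
    return sum(quantized_grads) / len(quantized_grads)

# Example with three workers' gradients:
grads = [np.array([0.5, -1.0]), np.array([0.25, -0.75]), np.array([0.75, -1.25])]
print(aggregate(grads))   # -> [ 0.5 -1. ]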

[0097]

[0103] The present invention is described with reference to flowcharts and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the present invention.

[0107] Those of ordinary skill in the art will appreciate that the embodiments described herein are intended to assist the reader in understanding the present invention, and that the scope of protection of the present invention is not limited to such specific statements and embodiments.



Abstract

The invention discloses a federated learning adaptive gradient quantization method. The training samples and local model of each working node are initialized; each node trains its local model on its samples to obtain a local gradient and quantizes that gradient according to its own quantization level. The quantized local gradients are uploaded to the parameter server for gradient aggregation, and the aggregation result is transmitted back to every working node, which uses the quantized aggregated gradient to update its local model parameters. The method then checks whether the number of iteration rounds has reached a preset interval threshold: if so, each working node broadcasts its link status and adjusts its own quantization level accordingly; otherwise the iterative training continues until the preset stopping condition is reached and training ends. By adaptively adjusting the number of quantization bits of the gradient according to the real-time bandwidth of each node's link, the invention effectively alleviates the straggler problem: in addition to reducing communication overhead as traditional quantization methods do, it improves the utilization of bandwidth resources and achieves more efficient federated learning training.
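Read as an algorithm, the abstract describes one quantize-upload-aggregate-update cycle per communication round, plus a periodic link-status exchange after which each node re-selects its own quantization level. The self-contained sketch below follows that structure; the quadratic toy objective, the simple uniform quantizer, the bandwidth-to-bits rule, and all names are illustrative assumptions, not the patent's implementation.

import math
import numpy as np

rng = np.random.default_rng(0)

def quantize(grad, bits):
    # Simple uniform quantizer used as a stand-in for the patent's scheme.
    levels = 2 ** bits - 1
    scale = np.max(np.abs(grad)) + 1e-12
    return np.round(grad / scale * levels) / levels * scale

class Worker:
    # Toy working node minimizing 0.5 * ||w - target||^2 on private data.
    def __init__(self, dim):
        self.target = rng.normal(size=dim)   # stands in for local training samples
        self.model = np.zeros(dim)           # local model parameters
        self.bits = 4                        # this node's current quantization level

    def local_gradient(self):
        return self.model - self.target

def train(workers, rounds=100, lr=0.5, interval=10):
    for t in range(rounds):
        # 1) each worker trains locally and uploads a quantized local gradient
        uploads = [quantize(w.local_gradient(), w.bits) for w in workers]
        # 2) the parameter server aggregates and broadcasts the result
        agg = sum(uploads) / len(workers)
        # 3) every worker updates its local model with the aggregated gradient
        for w in workers:
            w.model -= lr * agg
        # 4) once the iteration count hits the preset interval threshold,
        #    exchange link status and re-select per-node quantization levels
        if (t + 1) % interval == 0:
            for w in workers:
                bw = rng.uniform(5, 100)                  # simulated bandwidth probe (Mbps)
                w.bits = max(2, math.ceil(8 * bw / 100))  # illustrative re-selection rule

workers = [Worker(dim=8) for _ in range(4)]
train(workers)
print("final model of worker 0:", np.round(workers[0].model, 3))

The point of step 4 is the difference from uniform-precision schemes: each node chooses its own bit-width from its observed link state, so nodes on slow links upload fewer bits per round instead of holding up synchronization.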

Description

A Federated Learning Adaptive Gradient Quantization Method

Technical Field

[0001] The present invention relates to the technical field of gradient quantization, and in particular to a federated learning adaptive gradient quantization method.

Background Technique

Due to the continuous expansion of data volume and model scale, traditional machine learning cannot meet application requirements, so distributed machine learning has become mainstream. To complete multi-machine cooperation, communication between nodes is essential. However, as models and neural networks grow ever larger, the amount of parameters to be transmitted in each round is also very large, which can make the communication time excessively long, or even cause the added communication time to offset the computation time saved by parallelism. Therefore, how to reduce the communication cost has become an important issue and a widely studied topic in the field of distribu...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L 1/00; H04W 28/06; G06N 20/20
CPC: H04L 1/0006; H04W 28/06; G06N 20/20; Y02D 30/50
Inventor: 范晨昱, 吴昊, 章小宁, 李永耀
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA