
Method for reducing distributed machine learning communication overhead

A technology relating to machine learning and communication overhead, applied to machine learning, instrumentation, resource allocation, etc.; it addresses problems such as prolonged communication time, and achieves the effects of reducing communication traffic and reducing communication overhead.

Active Publication Date: 2019-09-27
NANJING UNIV
Cites 5 · Cited by 21

AI Technical Summary

Problems solved by technology

However, as the scale of distributed clusters grows, the transfer of gradients and the synchronization of parameters prolong communication time and become a bottleneck to further efficiency gains.




Embodiment Construction

[0030] The present invention is further illustrated below in conjunction with specific embodiments. It should be understood that these embodiments are intended only to illustrate the invention, not to limit its scope. After reading the present disclosure, those skilled in the art will appreciate that various equivalent modifications of the invention fall within the scope defined by the appended claims of this application.

[0031] The method for reducing the communication overhead of distributed machine learning provided by the present invention can be applied to fields such as image classification and text classification, and is suitable for scenarios in which the dataset to be classified is large and the machine learning model has a large number of parameters. Taking the image classification application as an example, in the method of the present invention the training image data are distributed across the working nodes. ...
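As a small illustration of this setup, the sketch below evenly partitions shuffled sample indices across working nodes; the function name and the even-split scheme are assumptions for illustration, not details from the patent.

```python
import numpy as np

def partition_dataset(num_samples, num_workers, seed=0):
    """Shuffle sample indices and split them evenly across worker nodes."""
    rng = np.random.default_rng(seed)
    indices = rng.permutation(num_samples)
    return np.array_split(indices, num_workers)

# e.g. 60,000 training images spread over 8 working nodes
shards = partition_dataset(num_samples=60_000, num_workers=8)
```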



Abstract

The invention discloses a method for reducing communication overhead in distributed machine learning. The method is based on a parameter-server architecture and is suitable both for multi-machine cluster distributed machine learning in a data center and for end-cloud collaborative distributed machine learning in which a server acts as the cloud and mobile phones or embedded devices act as terminals. The method comprises the following steps: first, each working node computes its gradient, derives a global momentum from the parameter difference between two rounds, and sums the global momentum with the previous round's memory gradient to obtain a new round of memory gradient; a sparse part of the new memory gradient is sent to the server node, and the remainder is accumulated locally. The server node then accumulates all the sparse memory gradients to update the parameters and broadcasts the parameter difference between the two rounds to all working nodes. Finally, each working node receives the two-round parameter difference and updates its parameters. The method is based on global gradient compression: only part of the global momentum is transmitted when a working node communicates with the server node, thereby reducing the communication overhead of distributed machine learning.
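To make the communication pattern concrete, the following is a minimal Python sketch of such a round, assuming top-k sparsification as the way the transmitted "part" of the memory gradient is selected; the names (top_k_sparsify, Worker, Server) and the exact momentum approximation are illustrative assumptions, not the patent's definitive implementation.

```python
import numpy as np

def top_k_sparsify(v, k):
    """Keep the k largest-magnitude entries of v; return (sparse part, residual)."""
    idx = np.argpartition(np.abs(v), -k)[-k:]
    sparse = np.zeros_like(v)
    sparse[idx] = v[idx]
    return sparse, v - sparse

class Worker:
    def __init__(self, dim, k, lr, beta):
        self.memory = np.zeros(dim)  # untransmitted remainder ("memory gradient")
        self.w_prev = None           # parameters seen in the previous round
        self.k, self.lr, self.beta = k, lr, beta

    def local_step(self, w, grad):
        # Approximate the global momentum from the two-round parameter difference.
        momentum = np.zeros_like(w) if self.w_prev is None else (self.w_prev - w) / self.lr
        self.w_prev = w.copy()
        # New memory gradient = local gradient + global momentum + old remainder.
        full = grad + self.beta * momentum + self.memory
        sparse, self.memory = top_k_sparsify(full, self.k)  # accumulate the rest
        return sparse  # only the sparse part is sent to the server

class Server:
    def __init__(self, w0, lr):
        self.w, self.lr = w0.copy(), lr

    def step(self, sparse_updates):
        # Accumulate all sparse memory gradients, update the parameters,
        # and return the two-round parameter difference for broadcast.
        w_old = self.w.copy()
        self.w = self.w - self.lr * np.mean(sparse_updates, axis=0)
        return self.w - w_old

# One simulated training loop with stand-in random gradients.
dim, num_workers, k = 1000, 4, 10
server = Server(np.zeros(dim), lr=0.1)
workers = [Worker(dim, k, lr=0.1, beta=0.9) for _ in range(num_workers)]
w = server.w.copy()
for _ in range(100):
    updates = [node.local_step(w, np.random.randn(dim)) for node in workers]
    w = w + server.step(updates)  # each worker applies the broadcast difference
```

In a real deployment the sparse vectors would be sent as (index, value) pairs, which is where the savings come from: each round transmits k entries instead of all d.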

Description

Technical field

[0001] The invention provides a method for reducing the communication overhead of distributed machine learning, which can effectively reduce the communication overhead in distributed machine learning.

Background technique

[0002] Most machine learning models can be formalized as the following optimization problem:

[0003] $\min_{w \in \mathbb{R}^d} f(w) = \frac{1}{n} \sum_{i=1}^{n} f(w; \xi_i)$

[0004] where $w$ represents the parameters of the model, $d$ represents the model size, $n$ represents the total number of training samples, $\xi_i$ represents the i-th sample, and $f(w; \xi_i)$ represents the loss function corresponding to the i-th sample. To solve this optimization problem, stochastic gradient descent (SGD) and its variants are currently the most widely used methods. Among these variants, momentum SGD (MSGD) applies an exponentially weighted average to the gradients, which reduces the influence of the current gradient, dampens fluctuations, and converges more stably near the minimum. ...
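For reference, the MSGD update mentioned above is commonly written as follows; the momentum coefficient $\beta$ and learning rate $\eta$ are standard notation assumed here, not symbols taken from the patent text:

$$u_{t+1} = \beta\, u_t + \nabla f(w_t; \xi_{i_t}), \qquad w_{t+1} = w_t - \eta\, u_{t+1}$$

where $\xi_{i_t}$ is the sample (or mini-batch) drawn at iteration $t$.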


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F 9/50; G06N 20/00
CPC: G06F 9/5061; G06N 20/00
Inventors: Li Wujun (李武军), Xie Yinpeng (解银朋), Zhao Shenyi (赵申宜), Gao Hao (高昊)
Owner: NANJING UNIV