Decentralized parameter server optimization algorithm

A decentralized parameter server optimization technology in the field of deep learning, which can solve problems such as communication congestion, bandwidth waste, and their aggravation as network scale increases

Inactive Publication Date: 2019-04-09
SUN YAT SEN UNIV
Cites: 8 · Cited by: 12

AI Technical Summary

Problems solved by technology

With the increase of network scale and parallelism, if the hardware bandwidth cannot be increased in the same proportion, communication congestion and bandwidth waste are aggravated.



Examples


Embodiment

[0049] Traditional central parameter server algorithms are often used in network model training based on distributed deep learning. The present invention optimizes the traditional central parameter server algorithm by removing its parameter server, so that global network model training is carried out entirely within the communication network. The communication network has n working nodes, and the global network for deep learning training comprises M layers. The global network model is partitioned according to its layers, and each partition is assigned to a working node, which maintains and updates it through a parameter server thread. The data set for model training is divided into n blocks, and each block is individually assigned to a working node for local model training. Each working node adopts gradient descent; the parameter updates of iterative model training are carried out by the iterativ...
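The layer-wise model partition and n-way data split described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the function names and the round-robin assignment policy are assumptions for illustration.

```python
# Hypothetical sketch: an M-layer global model is split by layer across
# n working nodes, and the training data set is split into n blocks,
# one block per node. Assignment policy (round-robin) is illustrative.

def partition_layers(num_layers, num_workers):
    """Assign each layer index to a working node, as evenly as possible."""
    assignment = {w: [] for w in range(num_workers)}
    for layer in range(num_layers):
        assignment[layer % num_workers].append(layer)
    return assignment

def partition_data(dataset, num_workers):
    """Split the data set into num_workers contiguous blocks."""
    block = len(dataset) // num_workers
    return [dataset[w * block:(w + 1) * block] for w in range(num_workers)]

layers = partition_layers(num_layers=6, num_workers=3)
# → {0: [0, 3], 1: [1, 4], 2: [2, 5]}: worker 0 hosts the parameter
#   server thread for layers 0 and 3, and so on.
```

Each working node then trains locally on its own data block while serving parameter updates for the layers it hosts.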


PUM

No PUM

Abstract

The invention discloses a decentralized parameter server optimization algorithm. The parameter server of the prior-art centralized parameter server algorithm is removed, and the working threads of the original parameter server are dispersed into the working nodes. During model training, a stochastic gradient descent algorithm performs parameter updating over multiple iterations. Each iteration comprises the following steps: first, each working node computes gradients using forward propagation and backward propagation; then, each working node collects the gradients of the network layers belonging to the partial model handled by its dispersed parameter-server thread and updates those parameters with stochastic gradient descent; finally, the updated parameters of all working nodes are gathered to form the parameters of the global network model. After multiple such iterations, model training is complete.
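One iteration of the update loop described in the abstract can be sketched as below. Gradient computation is stubbed out (in practice it comes from forward and backward propagation on each node's data block), and a synchronous gradient-averaging variant is shown for clarity; the function and parameter names are assumptions, not the patent's API.

```python
# Illustrative single iteration of the decentralized parameter update.
# Each working node hosts a parameter-server thread for its assigned
# layers, collects those layers' gradients from every node, and applies
# an SGD step there; all nodes then pull the updated parameters.

def sgd_iteration(params, worker_grads, layers_of, lr=0.1):
    """params: {layer: value}. worker_grads: one dict per working node
    mapping layer -> gradient computed on that node's data block.
    layers_of: node id -> layers whose parameter-server thread it hosts."""
    new_params = {}
    for worker, owned in layers_of.items():
        for layer in owned:
            # The hosting node gathers this layer's gradient from every
            # node and applies a (synchronous) SGD step.
            grads = [g[layer] for g in worker_grads]
            new_params[layer] = params[layer] - lr * sum(grads) / len(grads)
    # Every node then fetches the updated layers, reassembling the
    # global network model for the next iteration.
    return new_params

params = {0: 1.0, 1: 2.0}
grads = [{0: 0.5, 1: 1.0}, {0: 1.5, 1: 1.0}]  # two working nodes
updated = sgd_iteration(params, grads, layers_of={0: [0], 1: [1]})
# layer 0: 1.0 - 0.1 * 1.0 = 0.9;  layer 1: 2.0 - 0.1 * 1.0 = 1.9
```

Because no single node hosts every layer, both the gradient traffic and the update computation are spread across the communication network instead of converging on one parameter server.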

Description

Technical field

[0001] The present invention relates to the technical field of deep learning, and more specifically to a decentralized parameter server optimization algorithm.

Background technique

[0002] At present, asynchronous data-parallel model training based on parameter servers is widely used in distributed deep learning frameworks. However, due to growing network scale and hardware bandwidth limitations, problems such as stalled updates and communication waiting often occur during training. Whether a single parameter server or a hierarchical parameter server design is used, the traffic of the entire communication network and the computation that applies gradient updates to the model are concentrated on the parameter server node. Communication traffic is concentrated at the center of the communication network, and computation during specific periods is likewise concentrated at that center, so the communication o...

Claims


Application Information

Patent Timeline
No application data
IPC(8): H04L12/24, H04L29/08
CPC: H04L41/0823, H04L67/10
Inventors: 李欣鑫 (Li Xinxin), 吴维刚 (Wu Weigang)
Owner SUN YAT SEN UNIV