
Stochastic gradient descent optimization method based on distributed coding

A stochastic gradient descent and distributed coding technology, applied to neural learning methods, resource allocation, computer components, etc., that can solve problems such as gradient delay and efficiency decline.

Active Publication Date: 2020-05-05
HOHAI UNIV

AI Technical Summary

Problems solved by technology

[0003] Purpose of the invention: The purpose of the invention is to provide an asynchronous stochastic gradient descent optimization method based on distributed coding and a node load-balancing strategy. The data communication generated when parameters are updated in the data exchange stage is encoded and optimized, and a load-balancing strategy is used to estimate the computing power of the nodes in real time so as to optimize task allocation between nodes. This alleviates the gradient delay problem and addresses the gradient delay and efficiency drop caused by heterogeneous computing nodes and communication load bottlenecks in large-scale distributed clusters.
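As a rough illustration of the load-balancing idea described above (a minimal sketch, not the patent's actual formulas; the throughput-estimation and allocation functions below are hypothetical), each node's recent samples-per-second rate can be tracked and the next batch of work split in proportion to it:

```python
def update_throughput(throughput, node, samples_done, elapsed_s, alpha=0.5):
    """Exponential moving average of a node's samples-per-second rate."""
    measured = samples_done / max(elapsed_s, 1e-9)
    prev = throughput.get(node, measured)
    throughput[node] = alpha * measured + (1 - alpha) * prev
    return throughput

def allocate_batch(throughput, batch_size):
    """Split batch_size samples across nodes in proportion to throughput."""
    total = sum(throughput.values())
    shares = {n: int(round(batch_size * v / total)) for n, v in throughput.items()}
    # Fix rounding so the shares sum exactly to batch_size.
    shares[max(throughput, key=throughput.get)] += batch_size - sum(shares.values())
    return shares

# Example: four heterogeneous workers that each just processed 90 samples.
timings = {"N1": 1.0, "N2": 1.5, "N3": 2.0, "N4": 3.0}   # seconds taken per node
tp = {}
for node, secs in timings.items():
    update_throughput(tp, node, 90, secs)
print(allocate_batch(tp, 360))   # faster nodes receive proportionally more work
```

In this sketch a slower node is automatically assigned fewer samples on the next iteration, which is the intended effect of the real-time capacity estimation mentioned above.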

Method used



Examples


Embodiment Construction

[0049] The specific embodiments of the present invention are described below in conjunction with the accompanying drawings.

[0050] As shown in Figures 1-2, the present invention designs a distributed-coding stochastic gradient descent optimization algorithm comprising the following steps:

[0051] Suppose an MNIST handwritten digit recognition neural network is to be trained on a distributed cluster with four computing nodes. The neural network is a fully connected multilayer perceptron with 6 layers in total. A redundancy setting of r = 2 is used, with a sample size of 60,000 and a batch size of 360.

[0052] Step 1: Arrange and combine the nodes N1, N2, N3, N4. With groups of size r = 2 there are a total of C(4, 2) = 6 combination schemes, denoted D1, D2, ..., D6, where D1 = {N1, N2}, D2 = {N1, N3}, ..., D6 = {N3, N4}. The combined result is recorded as the set {D1, D2, ..., D6}.
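A minimal sketch of this grouping step in Python (the variable names are illustrative, not taken from the patent): with four nodes and groups of size r = 2, the six combinations D1 through D6 can be enumerated directly.

```python
from itertools import combinations

nodes = ["N1", "N2", "N3", "N4"]
groups = list(combinations(nodes, 2))    # C(4, 2) = 6 two-node groups
for i, group in enumerate(groups, start=1):
    print(f"D{i} = {group}")             # D1 = ('N1', 'N2'), ..., D6 = ('N3', 'N4')
```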

[0053] Step 2: The above 60,000 samples were equally divided into 166 sample batches...
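A sketch of how the batch partition and its mapping onto the node groups might look (the round-robin assignment below is an assumption made for illustration, since the text is truncated before specifying it; only the counts 60,000, 360, and 166 come from the embodiment):

```python
from itertools import combinations

nodes = ["N1", "N2", "N3", "N4"]
groups = list(combinations(nodes, 2))            # the six groups D1..D6 from Step 1

num_samples, batch_size, num_batches = 60_000, 360, 166
indices = list(range(num_samples))
batches = [indices[i * batch_size:(i + 1) * batch_size] for i in range(num_batches)]

# Assign each batch to one two-node group (round-robin here), so every
# batch is processed by r = 2 nodes and a single straggler can be tolerated.
assignment = {i + 1: groups[i % len(groups)] for i in range(num_batches)}
print(len(batches[0]), assignment[1], assignment[6])   # 360 ('N1', 'N2') ('N3', 'N4')
```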



Abstract

The invention discloses a stochastic gradient descent optimization method based on distributed coding. To address the gradient delay and efficiency reduction caused by computing-node heterogeneity and communication load bottlenecks on a large-scale distributed cluster, the invention provides an asynchronous stochastic gradient descent optimization algorithm that adapts to node load balance based on distributed coding. Coding optimization is applied to the data communication generated during parameter updating in the data exchange stage, a load-balancing strategy is used to estimate node computing capacity in real time, task allocation between nodes is optimized, and the gradient delay problem is alleviated. The algorithm relieves the difficulty of converging the loss function of a deep neural network caused by gradient delay and better improves the training performance of large-scale neural networks, thereby improving the performance of the distributed neural network training algorithm.

Description

Technical field

[0001] The invention relates to distributed computing architectures, and in particular to a distributed stochastic gradient descent optimization method based on distributed coding.

Background technique

[0002] Neural network training methods based on the gradient descent algorithm have received extensive attention in recent years. However, because a single machine has an upper limit on achievable performance, distributed clusters are used to improve the speed of deep neural network training. At present, the widely used distributed computing method for deep neural networks is the asynchronous gradient descent method, which better ensures training accuracy compared with the parameter averaging method, but the total amount of communication required by asynchronous stochastic gradient descent is greatly increased compared with parameter averaging. At the same time, the asynchronous stochastic gradient descent method has a signi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50, G06F9/54, G06N3/08, G06K9/00
CPC: G06F9/5088, G06F9/542, G06N3/08, G06V30/32, G06V10/95
Inventor: 谢在鹏, 李博文, 张基, 朱晓瑞, 徐媛媛, 叶保留, 毛莺池
Owner: HOHAI UNIV