Adaptive learning rate schedule in distributed stochastic gradient descent

A machine learning technology relating to gradient-based optimization, applicable in machine learning, computer components, and character and pattern recognition.

Active Publication Date: 2019-10-11
IBM CORP

AI Technical Summary

Problems solved by technology

assigning a second processing job to a second model learner using the central parameter server




Embodiment Construction

[0011] In describing the exemplary embodiments of the invention illustrated in the drawings, specific terminology will be employed for the sake of clarity. However, the invention is not intended to be limited to this description or to any particular term, and it is to be understood that each element includes all equivalents.

[0012] Exemplary embodiments of the present invention may utilize a distributed approach to perform stochastic gradient descent (SGD), where a central parameter server (PS) is used to manage SGD as multiple learner machines process gradients in parallel. The parameter server updates the model parameters based on the results of the processed gradients, and the learner machines can then use the updated model parameters in processing subsequent gradients. In this sense, stochastic gradient descent is performed in a distributed fashion, and thus the process can be called distributed stochastic gradient descent.
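The parameter-server arrangement described in [0012] can be sketched in a few lines of Python. This is an illustrative toy, not the patent's implementation; the class and function names are invented, and a single process stands in for the asynchronous learner machines:

```python
import numpy as np

class ParameterServer:
    """Toy central parameter server for distributed SGD (illustrative only)."""
    def __init__(self, dim, lr=0.1):
        self.params = np.zeros(dim)
        self.version = 0          # incremented on every parameter update
        self.lr = lr

    def pull(self):
        # A learner fetches a snapshot of the current parameters and version.
        return self.params.copy(), self.version

    def push(self, gradient):
        # Apply a learner's returned gradient and advance the model version.
        self.params -= self.lr * gradient
        self.version += 1

def learner_gradient(params, x, y):
    # Gradient of squared error for a linear model y ~ params . x.
    return (params @ x - y) * x

# Simulated asynchronous round: two learners pull the same snapshot,
# then push their gradients in turn, so the second push is stale.
ps = ParameterServer(dim=2)
x1, y1 = np.array([1.0, 0.0]), 2.0
x2, y2 = np.array([0.0, 1.0]), -1.0

p1, v1 = ps.pull()                       # learner 1 snapshot (version 0)
p2, v2 = ps.pull()                       # learner 2 snapshot (version 0)
ps.push(learner_gradient(p1, x1, y1))    # learner 1 returns first
ps.push(learner_gradient(p2, x2, y2))    # learner 2's gradient is now stale
print(ps.params, ps.version)             # params ~ [0.2, -0.1], version 2
```

Because learner 2 computed its gradient against version 0 but pushed it against version 1, its update is stale by one iteration, which is exactly the situation the adaptive schedule addresses.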

[0013] When performing distributed SGD, learne...



Abstract

The invention relates to an adaptive learning rate schedule in distributed stochastic gradient descent. A method for performing machine learning includes assigning processing jobs to a plurality of model learners using a central parameter server. Each processing job includes solving a gradient based on the current set of parameters. As results from the processing jobs are returned, the set of parameters is iterated. A degree of staleness of each solved gradient is determined based on the difference between the set of parameters when the job was assigned and the set of parameters when the job was returned. The learning rates used to iterate the parameters based on the solved gradients are proportional to the determined degrees of staleness.
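A minimal sketch of a staleness-aware update following the abstract. The patent text does not specify the exact schedule; this sketch assumes the common convention of damping the step size as staleness grows (lr = base_lr / (1 + staleness)), and all function and variable names are hypothetical:

```python
import numpy as np

def staleness_lr(base_lr, staleness):
    # Assumed schedule: down-weight a gradient by how stale it is.
    # The patent's exact proportionality may differ.
    return base_lr / (1.0 + staleness)

def apply_stale_update(params, grad, base_lr, assigned_version, current_version):
    # Staleness = number of parameter updates between the job being
    # assigned and its gradient being returned to the server.
    staleness = current_version - assigned_version
    return params - staleness_lr(base_lr, staleness) * grad

params = np.array([1.0, 1.0])
grad = np.array([0.5, -0.5])

# Fresh gradient: no updates happened in between, full step (lr = 0.2).
fresh = apply_stale_update(params, grad, base_lr=0.2,
                           assigned_version=7, current_version=7)
# Stale gradient: three updates happened in between, damped step (lr = 0.05).
stale = apply_stale_update(params, grad, base_lr=0.2,
                           assigned_version=4, current_version=7)
print(fresh)  # [0.9, 1.1]
print(stale)  # [0.975, 1.025]
```

Tracking staleness as a difference of model versions mirrors the abstract's "difference between the set of parameters when the job was assigned and when it was returned."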

Description

Technical Field

[0001] The present invention relates to distributed stochastic gradient descent (SGD), and more particularly, to adaptive learning rate scheduling in distributed SGD.

Background

[0002] Stochastic gradient descent (SGD) is a method for minimizing an objective function. SGD can be used in machine learning to iterate an objective function in order to minimize error and thereby increase the correctness of the built model. Traditionally, SGD is performed using a single processor working serially on the training data. However, due to the large amount of training data, waiting for a single processor can be very slow.

Summary of the Invention

[0003] A method for performing machine learning includes assigning a first processing job to a first model learner using a central parameter server. The first processing job includes solving a first gradient based on the parameter set of the first state. A second processing job is assigned to the second model le...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N20/00
CPC: G06N20/00; G06F9/5038; G06F18/285; G06F18/241; G06N7/08; G06F18/21
Inventors: P. Dube, S. Dutta, G. Joshi, P. A. Nagpurkar
Owner: IBM CORP