
Method and apparatus for distributed training of machine learning model

A method and apparatus for distributed training of machine learning models, applied in the field of machine learning, addressing the problems of insufficient information sharing between nodes, the limited amount of information exchanged per iteration, and the slow convergence of distributed training.

Active Publication Date: 2020-10-23
BEIJING DAJIA INTERNET INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

In such schemes, the communication volume of a single node depends on the number of nodes selected for communication, and the amount of information each node shares with the other nodes per iteration is limited: on average, between log2(N) and N/2 iterations are required before the information of all N nodes has been shared. This results in insufficient information sharing between nodes and slows down the overall convergence of distributed training.




Description of Embodiments

[0060] In order to enable those of ordinary skill in the art to better understand the technical solutions of the present disclosure, the technical solutions in the embodiments of the present disclosure are described clearly and completely below in conjunction with the accompanying drawings.

[0061] To this end, the embodiments of the present disclosure provide a method for distributed training of machine learning models that can be applied to each of N nodes in a distributed system architecture, where each node may be implemented by a computing device such as a server or a server cluster. The inventive idea of the method is that, through a preset correspondence between the iteration order and a node distance, a target node at that distance can be determined for each node in each iteration step, so that each node communicates with its target node to obtain its model information (for example, model parameters and/or gradient values) and updates its machine learning model accordingly.
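The excerpt does not spell out how the target node is chosen from the iteration order, but the behavior it describes (one partner per step at a preset node distance, with every node holding all N nodes' information after a logarithmic number of steps) matches a recursive-doubling pattern. Below is a minimal sketch under that assumption; the XOR pairing and the names `target_node` and `simulate` are illustrative, not taken from the patent:

```python
import math

def target_node(rank: int, step: int) -> int:
    """Partner of `rank` at iteration `step`, at node distance 2**step.
    The XOR pairing is an assumption; the patent only states that the
    distance follows a preset correspondence with the iteration order."""
    return rank ^ (1 << step)

def simulate(n_nodes: int):
    """Track which nodes' model information each node has received.
    `n_nodes` is assumed to be a power of two for this sketch."""
    known = [{i} for i in range(n_nodes)]  # each node starts with its own info
    for step in range(int(math.log2(n_nodes))):
        snapshot = [set(k) for k in known]
        for rank in range(n_nodes):
            # Each node merges the information held by its target node.
            known[rank] |= snapshot[target_node(rank, step)]
    return known
```

With N = 8, three iteration steps (log2(8)) suffice for every node to hold the information of all eight nodes, consistent with the logarithmic relationship claimed in the abstract.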



Abstract

The invention relates to a method and apparatus for distributed training of a machine learning model. The method comprises the following steps: in a preset number of iteration steps of distributed training, obtaining the order of the current iteration step, where the preset number and N are in a logarithmic relationship; based on that order, obtaining one node from the N nodes as the target node of the current node in the current iteration step; communicating with the target node to obtain the model information shared by the target node; and updating the machine learning model of the current node according to the model information of the current node and the model information of the target node, so that after the preset number of iteration steps is completed, the machine learning model of each node has synchronously obtained the model information of all N nodes. According to the method and apparatus of the embodiments, the convergence efficiency of distributed training is ensured while the communication traffic of a single node is reduced to S, shortening communication time; in addition, each node shares information only with its respective target node, so load balance among the nodes is ensured.

Description

Technical field

[0001] The present disclosure relates to the technical field of machine learning, and in particular to a method and device for distributed training of machine learning models.

Background technique

[0002] As the data and models used in machine learning grow ever larger, the storage and computing capacity of a single card or single machine can no longer meet the training needs of big data and large models, and distributed machine learning has emerged in response. Distributed machine learning in the related art is as follows:

[0003] Taking the Ring-allreduce architecture as an example, N computing devices form a ring, and each computing device is a worker node. In each iteration, each node divides the model information/gradient to be communicated into N parts; each worker completes training on its own mini-batch of training data, calculates the gradient, passes the gradient to the next worker in the ring, and also receives the gradient f...
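The Ring-allreduce background described above can be sketched as a simulation of its communication schedule: gradients are split into N chunks, a reduce-scatter phase sums one chunk per step around the ring, and an all-gather phase circulates the completed chunks. This is an illustrative sketch of the general pattern, not the patent's implementation:

```python
def ring_allreduce(grads):
    """Simulate ring-allreduce over N nodes.
    `grads[i]` is node i's gradient, assumed here to have exactly N
    elements so each node owns one chunk. Returns each node's final
    chunks; after 2*(N-1) steps every node holds the full sum."""
    n = len(grads)
    chunks = [[g[i] for i in range(n)] for g in grads]

    # Reduce-scatter: in each of N-1 steps, every node sends one chunk
    # to its right neighbor, which adds it to its own copy.
    for step in range(n - 1):
        sends = [((rank - step) % n, chunks[rank][(rank - step) % n])
                 for rank in range(n)]
        for rank in range(n):
            idx, val = sends[(rank - 1) % n]  # receive from left neighbor
            chunks[rank][idx] += val

    # All-gather: in each of N-1 steps, every node forwards the chunk
    # it has just completed, overwriting the receiver's stale copy.
    for step in range(n - 1):
        sends = [((rank + 1 - step) % n, chunks[rank][(rank + 1 - step) % n])
                 for rank in range(n)]
        for rank in range(n):
            idx, val = sends[(rank - 1) % n]
            chunks[rank][idx] = val

    return chunks
```

Note that each node communicates with a fixed neighbor every step, which is the trait the patent contrasts with its logarithmic target-node scheme: the ring needs 2*(N-1) steps, growing linearly with N.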

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04; G06N3/08; G06F16/27
CPC: G06N3/08; G06F16/27; G06N3/045
Inventors: 石红梅, 廉相如
Owner BEIJING DAJIA INTERNET INFORMATION TECH CO LTD