Method and apparatus for distributed training of machine learning models

A machine learning method and apparatus, applied in the field of machine learning, addressing problems such as limited information exchange per iteration, insufficient information sharing between nodes, and a slow convergence process in distributed training.

Active Publication Date: 2021-01-12
BEIJING DAJIA INTERNET INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

At this time, the communication volume of a single node depends on the number of nodes it selects for communication, and the amount of information each node can share with the other nodes in a single iteration is limited: on average, between log₂N and N/2 iterations are required before the information of all N nodes has been shared (for N = 1024 nodes, for example, that is at least 10 and up to 512 iterations). As a result, information sharing between nodes is insufficient, and the overall convergence process of distributed training slows down.




Detailed Description of the Embodiments

[0060] In order to enable those of ordinary skill in the art to better understand the technical solutions of the present disclosure, the technical solutions in the embodiments of the present disclosure are described clearly and completely below in conjunction with the accompanying drawings.

[0061] To this end, the embodiments of the present disclosure provide a method for distributed training of machine learning models, which can be applied to each of N nodes in a distributed system architecture, where each node can be implemented by a computing device such as a server or a server cluster. The inventive idea of the method is that, through a preset correspondence between the order of an iteration step and a node distance, each node can determine, in each iteration step, the target node corresponding to that node distance, so that each node communicates with its target node to obtain the shared model information (for example, model parameters and / or gradient values), and the machine learning mod...
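
As a rough sketch of this idea (not the patent's own code), the listing below assumes that the node distance at iteration step k is 2^k, so that node i takes node (i + 2^k) mod N as its target; the function names and the information-spread check are illustrative only.

    # Illustrative sketch only (not the patent's code): assumes the node distance
    # at iteration step k is 2**k, so node i's target at step k is (i + 2**k) % N.
    import math

    def target_node(rank: int, step: int, num_nodes: int) -> int:
        """Assumed schedule: target of `rank` at iteration `step`."""
        return (rank + 2 ** step) % num_nodes

    def simulate_information_spread(num_nodes: int) -> int:
        """Count the iteration steps until every node has seen every node's info."""
        # known[i] is the set of nodes whose model information node i holds so far.
        known = [{i} for i in range(num_nodes)]
        steps = math.ceil(math.log2(num_nodes))   # preset number of iteration steps
        for k in range(steps):
            snapshot = [set(s) for s in known]    # all nodes communicate "simultaneously"
            for i in range(num_nodes):
                known[i] |= snapshot[target_node(i, k, num_nodes)]
        assert all(len(s) == num_nodes for s in known)
        return steps

    print(simulate_information_spread(16))        # -> 4 steps for N = 16

Under this assumption each node communicates with exactly one peer per step, and after ⌈log₂N⌉ steps every node has, directly or indirectly, received the model information of all N nodes, which matches the logarithmic relationship between the preset number of iteration steps and N described in the Abstract below.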



Abstract

The present disclosure relates to a method and apparatus for distributed training of machine learning models. The method includes: in each of a preset number of iteration steps of the distributed training, obtaining the order of the current iteration step, where the preset number has a logarithmic relationship with N; based on the order, selecting one node from the N nodes as the target node of the current node for the current iteration step; communicating with the target node to obtain the model information shared by the target node; and updating the machine learning model of the current node according to the model information of the current node and the model information of the target node, so that after the preset number of iteration steps the machine learning model of each node has synchronously obtained the model information of all N nodes. In this embodiment, the convergence efficiency of distributed training can be guaranteed; at the same time, the communication volume of a single node can be reduced to S, shortening the communication time; in addition, because each node shares information only with its own target node, load balancing among the nodes can be ensured.
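
One way to picture the per-step update is the sketch below, which assumes, beyond what the abstract states, that the shared model information is the parameter vector itself, that the update is a simple pairwise average with the target node, and that the node distance at step k is again 2^k; under these assumptions, and for N a power of two, ⌈log₂N⌉ steps leave every node holding exactly the average of all N initial models.

    # Illustrative only: pairwise parameter averaging under the assumed schedule
    # target(i, k) = (i + 2**k) % N.  The patent excerpt does not spell out the
    # update rule; exact global averaging below relies on N being a power of two.
    import numpy as np

    def pairwise_average_rounds(params: np.ndarray) -> np.ndarray:
        """params has shape (N, D): one parameter vector per node."""
        num_nodes = params.shape[0]
        steps = int(np.log2(num_nodes))            # assumes N is a power of two
        for k in range(steps):
            targets = (np.arange(num_nodes) + 2 ** k) % num_nodes
            params = 0.5 * (params + params[targets])   # each node averages with its target
        return params

    rng = np.random.default_rng(0)
    initial = rng.normal(size=(8, 3))              # N = 8 nodes, 3 parameters each
    final = pairwise_average_rounds(initial)
    print(np.allclose(final, initial.mean(axis=0)))    # True: every node holds the global mean

Because every node sends to and receives from exactly one peer per step, the per-node communication volume is identical across nodes, which is consistent with the load-balancing point above.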

Description

Technical Field

[0001] The present disclosure relates to the technical field of machine learning, and in particular to a method and apparatus for distributed training of machine learning models.

Background

[0002] As the data and models used in machine learning grow larger and larger, the storage and computing capabilities of a single card or a single machine can no longer meet the training needs of big data and large models, and distributed machine learning has emerged to meet this need. Distributed machine learning in the related art can be illustrated as follows:

[0003] Taking the Ring-allreduce architecture as an example, N computing devices form a ring, and each computing device is a worker node. In one iteration, each node divides the model information / gradient to be communicated into N parts; each worker completes training on its own mini-batch of training data, computes the gradient, passes the gradient to the next worker in the ring, and at the same time receives the gradient f...
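
For comparison, the following is a generic sketch of the Ring-allreduce pattern just described, not code from the patent: each of the N workers splits its gradient into N chunks and, over 2·(N−1) ring steps (a reduce-scatter phase followed by an all-gather phase), ends up with the sum of all workers' gradients while sending roughly 2·(N−1)/N of one gradient's worth of data.

    # Generic illustration of Ring-allreduce (not the patent's code).
    import numpy as np

    def ring_allreduce(grads: list[np.ndarray]) -> list[np.ndarray]:
        n = len(grads)
        # chunks[i][c] is worker i's local copy of chunk c of the gradient.
        chunks = [list(np.array_split(g.astype(float), n)) for g in grads]

        # Phase 1, reduce-scatter: at step t, worker i receives chunk (i - t - 1) % n
        # from worker i - 1 and adds it to its own copy; after n - 1 steps worker i
        # holds the fully summed chunk (i + 1) % n.
        for t in range(n - 1):
            for i in range(n):
                c = (i - t - 1) % n
                chunks[i][c] = chunks[i][c] + chunks[(i - 1) % n][c]

        # Phase 2, all-gather: the fully summed chunks travel once more around the
        # ring; at step t, worker i overwrites its chunk (i - t) % n with the copy
        # received from worker i - 1.
        for t in range(n - 1):
            for i in range(n):
                c = (i - t) % n
                chunks[i][c] = chunks[(i - 1) % n][c].copy()

        return [np.concatenate(c) for c in chunks]

    grads = [np.full(8, float(i)) for i in range(4)]      # 4 workers, toy gradients
    out = ring_allreduce(grads)
    print(all(np.allclose(o, sum(grads)) for o in out))   # True: every worker has the sum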

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/04, G06N3/08, G06F16/27
CPC: G06N3/08, G06F16/27, G06N3/045
Inventors: 石红梅, 廉相如
Owner: BEIJING DAJIA INTERNET INFORMATION TECH CO LTD