
Method and device for neural network machine learning model training

A machine learning model training method and technology, applied in the field of computing, which addresses the problems that computing resources are wasted, that the computing performance of a single GPU cannot be fully utilized, and that the convergence accuracy of the model is reduced, so as to achieve the effect of shortening the training cycle.

Pending Publication Date: 2019-05-14
ALIBABA GRP HLDG LTD

AI Technical Summary

Problems solved by technology

In the worst case, multi-GPU scaling is not feasible at all, because the computation-to-communication ratio is too small to begin with.
On the other hand, if the mini-batch size is doubled in order to improve the computation-to-communication ratio, raising it above the optimal empirical value greatly reduces the model's convergence accuracy.
Furthermore, as the degree of parallelism increases, the computing performance of each individual GPU cannot be fully utilized, which also causes waste.
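
The trade-off described above can be illustrated with a rough back-of-the-envelope sketch. Every constant and the cost model below are hypothetical assumptions chosen only to show the shape of the trade-off; they are not figures from the patent.

# Rough, illustrative model of the compute-vs-communication trade-off in
# data-parallel training. All constants are assumed values for illustration.

def compute_to_comm_ratio(mini_batch_size,
                          flops_per_sample=2e9,   # assumed cost of one forward+backward pass (FLOPs)
                          gpu_flops=1e13,         # assumed sustained throughput of one GPU (FLOP/s)
                          gradient_bytes=5e8,     # assumed size of gradients exchanged per step (bytes)
                          link_bandwidth=1e10):   # assumed interconnect bandwidth (bytes/s)
    """Ratio of per-step compute time to per-step communication time on one worker."""
    compute_time = mini_batch_size * flops_per_sample / gpu_flops
    # Each worker sends and receives a full copy of the gradients every step,
    # so communication time is roughly independent of the mini-batch size.
    comm_time = 2 * gradient_bytes / link_bandwidth
    return compute_time / comm_time

if __name__ == "__main__":
    # Doubling the mini-batch size doubles the ratio, which is why enlarging the
    # batch looks attractive; beyond the empirically optimal batch size, however,
    # convergence accuracy degrades, as noted above.
    for batch in (256, 512, 1024, 2048):
        print(f"batch={batch:5d}  compute/comm ~= {compute_to_comm_ratio(batch):.2f}")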




Embodiment Construction

[0042] In order to make the purpose, technical solution, and advantages of the present application clearer, the embodiments of the application are described in detail below in conjunction with the accompanying drawings. It should be noted that, where no conflict arises, the embodiments of the present application and the features within those embodiments may be combined with each other arbitrarily.

[0043] In a typical configuration of the present application, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.

[0044] Memory may include non-permanent storage in computer-readable media, in the form of random access memory (RAM) and/or non-volatile memory such as read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.

[0045] Computer-readable media, including both permanent and non-permanent, removable and non-removable media, can be implemented by any method or technology for storage of info...



Abstract

The invention discloses a method and a device for training a neural network machine learning model. The method is applied to a distributed computing framework comprising a plurality of computing nodes. Training data is segmented in advance into training data slices, with the number of slices equal to the number of computing nodes participating in the computation. Each computing node obtains a training data slice and trains its local model parameters; the computing node transmits the trained local model parameters to a parameter server; and the computing node updates its local model parameters according to the global model parameters returned by the parameter server and continues training them. With this method, the computation speed-up ratio across multiple nodes can almost reach the ideal linear value, and the model training period is greatly shortened.
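
A minimal sketch of the workflow described in the abstract is given below, assuming a parameter server that simply averages the local parameters pushed by the nodes. All class and function names, the toy linear model, and the averaging rule are illustrative assumptions, not the patent's actual implementation.

# Sketch of parameter-server style data-parallel training: the data is split
# into as many shards as there are compute nodes, each node trains locally,
# pushes its local parameters, and then continues from the returned global
# parameters. The aggregation rule (plain averaging) is an assumption.
import numpy as np

class ParameterServer:
    def __init__(self, init_params):
        self.global_params = np.array(init_params, dtype=float)

    def aggregate(self, local_params_list):
        # Combine the local parameters pushed by all compute nodes and
        # return the new global parameters.
        self.global_params = np.mean(local_params_list, axis=0)
        return self.global_params

def split_training_data(data, num_nodes):
    # Slice the training data into as many shards as there are compute nodes.
    return np.array_split(data, num_nodes)

def local_train(params, shard, lr=0.1):
    # Placeholder for one round of local training; a real node would run
    # forward/backward passes of the neural network over its shard. Here a
    # single linear-regression gradient step keeps the sketch runnable.
    x, y = shard[:, :-1], shard[:, -1]
    grad = 2 * x.T @ (x @ params - y) / len(shard)
    return params - lr * grad

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    features = rng.normal(size=(1000, 4))
    true_w = rng.normal(size=4)
    targets = features @ true_w + 0.01 * rng.normal(size=1000)
    data = np.column_stack([features, targets])

    num_nodes = 4
    shards = split_training_data(data, num_nodes)
    server = ParameterServer(np.zeros(4))
    local = [server.global_params.copy() for _ in range(num_nodes)]

    for step in range(50):
        # Each node trains on its own shard, then pushes its local parameters.
        local = [local_train(p, s) for p, s in zip(local, shards)]
        global_params = server.aggregate(local)
        # Nodes replace their local parameters with the returned global ones
        # and continue training in the next round.
        local = [global_params.copy() for _ in range(num_nodes)]

    print("distance to true weights:", np.linalg.norm(server.global_params - true_w))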

Description

Technical field
[0001] This application relates to, but is not limited to, computer technology, and in particular to a training method and device for a neural network machine learning model.
Background technique
[0002] Since the Neural Machine Translation (NMT) model was proposed, it has developed continuously in recent years owing to the obvious improvement in translation quality. At present, for some languages and scenarios, the translation quality can even reach the level of human translation.
[0003] However, because the NMT model has a complex structure, and the training process of a deep neural network model itself generally involves a large amount of computation, an NMT system often requires a long training period. For example, with 30 million training samples, training on a single processor such as a graphics processing unit (GPU) card takes more than 20 days to obtain a preliminarily usable model.
[0004] The existing neur...

Claims


Application Information

IPC(8): G06N3/04, G06N3/08, G06F17/28
Inventor: 孟晨, 王思宇, 宋楷, 杨军, 骆卫华
Owner ALIBABA GRP HLDG LTD