
Deep neural network model parallel mode selection method

A neural network model parallel mode selection technology, applied in the field of deep learning, which solves problems such as long training times and large numbers of network model parameters, achieving high parallel performance and automatic selection of the parallel mode

Active Publication Date: 2021-03-23
JIANGNAN INST OF COMPUTING TECH

AI Technical Summary

Problems solved by technology

Although deep networks greatly improve accuracy, they also make the number of network model parameters grow and training times become longer and longer, which has become a major obstacle to the rapid development and wide application of deep learning technology.


Examples


Embodiment

[0019] Embodiment: a deep neural network model parallel mode selection method. The input parameters of an artificial intelligence training task comprise a neural network model file, the number of computing nodes and the single training sample data size, and the neural network model file comprises the batch_size, the number of model parameters and the data types;

[0020] The parallel mode selection method includes the following steps:

[0021] S1. The distributed extension component in the artificial intelligence framework calculates the parameter data volume of the entire neural network model from the parameter quantity and data type of the neural network model, and calculates the data volume of the input data from the single training sample data size in the input parameters and the batch_size in the neural network model file; the sum of the parameter data volume and the input data volume is the total data volume of the neural network model;

[00...
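
The step S1 calculation can be illustrated with a short sketch (a minimal, hypothetical example, not the patented implementation; the names param_count and sample_bytes and the per-data-type byte sizes are assumptions):

DTYPE_BYTES = {"float16": 2, "float32": 4, "float64": 8}  # illustrative sizes

def total_data_volume(param_count, dtype, batch_size, sample_bytes):
    # Parameter data volume: number of model parameters x bytes per data type.
    param_bytes = param_count * DTYPE_BYTES[dtype]
    # Input data volume: batch_size x single training sample data size.
    input_bytes = batch_size * sample_bytes
    # Total data volume of the neural network model (step S1).
    return param_bytes + input_bytes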


Abstract

The invention discloses a deep neural network model parallel mode selection method. The method comprises the following steps: S1, calculating the total data volume of the whole neural network model; S2, judging whether the total data volume of the neural network model obtained in step S1 exceeds the total available memory of the single computing node to be trained; if not, executing step S3, and if so, executing step S4; S3, selecting a data parallel mode; S4, segmenting the network layers of the neural network model, obtaining from the segmentation result the number of computing nodes over which the neural network model needs to be distributed, executing S5 if the number of computing nodes in the input parameters is less than twice the number of nodes needed by the model segmentation, and otherwise executing S6; S5, selecting a model parallel mode; S6, selecting a hybrid parallel mode that includes both data parallelism and model parallelism. Through acquisition and analysis of information on the model parameters, hyper-parameters and data volume, the invention realizes automatic selection of the distributed extension parallel mode and ensures high parallel performance.
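
The decision procedure in steps S2-S6 can be summarised as a small sketch (illustrative only; node_memory_bytes, required_nodes and available_nodes are assumed names, not terms from the claims):

def select_parallel_mode(total_bytes, node_memory_bytes,
                         available_nodes, required_nodes):
    # S2/S3: if the total data volume fits in a single node's available
    # memory, plain data parallelism is selected.
    if total_bytes <= node_memory_bytes:
        return "data parallel"
    # S4: the network layers have been segmented; required_nodes is the
    # number of computing nodes that segmentation needs.
    # S5: fewer than twice the required nodes are available -> model parallel.
    if available_nodes < 2 * required_nodes:
        return "model parallel"
    # S6: otherwise, hybrid parallelism (data plus model parallel) is selected.
    return "hybrid parallel"

For example, under this sketch a model whose total data volume is 6 GB, on nodes with 4 GB of available memory, segmented over 2 nodes, with 8 nodes in the input parameters, would be assigned the hybrid parallel mode.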

Description

Technical field

[0001] The invention relates to a method for selecting a parallel mode of a deep neural network model, and belongs to the technical field of deep learning.

Background technique

[0002] Distributed training in the data parallel mode stores a copy of the model on each computing node, and each computing node processes a different part of the data set. The data parallel training method needs to combine the results of each working node and synchronize the model parameters between nodes. Distributed training in the model parallel mode assigns different network layers of the neural network model to different computing nodes, or assigns different parameters within the same layer to different computing nodes, so that different computing nodes are responsible for training different parts of the network model. The hybrid parallel mode applies both model parallelism and data parallelism within a batch of computing nodes for distributed training....
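
As a toy illustration (not taken from the patent text) of how the three modes described above distribute a model, assuming hypothetical layer and node names:

layers = ["conv1", "conv2", "fc1", "fc2"]

# Data parallel: every node holds the full model and trains on a different
# shard of the data; parameters are synchronized between nodes afterwards.
data_parallel = {"node0": list(layers), "node1": list(layers)}

# Model parallel: each node holds a different slice of the network layers
# and is responsible only for training that part of the model.
model_parallel = {"node0": layers[:2], "node1": layers[2:]}

# Hybrid parallel: groups of nodes each hold a model-parallel slice, and the
# groups together form data-parallel replicas of the whole model.
hybrid_parallel = {"replica0": dict(model_parallel), "replica1": dict(model_parallel)}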

Claims


Application Information

IPC (8): G06N3/08
CPC: G06N3/08
Inventors: 刘鑫, 刘沙, 彭超, 朱传家, 陈德训, 黄则强, 陆旭峰, 裴阳
Owner: JIANGNAN INST OF COMPUTING TECH