
Smart campus-oriented distributed machine learning model parameter aggregation method

A machine learning model and smart campus technology, applied in machine learning, computing models, instruments, etc.; it addresses the problem of distributed machine learning training falling into local optima, and achieves the effects of maximizing resource utilization efficiency, reducing communication volume, and improving training accuracy.

Pending Publication Date: 2020-03-27
HANGZHOU DIANZI UNIV +1

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to overcome the problem that distributed machine learning training falls into a local optimum due to non-convexity, and to provide a smart campus-oriented distributed machine learning model parameter aggregation method.



Examples


Detailed Description of the Embodiments

[0023] The specific embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings. The specific steps are shown in figure 1, where:

[0024] Step 1: Clean and transform the data generated by the daily behavior of teachers, students, and other staff, and store it in a memory-mapped database for training.
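As an illustration of step 1, the following is a minimal sketch assuming LMDB is used as the memory-mapped database and that behavior records are serialized as JSON key/value pairs; the field names and cleaning rules are placeholders, not taken from the patent.

```python
import json
import lmdb  # memory-mapped key/value store commonly used for training data


def clean_record(raw):
    """Illustrative cleaning/transformation: drop incomplete rows and
    normalize fields (the real rules are campus-specific)."""
    if not raw.get("user_id") or not raw.get("action"):
        return None
    return {
        "user_id": str(raw["user_id"]).strip(),
        "action": str(raw["action"]).lower(),
        "timestamp": int(raw.get("timestamp", 0)),
    }


def store_records(raw_records, db_path="campus_behavior_lmdb"):
    """Write cleaned behavior records into a memory-mapped LMDB database."""
    env = lmdb.open(db_path, map_size=1 << 30)  # 1 GiB address-space mapping
    count = 0
    with env.begin(write=True) as txn:
        for raw in raw_records:
            rec = clean_record(raw)
            if rec is None:
                continue  # skip records that fail cleaning
            txn.put(f"{count:08d}".encode(), json.dumps(rec).encode())
            count += 1
    env.close()
    return count
```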

[0025] Step 2: The main process reads the configuration file, which includes the training parameters and the model network. The training parameters mainly include the initial learning rate, the learning-rate adjustment method, the momentum value, the maximum number of iterations, etc.; the model network is a model described layer by layer in a prototxt-format network file. Each computing process randomly draws its local training data from the full training set by sampling without replacement, so that in the end every process holds the same number of distinct samples. The training data consists of formatted, labeled images.
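A minimal sketch of the without-replacement split described in step 2, assuming every computing process applies the same seeded permutation of sample indices and takes the shard matching its rank; the seed and shard logic are illustrative only.

```python
import numpy as np


def split_without_replacement(num_samples, num_procs, seed=0):
    """Partition sample indices into equal-sized, disjoint shards.

    All processes apply the same seeded permutation, so every index is
    drawn at most once (sampling without replacement) and each process
    ends up with the same number of distinct samples.
    """
    rng = np.random.default_rng(seed)
    perm = rng.permutation(num_samples)
    shard_size = num_samples // num_procs  # drop the remainder so shards stay equal
    return [perm[r * shard_size:(r + 1) * shard_size] for r in range(num_procs)]


# Example: 10 labeled images split across 2 computing processes.
shards = split_without_replacement(10, 2, seed=42)
assert len(set(shards[0]) & set(shards[1])) == 0  # no sample appears in two shards
```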

[0026] S...



Abstract

The invention discloses a smart campus-oriented distributed machine learning model parameter aggregation method, and aims to solve the problem that model training falls into a local optimal solution under a data-parallel strategy. Starting from the model aggregation method of a distributed machine learning algorithm, the proportion of each computing process's local model in the parameter server's aggregation of the local model parameters is determined by the loss function value of each computing process, so that the training precision is improved; training data are obtained in each computing process by drawing samples directly without replacement, so that the communication overhead is reduced. When the method is applied to synchronization models such as the bulk synchronous parallel model and the stale synchronous parallel model, the training precision can be effectively improved without affecting the training speed, and the service recommendation accuracy can be effectively improved when the training result is applied to the smart campus.
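The abstract's central idea is that the parameter server weights each computing process's local model by its loss function value. The sketch below assumes an inverse-loss weighting normalized to sum to one; the exact loss-to-weight mapping is not spelled out here, so this mapping is only an illustration.

```python
import numpy as np


def aggregate_by_loss(local_params, local_losses, eps=1e-12):
    """Aggregate per-process model parameters on the parameter server.

    Each process contributes its flattened parameter vector and its loss;
    weights are derived from the losses (assumed here: inverse loss,
    normalized to sum to 1) so lower-loss local models count for more.
    """
    losses = np.asarray(local_losses, dtype=np.float64)
    weights = 1.0 / (losses + eps)   # assumed mapping: lower loss -> larger share
    weights /= weights.sum()         # normalize the shares to sum to 1
    stacked = np.stack([np.asarray(p, dtype=np.float64) for p in local_params])
    return (weights[:, None] * stacked).sum(axis=0)


# Example: three computing processes with different local losses.
params = [np.array([1.0, 2.0]), np.array([1.2, 1.8]), np.array([0.9, 2.1])]
losses = [0.50, 0.25, 1.00]          # the second process gets the largest share
global_params = aggregate_by_loss(params, losses)
```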

Description

Technical field

[0001] The present invention relates to a distributed machine learning model parameter aggregation method for smart campuses; more specifically, it relates to a smart campus-oriented distributed machine learning model parameter aggregation method that addresses the problem of models falling into local optimal solutions.

Background technique

[0002] With the development of the big data era, traditional machine learning is increasingly unable to cope with massive data. In this context, distributed machine learning came into being. Compared with traditional machine learning training on a single machine, distributed machine learning can make full use of the resources of a high-performance computing cluster. Existing distributed machine learning systems generally adopt the parameter server idea, that is, a parameter server and several computing nodes are set up for training. The parameter server is responsible for collecting and merging the training...
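For orientation, the following is a schematic of one synchronous round under the parameter-server idea described above; the worker API, the single SGD step, and the plain averaging are placeholders (the invention replaces the averaging with the loss-weighted aggregation sketched under the Abstract).

```python
import numpy as np


def bsp_round(global_params, workers, lr=0.01):
    """One bulk-synchronous round of parameter-server training (schematic).

    Every computing node takes the current global parameters, trains on its
    local data shard, and reports its updated parameters and loss; the
    parameter server then merges the reports into new global parameters.
    """
    reports = []
    for worker in workers:
        local = global_params.copy()
        grad, loss = worker.compute_gradient_and_loss(local)  # placeholder worker API
        local -= lr * grad                                     # one local SGD step
        reports.append((local, loss))

    # Plain averaging shown here; the invention instead weights each report
    # by its loss value when merging.
    return np.mean([p for p, _ in reports], axis=0)
```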


Application Information

IPC (8): G06N 20/00; G06Q 50/20
CPC: G06N 20/00; G06Q 50/20
Inventor: 张纪林, 范禹辰, 万健, 周丽, 任永坚, 张俊聪, 魏振国
Owner: HANGZHOU DIANZI UNIV