
Parameter adjusting method and device and storage medium

A technology for adjusting parameters and hyperparameters, applied in the field of neural networks. It addresses the problems of existing automatic tuning methods: sharply rising time complexity, a decision speed that needs to be improved, and missed optimal hyperparameter combinations.

Publication date: 2018-11-06 (status: Inactive)
ZTE CORP

AI Technical Summary

Problems solved by technology

Among them, the time complexity of the grid search algorithm increases exponentially with the number of hyperparameters, so it is only applicable to small-scale neural networks with few hyperparameters; the random search algorithm requires many rounds of iterative sampling to determine the optimal hyperparameters, and when the number of samples is insufficient the optimal hyperparameter combination is missed. Therefore, the speed at which current common automatic parameter tuning methods decide the optimal hyperparameters needs to be improved.
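To make the complexity contrast concrete, a minimal illustrative sketch (not part of the patent; the search-space sizes are invented): grid search must train one model per point of the full Cartesian product of candidate values, so its cost grows as m^k for k hyperparameters with m candidates each, while random search spends a fixed sampling budget and can miss the best combination when that budget is too small.

```python
import itertools
import random

# Hypothetical search space: 4 hyperparameters with 10 candidate values each.
space = {f"hp{i}": list(range(10)) for i in range(4)}

# Grid search enumerates every combination: 10**4 = 10000 trainings.
grid_trials = list(itertools.product(*space.values()))
print(len(grid_trials))  # 10000 -- grows as m**k with k hyperparameters

# Random search trains only a fixed budget of sampled combinations;
# with too small a budget the optimal combination can simply be missed.
budget = 50
random_trials = [tuple(random.choice(v) for v in space.values()) for _ in range(budget)]
print(len(random_trials))  # 50
```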



Examples


Embodiment 1

[0022] The embodiment of the present invention provides a parameter adjustment method. As shown in Figure 1, the method includes:

[0023] S101. Perform iterative training of the convolutional neural network model on the value of the hyperparameter according to the preset number n of iterative trainings;

[0024] S102. Determine, in each iterative training, the hyperparameter value corresponding to the value taken by the hyperparameter;

[0025] S103. Take the value corresponding to the largest hyperparameter value among the hyperparameter values obtained from the n iterative trainings as the optimal value of the hyperparameter.

[0026] In detail, there are many hyperparameters in a convolutional neural network, but the network is not sensitive to some of them: whether those hyperparameters take their optimal values has little effect on the performance of the model. Therefore, the hyperparameters in the embodiment of the present invention are generally...
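A minimal sketch of how steps S101 to S103 might be realised, assuming the "hyperparameter value" determined in S102 is an evaluation score (for example, validation accuracy) of the model trained with that candidate value; the helper names and the toy scoring function are hypothetical and not taken from the patent.

```python
import random

def tune_hyperparameter(candidates, n, train_and_evaluate):
    """Illustrative version of S101-S103: run n iterative trainings with
    sampled candidate values and return the best-scoring value."""
    best_value, best_score = None, float("-inf")
    for _ in range(n):                              # S101: n iterative trainings
        value = random.choice(candidates)           # sample a candidate hyperparameter value
        score = train_and_evaluate(value)           # S102: score obtained with this value
        if score > best_score:                      # S103: keep the value with the largest score
            best_value, best_score = value, score
    return best_value

# Toy stand-in for "train the CNN and evaluate it"; real training would go here.
best_lr = tune_hyperparameter(
    candidates=[1e-1, 1e-2, 1e-3, 1e-4],
    n=10,
    train_and_evaluate=lambda lr: -abs(lr - 1e-3),  # pretend 1e-3 is the best value
)
print(best_lr)
```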

Embodiment 2

[0064] The embodiment of the present invention provides an optional parameter adjustment method. As shown in Figure 2, the method in this embodiment includes the steps of Embodiment 1 and realizes a fast search for the optimal hyperparameters of the convolutional neural network model. The main process includes initialization settings, updating of hyperparameter values, hyperparameter sampling, model training and model evaluation, and determination of the optimal hyperparameters. In detail, as shown in Figure 3, the method in this embodiment includes:

[0065] S301. Initialization settings: configure the hyperparameters that need to be tuned automatically and the value range of each hyperparameter, forming the hyperparameter value space, and set the number of sampling times for each hyperparameter.

[0066] For example, set the hyperparameters to be tuned automatically, their value ranges, and the number of samples for each hyperparameter. There ...
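One possible shape of the initialization settings in S301, purely as an illustration: the hyperparameter names, value ranges and sample counts below are invented, and the value space is formed here by expanding each range into the requested number of evenly spaced candidates.

```python
# Hypothetical initialization settings for S301: the hyperparameters to tune,
# the value range of each one, and how many times each is sampled.
search_space = {
    "learning_rate": {"range": (1e-5, 1e-1), "samples": 8},
    "weight_decay":  {"range": (1e-6, 1e-2), "samples": 5},
    "dropout_ratio": {"range": (0.1, 0.7),   "samples": 5},
}

def initialize_settings(space):
    """Expand every (range, samples) pair into concrete candidate values,
    forming the hyperparameter value space described in S301."""
    value_space = {}
    for name, cfg in space.items():
        low, high = cfg["range"]
        k = cfg["samples"]
        step = (high - low) / (k - 1)
        value_space[name] = [low + i * step for i in range(k)]
    return value_space

print(initialize_settings(search_space))
```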

Embodiment 3

[0093] An embodiment of the present invention provides a parameter adjustment device for a convolutional neural network model. As shown in Figure 4, the device includes a memory 10 and a processor 12; the memory 10 stores a computer program, and the processor 12 executes the computer program to implement the steps of the method described in any one of Embodiments 1 to 2.

[0094] For example, the processor 12 executes the computer program to implement the following steps:

[0095] Perform iterative training of the convolutional neural network model on the value of the hyperparameter according to the preset number n of iterative trainings;

[0096] Determine, in each iterative training, the hyperparameter value corresponding to the value taken by the hyperparameter;

[0097] Take the value corresponding to the largest hyperparameter value among the hyperparameter values obtained from the n iterative trainings as the optimal value of the hyperparameter.

[0098] In the em...
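A rough, illustrative analogue of the device in Figure 4 (not the patent's implementation): memory 10 is modelled as the stored computer program, and processor 12 as the code path that executes it to return the optimal hyperparameter value; the stand-in program below is hypothetical.

```python
class ParameterAdjustingDevice:
    """Toy analogue of the device in Figure 4: memory 10 stores a computer
    program, and processor 12 executes it to carry out the tuning steps."""

    def __init__(self, program):
        self.memory = program                     # memory 10: the stored computer program

    def execute(self, *args, **kwargs):
        # processor 12: run the stored program (e.g. the tuning loop sketched
        # under Embodiment 1) and return the optimal hyperparameter value.
        return self.memory(*args, **kwargs)

# Usage with a stand-in program; a real device would store the full tuning routine.
device = ParameterAdjustingDevice(
    lambda candidates: max(candidates, key=lambda v: -abs(v - 1e-3))
)
print(device.execute([1e-1, 1e-2, 1e-3, 1e-4]))   # -> 0.001
```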



Abstract

The invention discloses a parameter adjusting method and device and a storage medium. The method comprises the following steps: performing iterative training of a convolutional neural network model on a value of a hyperparameter; determining the hyperparameter value of the value taken by the hyperparameter in each iterative training; and taking the value corresponding to the maximum hyperparameter value among the hyperparameter values obtained from n iterative trainings as the optimal value of the hyperparameter. The method effectively improves the speed of deciding the optimal hyperparameter of the convolutional neural network model.

Description

Technical field

[0001] The present invention relates to the technical field of neural networks, and in particular to a parameter adjustment method, device and storage medium.

Background technique

[0002] Deep learning is derived from neural networks, and its core is feature learning: low-level features are combined to form more abstract high-level features so as to discover the distributed characteristics of data. The convolutional neural network model is a multi-layer neural network with two groups of parameters. One group is the basic parameters, such as the weights and biases of the convolutional or fully connected layers; the other group is the hyperparameters, such as the learning rate, weight decay coefficient and dropout ratio used during network training, which need to be set before model training.

[0003] The training process of the convolutional neural network model is the process of automatically adjusting the basic parameters ...
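To make the distinction between the two groups of parameters concrete, a small sketch assuming a PyTorch-style model (the library choice and layer sizes are assumptions, not part of the patent): the basic parameters are the layer weights and biases that the optimizer updates automatically during training, while the learning rate, weight decay coefficient and dropout ratio are hyperparameters that must be fixed before training starts.

```python
import torch.nn as nn
import torch.optim as optim

# Hyperparameters: chosen before training starts (what this patent tunes).
learning_rate = 0.01
weight_decay = 5e-4
dropout_ratio = 0.5

# A small convolutional network; its weights and biases are the "basic parameters".
# The input size (3 x 32 x 32) is an assumption for this illustration.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Dropout(p=dropout_ratio),          # dropout ratio is a hyperparameter
    nn.Linear(16 * 32 * 32, 10),
)

# Basic parameters (conv/linear weights and biases) are updated automatically
# by the optimizer during training; the hyperparameters above are not.
optimizer = optim.SGD(model.parameters(), lr=learning_rate, weight_decay=weight_decay)
```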

Claims


Application Information

IPC(8): G06N3/02
CPC: G06N3/02
Inventors: 徐茜, 屠要峰, 高洪, 陈小强, 李忠良
Owner: ZTE CORP