Deep learning hyper-parameter tuning improvement method based on Bayesian optimization

A technology of deep learning and hyperparameters, applied to instruments, character and pattern recognition, computer components, etc. It addresses problems such as high computing cost, large fluctuations in optimization results, and long run times that are unfavorable for engineering applications, with the effect of speeding up optimization and significantly improving optimization efficiency.

Inactive Publication Date: 2018-09-25
中科弘云科技(北京)有限公司


Problems solved by technology

However, the random experimental design method or artificially set initial points cannot guarantee a reasonable distribution of the initial points, which strongly affects the surrogate model of the objective function. The orthogonal sampling method can ensure a reasonable sampling distribution, but increases the computation cost. Compared with the random sampling method, the traditional Latin hypercube experimental design method has a more reasonable data distribution...



Examples


Example Embodiment

[0043] Example one

[0044] Figure 4 shows a deep learning hyper-parameter tuning improvement method based on Bayesian optimization. The overall steps are as follows:

[0045] Step 1. Use the fast optimal Latin square experimental design algorithm to generate the initial point set X = {x_1, ..., x_t}.

[0046] The specific steps of the fast optimal Latin square experimental design algorithm are as follows:

[0047] a. Given n_p, n_v, n_s, and the optimization interval of each variable. In this embodiment the number of sampling points is n_p = 16, the variable dimension is n_v = 2, n_s = 1, and the optimization interval of each variable is [0, 1].

[0048] b. Calculate the theoretical interval division number n_d = (n_p / n_s)^(1/n_v).

[0049] c. Calculate the intermediate quantity n_b = ⌈n_d⌉, i.e. n_d rounded up to an integer.

[0050] d. Calculate the actual number of initial points generated, n_p' = n_s · n_b^(n_v).

[0051] e. Use the translational propagation algorithm to fill the Latin hypercube experimental design space.

[0052] Further, the specific method of filling the Latin hypercube experimental...
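Steps a–e above can be sketched in Python. The patent text elides the exact formulas and the details of the space-filling step, so the expressions for n_d, n_b, and n_p', the translation shifts, and the shrink-to-n_p rule below are assumptions based on the published translational-propagation Latin hypercube design (TPLHD) algorithm, restricted to a single-point seed (n_s = 1); they do reproduce the embodiment's numbers (n_p = 16, n_v = 2 gives n_d = n_b = 4 and exactly 16 points).

```python
import numpy as np

def tplhd(n_p, n_v, n_s=1):
    """Sketch of the fast optimal Latin hypercube design via translational
    propagation. Assumes a single-point seed (n_s = 1). Returns n_p points
    in the unit hypercube [0, 1]^n_v, one per interval per dimension."""
    # Step b: theoretical interval division number.
    n_d = (n_p / n_s) ** (1.0 / n_v)
    # Step c: intermediate quantity (divisions rounded up to an integer).
    n_b = int(np.ceil(n_d))
    # Step d: actual number of initial points the propagation generates.
    n_actual = n_s * n_b ** n_v

    # Step e: translational propagation. Each point is indexed by a block
    # vector (k_0, ..., k_{n_v-1}); propagating along dimension d shifts
    # dimension j by k_d * n_b**((d - j) % n_v), which makes every
    # dimension's levels a permutation of 0..n_actual-1 (Latin property).
    grids = np.meshgrid(*[np.arange(n_b)] * n_v, indexing="ij")
    ks = np.stack([g.ravel() for g in grids], axis=1)   # (n_actual, n_v)
    levels = np.zeros((n_actual, n_v), dtype=int)
    for j in range(n_v):
        for d in range(n_v):
            levels[:, j] += ks[:, d] * n_b ** ((d - j) % n_v)

    # If propagation produced more points than requested, keep the n_p
    # points closest to the centre and re-rank each dimension's levels to
    # 0..n_p-1 (re-ranking distinct levels preserves the Latin property).
    if n_actual > n_p:
        centre = (n_actual - 1) / 2.0
        keep = np.argsort(np.linalg.norm(levels - centre, axis=1))[:n_p]
        levels = np.argsort(np.argsort(levels[keep], axis=0), axis=0)

    # Map integer levels to interval centres in [0, 1].
    return (levels + 0.5) / n_p

X = tplhd(16, 2)   # the embodiment's case: 16 points in 2 dimensions
```

Each column of `X` then hits every one of the 16 intervals of [0, 1] exactly once, which is what makes the initial point set space-filling for the surrogate model.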



Abstract

The invention discloses a deep learning hyper-parameter tuning improvement method based on Bayesian optimization. The method comprises the following steps: a fast optimal Latin square experimental design algorithm is used to generate an initial point set, improving the generation rate of effective evaluation points; parallel computing is used to acquire the responses of the initial point set, calculating the responses of multiple points on the objective function at the same time to construct a data set; and a Bayesian optimization iterative process is initiated, in which parallel computing is again used to accelerate the whole optimization process. The invention effectively overcomes the long run time and large performance fluctuation of the traditional Bayesian optimization algorithm; parallel computing speeds up the optimization, and optimization efficiency is significantly improved.
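The abstract's pipeline (initial design → parallel response evaluation → Bayesian-optimization iterations) can be sketched as follows. This is a hedged illustration only: `objective` is a hypothetical stand-in for one deep-learning training run returning a validation loss, the zero-mean Gaussian-process surrogate and expected-improvement criterion are the standard Bayesian-optimization choices (the patent does not fix the surrogate here), and the initial set is random rather than the patent's Latin-square design.

```python
import numpy as np
from math import erf, sqrt, pi
from concurrent.futures import ThreadPoolExecutor

def objective(x):
    # Hypothetical stand-in for a training run that returns a validation
    # loss for the hyper-parameter vector x (here: a simple bowl).
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2

def evaluate_parallel(points, workers=4):
    # Calculate the responses of multiple points at the same time.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return np.array(list(pool.map(objective, points)))

def rbf(A, B, ls=0.25):
    # Squared-exponential kernel between two point sets.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(X, y, Xc, noise=1e-4):
    # GP posterior mean/std at candidates Xc, given observations (X, y).
    ym = y.mean()
    L = np.linalg.cholesky(rbf(X, X) + noise * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y - ym))
    Ks = rbf(X, Xc)
    mu = Ks.T @ alpha + ym
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - (v ** 2).sum(axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, y_best):
    # EI for minimisation, built from the standard normal cdf/pdf.
    z = (y_best - mu) / sigma
    cdf = 0.5 * (1.0 + np.array([erf(t / sqrt(2)) for t in z]))
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)
    return (y_best - mu) * cdf + sigma * pdf

rng = np.random.default_rng(0)
X = rng.random((16, 2))        # stand-in for the designed initial point set
y = evaluate_parallel(X)       # responses of the initial set, in parallel

for _ in range(5):             # Bayesian-optimization iterations
    cand = rng.random((200, 2))
    mu, sigma = gp_posterior(X, y, cand)
    # Pick a batch of 4 candidates by EI and evaluate them in parallel.
    batch = cand[np.argsort(-expected_improvement(mu, sigma, y.min()))[:4]]
    X = np.vstack([X, batch])
    y = np.concatenate([y, evaluate_parallel(batch)])
```

Evaluating the initial set and each per-iteration batch concurrently is where the claimed speed-up comes from: the expensive training runs overlap instead of executing one after another.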

Description

technical field

[0001] The present invention relates to an optimization and improvement method, in particular to a deep learning hyperparameter optimization and improvement method based on Bayesian optimization.

Background technique

[0002] Deep learning is a machine learning method based on representation learning of data, chiefly building models and learning from data. Hyperparameters are model parameters whose values are set before the learning process starts, rather than parameters obtained through training. Selecting hyperparameters in deep learning is difficult and irregular, different hyperparameters interact in unpredictable ways, debugging them is very time-consuming, and evaluating each hyperparameter combination requires a large number of iterative calculations. For such problems, some classic optimization algorithms such as the particle swarm optimization algorithm, the simulated annealing algorithm...

Claims


Application Information

IPC(8): G06K9/62
CPC: G06F18/214; G06F18/24
Inventor: 曹连雨
Owner: 中科弘云科技(北京)有限公司