The invention discloses a deep neural network multi-task hyper-parameter optimization method. The method comprises: first, performing model training on the data training set of each task to obtain a multi-task learning network model; second, predicting all points in the unexplored region and screening candidate points from the prediction result; finally, evaluating the screened candidate points, adding the candidate points and their objective function values to the data training set, and repeating the modeling, prediction, screening, and evaluation steps until the maximum number of iterations is reached. The candidate point corresponding to the maximum objective function value is then selected from the data training set; this point constitutes the hyper-parameter combination of each task in the multi-task learning network model. According to the method, the Gaussian model is replaced by a radial basis function neural network model, which is combined with multi-task learning and applied in the Bayesian optimization algorithm to realize hyper-parameter optimization, so that the computational cost of hyper-parameter optimization is greatly reduced. The invention further discloses an electronic device and a storage medium.
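The iterative loop described above (fit a surrogate, predict over unexplored points, screen a candidate, evaluate it, augment the training set, and finally return the best-scoring point) can be sketched in miniature. The sketch below is illustrative only: it uses a one-dimensional toy objective, a simple Gaussian-kernel RBF interpolant as the surrogate in place of a full radial basis function neural network, and a purely greedy screening rule; the function names, the objective, and all parameter values are assumptions, not part of the invention.

```python
import numpy as np

# Hypothetical toy objective standing in for a task's validation score
# as a function of one hyper-parameter; its maximum is at x = 0.3.
def objective(x):
    return -(x - 0.3) ** 2 + 1.0

def rbf_fit(X, y, gamma=10.0, reg=1e-8):
    """Fit RBF interpolation weights; the centers are the training points."""
    d2 = (X[:, None] - X[None, :]) ** 2          # pairwise squared distances
    Phi = np.exp(-gamma * d2)                    # Gaussian kernel matrix
    return np.linalg.solve(Phi + reg * np.eye(len(X)), y)

def rbf_predict(X_train, w, X_new, gamma=10.0):
    """Predict objective values at new points from the fitted surrogate."""
    d2 = (X_new[:, None] - X_train[None, :]) ** 2
    return np.exp(-gamma * d2) @ w

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 5)            # initial evaluated hyper-parameters
y = objective(X)                    # their objective function values

for _ in range(20):                 # fixed maximum number of iterations
    w = rbf_fit(X, y)               # (re)build the surrogate model
    cand = rng.uniform(0, 1, 200)   # predict over points in the unexplored region
    pred = rbf_predict(X, w, cand)
    x_new = cand[np.argmax(pred)]   # screen: keep the best predicted point
    X = np.append(X, x_new)         # evaluate it and add to the training set
    y = np.append(y, objective(x_new))

best = X[np.argmax(y)]              # final hyper-parameter choice
```

In a multi-task setting each point `x` would instead be a vector of hyper-parameter combinations shared across tasks, and the surrogate would be the radial basis function neural network trained on all tasks' data; the control flow, however, follows the same build-predict-screen-evaluate cycle.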