Load prediction method and device and resource scheduling method and device for multiple cloud data centers

A load prediction technology for multiple cloud data centers, applied in the fields of cloud computing and data mining, to achieve the effect of improving prediction accuracy

Active Publication Date: 2021-08-06
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

This leads to dynamically changing computing-resource requirements for each data center, and to large differences in the load fluctuation trends of different data centers, so that the load predict...



Examples


Embodiment 1

[0069] The present invention provides a load prediction method and a resource scheduling method oriented to cloud multi-data centers. The system takes the log record files of each virtual machine, acquired from the cloud data centers, as input, preprocesses the data, and predicts the linear and nonlinear components of the load, so that the system can finally generate corresponding resource allocation and scheduling strategies according to the load prediction results.

[0070] A load forecasting method for cloud multi-data centers of the present invention, as shown in Figure 2, includes the following steps:

[0071] Step 1: first obtain the log record files of the virtual machines on the servers in the cluster from the cloud data center; each virtual machine's log file records the usage of the various resources occupied by that virtual machine at each time point. Extract the required feature quantities from the log record file and convert them into a time series data format that the s...
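Step 1 above can be sketched as follows. This is a minimal illustration only: the patent does not specify a log format, so the comma-separated `timestamp,vm_id,cpu_pct,mem_pct` layout, the field names, and the sample values are all assumptions made for the example.

```python
from datetime import datetime

# Hypothetical log records (format assumed, not specified in the patent text):
# "timestamp,vm_id,cpu_pct,mem_pct"
raw_log = [
    "2021-08-01T00:00:00,vm-1,35.0,52.0",
    "2021-08-01T00:05:00,vm-1,40.0,54.0",
    "2021-08-01T00:10:00,vm-1,55.0,60.0",
]

def to_time_series(lines, vm_id):
    """Extract one VM's (timestamp, cpu, mem) records, sorted by time."""
    series = []
    for line in lines:
        ts, vm, cpu, mem = line.split(",")
        if vm == vm_id:
            series.append((datetime.fromisoformat(ts), float(cpu), float(mem)))
    series.sort(key=lambda record: record[0])
    return series

series = to_time_series(raw_log, "vm-1")
# Historical load vector for the prediction models (here: CPU usage only).
cpu_history = [cpu for _, cpu, _ in series]
```

In practice each resource dimension (CPU, memory, I/O, and so on) would yield its own feature sequence; a single CPU vector is used here to keep the sketch short.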

Embodiment 2

[0135] Based on the same inventive concept as Embodiment 1, this embodiment of the present invention provides a load forecasting device for cloud multi-data centers, comprising:

[0136] A data processing module, used to obtain the log record file that records the resource usage of the virtual machine at each point in time, extract the required feature data and historical load data from it, and convert the feature data and historical load data into corresponding input feature sequences and historical load vectors;

[0137] A nonlinear component prediction module, used to calculate the nonlinear component of the load prediction with a pre-built neural network model, based on the obtained input feature sequence and historical load vector;

[0138] A linear component forecasting module, configured to use a pre-built autoregressive model to calculate a linear component of the load forecast based on the obtained historical load vector;

[0139] The prediction result calcul...
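The division of labor among the modules above can be sketched as follows. This is a minimal illustration, not the patent's actual model: a least-squares AR(p) fit stands in for the pre-built autoregressive model, a tiny fixed-weight network stands in for the pre-trained neural network, and the lag order, weights, and feature values are all invented for the example.

```python
import numpy as np

def ar_linear_component(history, p=3):
    """Linear component: fit AR(p) by least squares on lagged load
    vectors and predict one step ahead."""
    h = np.asarray(history, dtype=float)
    X = np.array([h[i:i + p] for i in range(len(h) - p)])  # lagged windows
    y = h[p:]                                              # next-step targets
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(h[-p:] @ coef)

def nn_nonlinear_component(features):
    """Nonlinear component: stand-in for the pre-trained neural network.
    A fixed-weight one-hidden-layer net, purely to illustrate the interface."""
    features = np.asarray(features, dtype=float)
    w1 = np.full((len(features), 4), 0.1)  # invented weights
    w2 = np.full(4, 0.25)
    hidden = np.tanh(features @ w1)
    return float(hidden @ w2)

# Integrate the two components into the final prediction.
history = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
linear = ar_linear_component(history)
nonlinear = nn_nonlinear_component([0.5, 0.2])
prediction = linear + nonlinear
```

On this perfectly linear toy history the AR part extrapolates the trend exactly, and the nonlinear term contributes a small correction on top, mirroring how the two modules' outputs are summed by the prediction result calculation module.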

Embodiment 3

[0142] Based on the same inventive concept as Embodiment 1, a resource scheduling device in this embodiment of the present invention includes:

[0143] A load prediction module, used to calculate the load prediction results of the virtual machines on each server in the cluster in the cloud multi-data-center environment based on the above method;

[0144] The resource scheduling module is configured to generate corresponding resource scheduling policies based on the load prediction results of the virtual machines on each server.
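The resource scheduling module can be sketched as a mapping from per-server load predictions to scheduling actions. The patent text does not specify the concrete policy, so the thresholds and action names below are invented for the example.

```python
# Hypothetical thresholds (percent utilization); not from the patent.
SCALE_UP_THRESHOLD = 80.0
SCALE_DOWN_THRESHOLD = 20.0

def generate_policy(predictions):
    """Map each server's predicted load (%) to a scheduling action."""
    policy = {}
    for server, predicted_load in predictions.items():
        if predicted_load > SCALE_UP_THRESHOLD:
            # Offload VMs from servers expected to run hot.
            policy[server] = "migrate-out"
        elif predicted_load < SCALE_DOWN_THRESHOLD:
            # Pack VMs together so idle capacity can be released.
            policy[server] = "consolidate"
        else:
            policy[server] = "keep"
    return policy

policy = generate_policy({"s1": 91.0, "s2": 12.0, "s3": 55.0})
```

Because the policy is driven by predicted rather than observed load, migrations can be scheduled before a server saturates, which is the point of coupling the scheduler to the forecasting modules.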

[0145] For the specific implementation scheme of each module of the device of the present invention, refer to the implementation process of each step of the method in Embodiment 1.

[0146] Those skilled in the art should understand that the embodiments of the present application may be provided as methods, systems, or computer program products. Accordingly, the present application may take the form of an entirely hardware embodiment, an entire...



Abstract

The invention discloses a load prediction method for multiple cloud data centers. The method comprises the following steps: acquiring a log record file recording the resource usage of a virtual machine at each time point, and extracting the required feature data and historical load data from the log record file; converting the feature data and the historical load data into corresponding input feature sequences and historical load vectors; calculating the nonlinear component of the load prediction using a pre-constructed neural network model; calculating the linear component of the load prediction using a pre-constructed autoregression model; and integrating the nonlinear and linear components of the load prediction to obtain the final load prediction result. The method comprehensively considers both the linear trend and the nonlinear characteristics of a load sequence over time in a multi-data-center environment, combines the neural network model with the statistical learning method of the autoregression model, and can effectively improve the prediction accuracy for future load.

Description

Technical field

[0001] The present invention relates to a load prediction method for cloud multi-data centers, and also to a resource scheduling method for cloud multi-data centers, belonging to the technical fields of cloud computing and data mining.

Background technique

[0002] The ever-increasing computing demand of cloud computing technology has driven the continuous expansion of cloud data centers. It is estimated that by the end of 2022 the domestic data center market will grow to 320 billion yuan, and its structure will shift from single data centers to cloud multi-data centers. To achieve green, energy-saving development, a data center needs the decision-making ability to dynamically adjust its internal resource allocation and to integrate computing resources for service elasticity. Effective load forecasting is a prerequisite for flexible resource allocation and ...

Claims


Application Information

IPC(8): G06F9/50, G06N3/04, G06N3/08
CPC: G06F9/505, G06F9/5077, G06N3/08, G06N3/047, G06N3/048, G06N3/044, Y02D10/00
Inventors: 徐小龙, 孙维
Owner: NANJING UNIV OF POSTS & TELECOMM