
Cloud data center load prediction method based on LSTM (Long Short-Term Memory)

A cloud data center load prediction technology based on long short-term memory (LSTM), applied in the field of cloud computing, which addresses problems such as the inability to allocate resources optimally and achieves the effects of short training time and high learning efficiency

Inactive Publication Date: 2018-06-15
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

In view of the current problem that resources in the cloud data center cannot be allocated optimally, it is necessary to propose a model based on an LSTM neural network, trained on a large amount of historical data of the task request sequence and the per-request resource application sequence for each priority level, so that these two indicators can be accurately predicted for a period of time in the future.



Examples


Embodiment Construction

[0050] The implementation process of the present invention and the points to note are elaborated below. As mentioned above, there are six indicators to be predicted in the cloud data center, and most of the algorithm applies to all six of them; wherever a step is handled differently for a particular type of prediction, this is noted explicitly. The algorithm is written in Python and imports TensorFlow, the data analysis package pandas, the numerical computation package numpy, and matplotlib.pyplot for plotting. In this part, the indicator to be predicted is referred to as "H"; the forecasting procedure for the other five indicators is essentially the same.
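As a minimal sketch of the environment described above (the import aliases are conventional choices, not taken from the patent text), the setup could look as follows:

import tensorflow as tf            # builds and trains the LSTM network
import pandas as pd                # reads and manipulates the historical CSV records
import numpy as np                 # numerical array operations
import matplotlib.pyplot as plt    # plots predicted versus actual load curves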

[0051] S1. Construct the historical time series and datasets from the data stored in files;

[0052] Historical data is usually stored in CSV files. To predict H, the first step is to read the historical data of H from the file to form a time series...
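A possible sketch of this step, under assumed names ("history.csv" and the column label "H" are hypothetical placeholders; the real file layout is not given in the excerpt), reads the records with pandas and slices them into supervised training pairs:

import numpy as np
import pandas as pd

def load_series(csv_path="history.csv", column="H"):
    # Read the historical records of indicator H as a 1-D float series.
    df = pd.read_csv(csv_path)
    return df[column].astype(np.float32).values

def make_dataset(series, window=10):
    # Slice the series into (input window, next value) pairs for training.
    x, y = [], []
    for i in range(len(series) - window):
        x.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(x), np.array(y)

The window length of 10 is an illustrative assumption and would be tuned to the sampling interval of the load records.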



Abstract

The invention discloses a cloud data center load prediction method based on LSTM (Long Short-Term Memory), and aims to solve the problem that the limited computing resources of a cloud data center cannot be optimally utilized. The method comprises the following steps: taking the massive historical records of the cloud data center as a basis to build training samples and testing samples; constructing a neural network connected by LSTM units; and continuously feeding training samples in batches to obtain output values. The network is optimized with the adaptive moment estimation (Adam) method, the parameters in each unit are continuously updated through iterative training, and a global optimum is reached when training finishes. After training, only the testing sample needs to be input into the network to obtain the next predicted value of the sample sequence, and if the input sequence is continuously updated with the predicted values, a sequence of predicted values for a future period of time can be obtained.
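As a minimal sketch of the workflow the abstract describes, using the Keras API in TensorFlow (the layer width, window length, and training settings are illustrative assumptions, not values from the patent):

import numpy as np
import tensorflow as tf

def build_model(window=10):
    # A network of LSTM units followed by a dense layer producing the next-step value.
    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(32, input_shape=(window, 1)),
        tf.keras.layers.Dense(1),
    ])
    # Adaptive moment estimation (Adam), as named in the abstract.
    model.compile(optimizer="adam", loss="mse")
    return model

def forecast(model, seed, steps):
    # Feed each prediction back into the input window to predict several steps ahead.
    window = list(seed)
    out = []
    for _ in range(steps):
        x = np.array(window[-len(seed):], dtype=np.float32).reshape(1, len(seed), 1)
        nxt = float(model.predict(x, verbose=0)[0, 0])
        out.append(nxt)
        window.append(nxt)
    return out

Training is then a standard model.fit call on the (input window, next value) pairs, after which forecast(model, last_window, steps) yields the predicted value sequence for a future period of time.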

Description

Technical field

[0001] The invention relates to the technical field of cloud computing, in particular to a cloud data center load forecasting method based on a long short-term memory network.

Background technique

[0002] Cloud computing is a model for the addition, use, and delivery of Internet-based services. It usually involves dynamically scalable and often virtualized resources provided over the Internet, through which computing resources and computation results can be supplied on demand to a large number of users with different priority levels. The resources in the cloud data center are usually provided to users dynamically on a pay-as-you-go basis.

[0003] According to the definition of the National Institute of Standards and Technology (NIST), this mode provides available, convenient, on-demand network access to a shared pool of configurable computing resources (resources include networks, servers, storage, application software, services...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC (8): G06F9/50, G06K9/62, G06N3/08
CPC: G06F9/505, G06N3/08, G06F18/214
Inventors: 毕敬, 许伯睿, 乔俊飞
Owner: BEIJING UNIV OF TECH