Method for predicting resource performance of cloud server based on LSTM-ACO model

A cloud-server and model-prediction technology, applied in computing models, biological models, neural learning methods, etc., which can solve problems such as instability, large fluctuations in cloud server resource performance, and slow convergence speed.

Pending Publication Date: 2021-04-09
XIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

It solves the problem that traditional prediction methods have low accuracy on cloud server resource-performance data with large fluctuations.
It also proposes a time-series data calculation method that uses the ant colony algorithm to optimize parameters, overcoming the tendency to fall into local optima, the slow convergence, and the instability that arise during model prediction.



Examples


Detailed Description of the Embodiments

[0040] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0041] The method of the present invention for predicting cloud server resource performance based on the LSTM-ACO model comprises the following steps:

[0042] Step 1, collect resource and performance data of the cloud server.

[0043] Step 2, obtain cloud server resource and performance sequence data, the sequence data including: CPU idle rate, available memory, average load, and response time.
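Step 2 only names which metrics are collected; as a minimal sketch, assuming the monitoring samples are exported to a CSV file, the four indicators could be assembled into one multivariate series as follows (the file layout and column names are hypothetical, not taken from the patent):

```python
import pandas as pd

def load_resource_series(path: str) -> pd.DataFrame:
    # Read monitoring samples; "timestamp" and the metric column names are illustrative.
    df = pd.read_csv(path, parse_dates=["timestamp"])
    # Keep the four resource/performance indicators named in Step 2.
    cols = ["cpu_idle_rate", "available_memory", "average_load", "response_time"]
    return df.set_index("timestamp")[cols].sort_index()
```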

[0044] Step 3, perform preprocessing operations on the sequence data obtained in step 2.
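The abstract states that the preprocessing of Step 3 maps the original sequence data to the [0, 1] interval. A common way to do this is min-max normalization, sketched below together with a sliding-window split into inputs (the n previous moments) and targets (the current moment); the window length n and the train/test handling are assumptions, not taken from the patent text shown here:

```python
import numpy as np

def min_max_scale(x: np.ndarray):
    """Map each column of the series to [0, 1], as described in the abstract."""
    lo, hi = x.min(axis=0), x.max(axis=0)
    return (x - lo) / (hi - lo + 1e-12), (lo, hi)

def make_windows(x: np.ndarray, n: int):
    """Build supervised samples: inputs are the n previous moments, target is the current moment."""
    X = np.stack([x[i - n:i] for i in range(n, len(x))])   # shape (samples, n, features)
    y = x[n:]                                              # shape (samples, features)
    return X, y
```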

[0045] Step 4, use the data obtained in step 3 to construct an LSTM model, and use the model to obtain predicted values for that data.
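The text shown here does not fix the framework, layer sizes, or training settings for the LSTM of Step 4, so the following Keras sketch uses placeholder choices purely for illustration:

```python
from tensorflow.keras import layers, models

def build_lstm(n_steps: int, n_features: int, units: int = 64):
    # Single-layer LSTM followed by a dense head that predicts the next value of each metric.
    model = models.Sequential([
        layers.Input(shape=(n_steps, n_features)),
        layers.LSTM(units),
        layers.Dense(n_features),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# model = build_lstm(n, X.shape[-1])
# model.fit(X, y, epochs=50, batch_size=32, verbose=0)
# y_hat = model.predict(X)   # predicted values used in the later steps
```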

[0046] Step 5, use the ant colony algorithm to optimize the parameters of the LSTM model obtained in step 4, and construct the LSTM-ACO model.
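The patent's concrete ACO design for Step 5 (which LSTM parameters are optimized, the pheromone update rule, the evaluation metric) is not visible in the text shown here. The sketch below is therefore only an illustrative ant colony search over a small discrete grid of hyperparameters, where evaluate(params) is assumed to retrain the Step-4 LSTM with the sampled settings (e.g. number of units, window length, learning rate) and return a validation error:

```python
import numpy as np

def aco_search(candidates, evaluate, ants=8, iters=20, rho=0.5, Q=1.0):
    """candidates: dict name -> list of values; evaluate: params -> error (lower is better)."""
    names = list(candidates)
    tau = {k: np.ones(len(candidates[k])) for k in names}    # pheromone per candidate value
    best, best_err = None, np.inf
    for _ in range(iters):
        for _ in range(ants):
            # Each ant samples one value per parameter, proportional to pheromone.
            idx = {k: np.random.choice(len(candidates[k]), p=tau[k] / tau[k].sum())
                   for k in names}
            params = {k: candidates[k][idx[k]] for k in names}
            err = evaluate(params)
            if err < best_err:
                best, best_err = params, err
            for k in names:                                   # deposit pheromone on chosen values
                tau[k][idx[k]] += Q / (err + 1e-9)
        for k in names:                                       # evaporation
            tau[k] *= (1.0 - rho)
            tau[k] += 1e-6                                    # keep probabilities well defined
    return best, best_err
```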

[0047] Step 6. Use...



Abstract

The invention discloses a method for predicting the resource performance of a cloud server based on an LSTM-ACO model. The method first preprocesses the time-series data, mapping the original sequence data to the [0, 1] interval; it then determines an LSTM model, trains it on the existing data and makes predictions, and optimizes the LSTM model with an ant colony algorithm; finally, the LSTM model's prediction for time t, together with the data at times t-1, t-2, ..., t-n, is input into the LSTM-ACO model to predict the data at time t. The method addresses the low precision of traditional prediction methods; by optimizing the LSTM parameters with ACO it avoids the model becoming trapped in a local optimum and speeds up prediction convergence; it thereby realizes cloud server resource and performance prediction and predicts the software aging phenomenon more accurately.
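The final step of the abstract (feeding the LSTM's prediction for time t together with the observations at times t-1, ..., t-n into the ACO-optimized model) could be read roughly as follows; how exactly the prediction and the lagged window are combined is not specified in the text shown, so this is only one plausible interpretation with hypothetical helper names:

```python
import numpy as np

def predict_t(lstm_model, aco_model, history, n):
    """history: array of at least n past observations, shape (steps, features)."""
    window = history[-n:]                                # data at times t-n, ..., t-1
    lstm_pred = lstm_model.predict(window[None, ...])    # LSTM's estimate for time t
    # One plausible combination: append the LSTM estimate as an extra step of the
    # input window; the ACO-optimized model is assumed to be trained on such
    # combined windows of length n + 1.
    combined = np.vstack([window, lstm_pred])[None, ...]
    return aco_model.predict(combined)                   # final prediction for time t
```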

Description

Technical field

[0001] The invention belongs to the technical field of time series prediction, and in particular relates to a method for predicting cloud server resource performance based on a Long Short-Term Memory (LSTM) recurrent neural network and an ant colony optimization (ACO) model.

Background technique

[0002] With the development of modern computer technology and cloud computing, the use of cloud servers is becoming more and more common. Cloud servers are characterized by long run times, high complexity, and frequent resource exchange, which increases the risk of resource exhaustion and of software system anomalies and failures. As failures and resource consumption accumulate, a cloud server system experiences gradual performance degradation, an increasing failure rate, and eventually even crashes. This phenomenon is called "software aging". The main causes of software aging include the consumption of operating system resources, the corruption of data, and the accumulation of errors. Th...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F11/34, G06N3/00, G06N3/04, G06N3/08
CPC: G06F11/3433, G06N3/049, G06N3/08, G06N3/006, G06N3/045
Inventor: 孟海宁, 李维, 石月开, 童新宇, 冯锴, 朱磊, 黑新宏
Owner: XIAN UNIV OF TECH