
Cloud server CPU load prediction method and system based on denoising and error correction and medium

A cloud server load forecasting technology, applied to instruments, resource allocation, program control design, and similar fields, which solves the problem that existing methods lack universal adaptability and achieves the effects of eliminating the influence of human factors and providing strong adaptability.

Pending Publication Date: 2021-03-16
SOUTH CHINA UNIV OF TECH
Cites: 1 | Cited by: 0

AI Technical Summary

Problems solved by technology

However, implementing this method depends on an understanding of natural laws and on experience, and it therefore lacks universal adaptability.

Method used



Examples


Embodiment

[0052] In this embodiment, the main steps of the cloud server CPU load prediction method based on denoising and error correction are as follows: decompose and reconstruct the time series of the machine's average CPU utilization, then use two prediction models in parallel to obtain a predicted value of the average CPU utilization and a predicted residual term, and sum the two to correct the error. The steps are carried out in the order of sequence decomposition, reconstruction, prediction, and correction; chaining these parts together yields a complete prediction method and realizes a prediction model.
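As a rough illustration only, the decompose-predict-correct pipeline described in paragraph [0052] could be organized as in the Python sketch below. The smoother and the two predictors are simple placeholders (a moving average and short-window means), not the models actually claimed in the patent.

```python
# Hypothetical sketch of the decompose -> predict -> correct pipeline from
# paragraph [0052]. The smoother and the two predictors are placeholders,
# not the models actually claimed in the patent.
import numpy as np

def reconstruct_denoised(series: np.ndarray) -> np.ndarray:
    """Stand-in for sequence decomposition and reconstruction (step S1 uses
    CEEMDAN); here a simple moving average plays that role."""
    kernel = np.ones(5) / 5
    return np.convolve(series, kernel, mode="same")

def predict_value(history: np.ndarray) -> float:
    """Placeholder primary predictor for the next average CPU utilization."""
    return float(history[-3:].mean())

def predict_residual(residual_history: np.ndarray) -> float:
    """Placeholder residual predictor fed with historical prediction errors."""
    return float(residual_history[-3:].mean())

# toy average CPU utilization trace (percent)
cpu_util = np.array([32.0, 35.1, 33.8, 40.2, 42.5, 41.0, 44.3, 46.8, 45.2, 48.9])
denoised = reconstruct_denoised(cpu_util)

# collect historical residuals of the primary predictor
residuals = np.array([denoised[t] - predict_value(denoised[:t])
                      for t in range(5, len(denoised))])

# corrected forecast = primary prediction + predicted residual term
raw_forecast = predict_value(denoised)
corrected_forecast = raw_forecast + predict_residual(residuals)
print(f"raw: {raw_forecast:.2f}%, corrected: {corrected_forecast:.2f}%")
```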

[0053] As shown in Figure 1, the cloud server CPU load prediction method based on denoising and error correction in this embodiment specifically includes the following steps:

[0054] S1. Use the complete empirical mode decomposition method with adaptive white noise (CEEMDAN) to decompose the sequence and obtain each decomposition sequen...
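The visible text does not say which CEEMDAN implementation is used. As an assumption, the third-party PyEMD package (installed as EMD-signal) provides one, and step S1 could then be sketched roughly as follows.

```python
# Assumed illustration of step S1 using the third-party PyEMD package
# (pip install EMD-signal); the patent does not specify an implementation.
import numpy as np
from PyEMD import CEEMDAN

# synthetic stand-in for a machine's average CPU utilization time series
t = np.linspace(0, 1, 256)
cpu_util = 40 + 10 * np.sin(2 * np.pi * 5 * t) + np.random.normal(0, 2, t.size)

ceemdan = CEEMDAN()
imfs = ceemdan(cpu_util)  # each row is one decomposition sequence (IMF)
print(f"obtained {imfs.shape[0]} decomposition sequences of length {imfs.shape[1]}")
```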



Abstract

The invention discloses a cloud server CPU load prediction method and system based on denoising and error correction, and a medium. The method comprises the steps of: using the complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN) method to decompose a sequence into decomposition sequences; calculating the curve curvature similarity between each decomposition sequence and the original sequence; distinguishing effective sequences from noise sequences according to the similarity, filtering out the noise sequences, and fitting again to obtain a new noise-filtered sequence; and predicting an error value from historical error data to complete the error correction. The prediction method denoises the original data, reduces the influence of noise, corrects the error, and eliminates the influence of human factors; it can improve prediction accuracy and has relatively strong universality.
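The abstract does not give the exact curvature measure, similarity metric, or threshold used to separate effective sequences from noise sequences. The sketch below assumes a finite-difference curvature estimate, a Pearson-correlation similarity, and an arbitrary cut-off, purely to make the filtering idea concrete.

```python
# Hedged sketch of the curvature-similarity filtering idea in the abstract.
# The curvature estimate, similarity metric, and threshold are assumptions.
import numpy as np

def discrete_curvature(y: np.ndarray) -> np.ndarray:
    """kappa = |y''| / (1 + y'^2)^(3/2), estimated with finite differences."""
    dy = np.gradient(y)
    d2y = np.gradient(dy)
    return np.abs(d2y) / (1.0 + dy ** 2) ** 1.5

def curvature_similarity(component: np.ndarray, original: np.ndarray) -> float:
    """Pearson correlation between the curvature curves of the two series."""
    return float(np.corrcoef(discrete_curvature(component),
                             discrete_curvature(original))[0, 1])

def filter_noise(imfs: np.ndarray, original: np.ndarray,
                 threshold: float = 0.1) -> np.ndarray:
    """Keep decomposition sequences whose curvature similarity to the original
    exceeds the (assumed) threshold and sum them into a noise-filtered sequence."""
    kept = [c for c in imfs if curvature_similarity(c, original) > threshold]
    return np.sum(kept, axis=0) if kept else np.zeros_like(original)
```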

Description

Technical field

[0001] The invention belongs to the technical field of CPU load prediction, and in particular relates to a cloud server CPU load prediction method, system and medium based on denoising and error correction.

Background technique

[0002] In recent years, the rapid development of cloud computing technology and its applications has continuously expanded the user base of the data center industry and driven the upgrading of enterprise-level applications. The scale of cloud data centers has grown greatly, and the ensuing problems of resource utilization and energy consumption are receiving increasing attention. Energy saving in the data center is an important means for enterprises to maintain good returns and break through development bottlenecks. The CPU is the most important energy-consuming component, and CPU utilization is the most important factor affecting server power consumption. Researching...

Claims


Application Information

IPC(8): G06F9/50, G06N3/04
CPC: G06F9/505, G06N3/044, G06N3/045
Inventors: 林伟伟, 游德光
Owner: SOUTH CHINA UNIV OF TECH