
Cloud data center energy-saving scheduling implementation method based on rolling grey prediction model

A cloud data center technology based on a grey prediction model, applied in the field of cloud computing energy-saving scheduling. It addresses defects in virtual-machine scheduling and consolidation strategies, with the effects of ensuring the cloud service experience, avoiding host overload or no-load conditions, and improving energy-efficiency indicators.

Active Publication Date: 2017-06-27
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

At present, research on cloud computing energy saving at home and abroad starts from different points and has certain deficiencies. In particular, workload monitoring and prediction for cloud data centers remain weak under bursty traffic, and existing virtual-machine scheduling and consolidation strategies are flawed.




Embodiment Construction

[0021] In order to make the technical solutions and advantages of the present invention clearer, a further detailed description is given below in conjunction with the accompanying drawings; the implementation and protection of the present invention are not limited thereto.

[0022] 1. Strategic Framework

[0023] 1.1 Cloud computing resource intelligent scheduling framework

[0024] Figure 1 shows the architecture of the cloud computing platform's resource intelligent scheduling framework, which is divided, from bottom to top, into the host layer, virtual machine layer, performance evaluation layer, scheduling layer, and user layer. The scheduling layer and the evaluation layer are the core of the entire energy-saving strategy framework. Each layer is explained below.

[0025] The host layer refers to all servers in the cloud data center, including all physical host nodes. These hardware devices form the bottom-level infrastructure of the cloud environment and provide us wi...
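The scheduling and evaluation layers above operate on per-host thermal states. As a rough illustration of how such a classifier and migration planner might look, here is a minimal Python sketch; the `HOT`/`COLD` thresholds, the `Host` fields, and the consolidation heuristic are all illustrative assumptions, not values taken from the patent:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical utilisation thresholds; the patent does not publish its cut-offs.
HOT, COLD = 0.80, 0.20

@dataclass
class Host:
    name: str
    predicted_util: float                      # next-interval load forecast in [0, 1]
    vms: List[str] = field(default_factory=list)

def classify(hosts: List[Host]):
    """Split hosts into hot / warm / cold bands by predicted utilisation."""
    hot  = [h for h in hosts if h.predicted_util >= HOT]
    cold = [h for h in hosts if h.predicted_util <= COLD]
    warm = [h for h in hosts if COLD < h.predicted_util < HOT]
    return hot, warm, cold

def plan_migrations(hosts: List[Host]) -> List[Tuple[str, str, str]]:
    """Toy consolidation plan: move VMs off hot and cold hosts onto warm ones.

    Hot hosts shed load to avoid overload; cold hosts drain so they can be
    powered down, keeping most hosts in a mild thermal state.
    """
    hot, warm, cold = classify(hosts)
    targets = sorted(warm, key=lambda h: h.predicted_util)
    plan = []
    for src in hot + cold:
        for vm in src.vms:
            if targets:
                plan.append((vm, src.name, targets[0].name))
    return plan
```

A real scheduler would also check that the target host has enough spare capacity before each move; this sketch only captures the classify-then-migrate shape of the strategy.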



Abstract

The invention discloses a cloud data center energy-saving scheduling implementation method based on a rolling grey prediction model. The method abstracts the energy-saving process of the cloud data center into a load prediction module, an error checking module, a thermal perception classification module, and a virtual machine scheduling module. The load prediction module predicts the workload of the data center at the next moment to obtain the load utilization rate of each host. The thermal perception classification module divides all hosts into thermal states according to the predicted load utilization rates, such that hosts in a hotter state have higher utilization and hosts in a cooler state have lower utilization. To keep most hosts in a relatively mild thermal state, the virtual machine scheduling module performs migration, consolidation, and other operations on the virtual machines of each host according to the thermal-state classification, finally achieving the goals of guaranteeing the service quality of the data center and reducing its energy consumption. The disclosed method also overcomes the low prediction accuracy that a traditional grey model suffers when some values are missing.
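The abstract's load prediction module relies on a rolling grey model. The page does not reproduce the patent's exact formulation, but the classical GM(1,1) model refit over a sliding window is the standard form of rolling grey prediction, so a minimal sketch under that assumption looks like:

```python
import numpy as np

def gm11_predict(x0):
    """One-step-ahead GM(1,1) grey forecast for a positive series x0."""
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                       # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])            # background values
    # Least-squares fit of the grey differential equation x0(k) = -a*z1(k) + b
    B = np.column_stack([-z1, np.ones(n - 1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    # Time-response function: x1_hat(k+1) = (x0(1) - b/a) * exp(-a*k) + b/a
    x1_next = (x0[0] - b / a) * np.exp(-a * n) + b / a
    x1_last = (x0[0] - b / a) * np.exp(-a * (n - 1)) + b / a
    return x1_next - x1_last                 # inverse AGO -> next x0 value

def rolling_gm11(series, window=5):
    """Rolling GM(1,1): refit on only the newest `window` points per forecast."""
    return [gm11_predict(series[t - window:t])
            for t in range(window, len(series))]
```

Refitting on only the newest `window` samples is what lets a rolling model track bursty workloads instead of averaging over stale history, which matches the patent's stated motivation.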

Description

Technical field

[0001] The invention belongs to the field of cloud computing energy-saving scheduling, and specifically relates to a cloud data center energy-saving scheduling implementation method based on a rolling grey prediction model.

Background technique

[0002] As an emerging technology in the information and communication technology industry, cloud computing has gradually entered millions of households. It is favored by enterprises and individual users for its high efficiency, low barrier to entry, and high scalability. With the gradual maturity of cloud computing and the continuous enrichment of user needs, the scale of supporting facilities such as data center servers is also growing rapidly. Large-scale cloud computing data centers containing tens of thousands of service nodes have been established all over the world, allowing more computing and storage resources to be placed in the cloud, but this has also caused a series of e...


Application Information

IPC(8): H04L29/08; H04L12/12; H04L12/24
CPC: H04L12/12; H04L67/1008; H04L41/145; H04L41/142; H04L67/61; Y02D30/50
Inventor: 刘发贵, 王彬
Owner SOUTH CHINA UNIV OF TECH