Data center task scheduling method based on dynamic temperature prediction model

A prediction model and task scheduling technology, applied to resource allocation, multi-programming devices, energy-saving computing, and related fields; it addresses the problem of reduced prediction accuracy in the temperature model and achieves the effect of improved cooling efficiency.

Publication Date: 2015-01-28 (status: Inactive)
南京大学镇江高新技术研究院
Cites: 4 · Cited by: 12

AI Technical Summary

Problems solved by technology

However, as the system operates, the change in ambient temperature caused by the heat dissipation of the computing equipment has an increasingly large impact on the parameters of the temperature prediction model, reducing its prediction accuracy.



Examples


Example Embodiment

[0017] The key concepts, definitions and symbols of the present invention:

[0018] 1. Stable state: in computation-intensive cluster applications, computation lasts a long time. After a sufficient period of time, the ambient temperature and airflow of the cluster tend to stabilize; this state is called the stable state.

[0019] 2. Temperature prediction model: a two-dimensional matrix A that, given the current CPU usage distribution c of all computing nodes, predicts the inlet temperature distribution of all computing nodes in the stable state by computing Ac + b, where b is a constant parameter vector.
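As a minimal sketch of this prediction step (assuming NumPy; the dimensions and the values of A and b below are illustrative placeholders, not the patent's calibrated parameters), the steady-state inlet temperatures can be computed as:

```python
import numpy as np

# Illustrative only: n computing nodes, placeholder model parameters.
n = 20                              # e.g. 4 cabinets x 5 rack layers
A = np.full((n, n), 0.05)           # influence of each node's CPU load on every inlet temperature
b = np.full(n, 18.0)                # baseline inlet temperature (deg C) at zero load

def predict_inlet_temps(cpu_usage: np.ndarray) -> np.ndarray:
    """Predict the steady-state inlet temperature of every node from the CPU usage vector c."""
    return A @ cpu_usage + b

c = np.random.rand(n)               # current CPU usage of the n nodes, in [0, 1]
t_pred = predict_inlet_temps(c)
print(t_pred.max())                 # the highest predicted inlet temperature
```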

[0020] Take the system environment shown in Figure 1 as an example: 4 cabinets are used, each cabinet has 5 layers of racks, and each cabinet is powered by its own power strip; the horizontal blowing of the cooling fans represents the cold-aisle blowing mechanism; a temperature sensor is installed at the ventilation entrance of each cabinet to detect the e...
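Purely as an illustration of this layout (the class and variable names below are assumptions for the sketch, not terms from the patent), the 4-cabinet, 5-rack-layer topology with one inlet sensor per cabinet might be represented as:

```python
from dataclasses import dataclass

@dataclass
class Node:
    cabinet: int        # 0..3: which of the 4 cabinets
    rack_level: int     # 0..4: which of the 5 rack layers
    cpu_usage: float    # current CPU usage, in [0, 1]

# 4 cabinets x 5 rack layers = 20 computing nodes
nodes = [Node(cabinet=c, rack_level=r, cpu_usage=0.0)
         for c in range(4) for r in range(5)]

# One temperature sensor at each cabinet's ventilation entrance (placeholder readings, deg C)
inlet_sensor_readings = {cabinet: 20.0 for cabinet in range(4)}
```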



Abstract

The invention discloses a data center task scheduling method based on a dynamic temperature prediction model. The method comprises the steps of first initializing the parameters of a temperature prediction model; obtaining information such as the actual inlet temperature and CPU (Central Processing Unit) usage rate of the computation nodes; updating the parameters of the temperature prediction model with an updating algorithm; and scheduling each batch of incoming tasks. The method overcomes the disadvantages and defects of applying a static temperature prediction model to task scheduling and is applicable to online task scheduling in computation-intensive data centers. Its advantages are that temperature and airflow are taken into account during task scheduling and the parameters of the temperature prediction model are dynamically adjusted according to the feedback of the temperature sensors, so that the highest inlet temperature among all computation nodes is kept as low as possible, the cooling efficiency of the cooling system is improved, and the goal of substantial energy saving is achieved.
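To make the flow of these steps concrete, the sketch below strings them together in Python. It is a simplified placeholder, not the patent's algorithm: the gradient-style parameter update and the greedy lowest-peak-temperature assignment stand in for the updating algorithm and the scheduling policy that the patent itself defines.

```python
import numpy as np

def update_model(A, b, cpu_history, temp_history, lr=0.1):
    """Placeholder update: nudge A and b toward observed inlet temperatures
    with one gradient step on the squared prediction error per observation."""
    for c, t_obs in zip(cpu_history, temp_history):
        err = (A @ c + b) - t_obs          # prediction error for this observation
        A -= lr * np.outer(err, c)
        b -= lr * err
    return A, b

def schedule_batch(A, b, cpu_usage, task_loads):
    """Placeholder scheduler: place each task on the node that minimizes the
    highest predicted inlet temperature after the assignment."""
    cpu = cpu_usage.copy()
    assignment = []
    for load in task_loads:
        best_node, best_peak = 0, float("inf")
        for i in range(len(cpu)):
            trial = cpu.copy()
            trial[i] += load               # tentatively add the task's CPU load to node i
            peak = (A @ trial + b).max()   # highest predicted inlet temperature
            if peak < best_peak:
                best_node, best_peak = i, peak
        cpu[best_node] += load
        assignment.append(best_node)
    return assignment, cpu
```

In an online setting, sensor readings and CPU usage would be collected each cycle, fed to update_model, and the refreshed A and b used by schedule_batch for the next batch of incoming tasks.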

Description

technical field

[0001] The present invention relates to a task scheduling method for large-scale clusters such as data centers, and in particular to a task scheduling method that, by establishing a more accurate temperature prediction model at the software-system level, takes into account the influence of temperature and airflow on the cooling of the cluster, so that the cooling efficiency of the cooling system can be improved and substantial energy savings achieved.

Background technique

[0002] In the era of big data, the data centers of the world consume about 1.5% of global electricity, and the heat dissipation energy consumption of the cooling system accounts for 50% of a data center's total energy consumption. The cooling efficiency of the cooling system is very low due to the "hot spot" phenomenon caused by factors such as heat backflow. Task scheduling methods that consider the temperature factor mainly aim to avoid hot spots fro...

Claims


Application Information

IPC(8): G06F9/50
CPC: Y02D10/00
Inventor: 姜志刚, 钱柱中, 陆桑璐
Owner: 南京大学镇江高新技术研究院