
A data center scheduling system and method

A data center scheduling system and method, applied to digital transmission systems, transmission systems, data exchange networks, and the like. It addresses the problems of high scheduling complexity, high computing cost, and heavy computing resource consumption, and achieves the effects of easily obtainable information, low algorithm complexity, and optimal throughput.

Active Publication Date: 2021-10-15
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0006] In view of the defects of the prior art, the purpose of the present invention is to remedy the deficiencies of existing data center network scheduling strategies. For example, under the fat-tree topology, improper data-flow scheduling leaves some paths seriously congested, the scheduling complexity is extremely high, and the consumption of computing resources is excessive. Among the scheduling schemes based on the network boundary, the Fastpass scheme has a high computational cost, and once the data demand prediction of the Mordia scheme is inaccurate, it leads to the technical problem of increased delay.



Examples


Embodiment Construction

[0056] In order to make the object, technical solution, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below may be combined with one another as long as they do not conflict.

[0057] In view of the deficiencies of existing data center network scheduling strategies, the present invention provides a low-complexity scheduling system based on the fat-tree topology, dedicated to improving the throughput and delay performance of the data center network.
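As a rough illustration of the kind of structure such a scheduler works with (an assumption based on the Abstract below, not a detail stated in this paragraph), scheduling over the fat tree amounts to selecting sets of server-to-server paths that do not share any link. The minimal Python sketch below shows one way such a conflict-free property could be checked; the path representation and all names are illustrative, not taken from the patent.

```python
from typing import Iterable, List, Tuple

Link = Tuple[str, str]   # a directed link, e.g. ("srv0", "edge0"); illustrative representation
Path = List[Link]        # a path is an ordered list of links

def is_conflict_free(paths: Iterable[Path]) -> bool:
    """Return True if no directed link is used by more than one path."""
    used = set()
    for path in paths:
        for link in path:
            if link in used:
                return False   # two paths would contend for the same link
            used.add(link)
    return True

# Example: two uplink paths through different edge switches do not conflict.
p1 = [("srv0", "edge0"), ("edge0", "agg0")]
p2 = [("srv1", "edge1"), ("edge1", "agg1")]
print(is_conflict_free([p1, p2]))   # True
print(is_conflict_free([p1, p1]))   # False
```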

[0058] In order to achieve the above object, the present invention is achieved through the following technical sol...



Abstract

The invention discloses a data center scheduling system and method. By generating a conflict-free path set over the fat-tree topology, the scheduling system determines the time and path allocation scheme, thereby ensuring that time allocation and path allocation are unified. The system comprises the information interaction between the central controller and the data layer, and the scheduling strategy of the central controller. The central controller collects virtual queue length information from the servers, determines the scheduling scheme according to the virtual queue lengths and a randomly selected conflict-free path set, and then feeds the scheduling scheme back to the servers and the uplink switches. The scheduling system designed in the present invention requires only the virtual queue lengths at the current moment and the conflict-free path set used at the previous moment; it generates the conflict-free path set for the current moment by randomly replacing part of the paths, which can achieve zero delay in network transmission. The path replacement strategy is related only to the virtual queue lengths, and optimal throughput can be achieved by designing the replacement probability appropriately.
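To make the scheduling strategy described above more concrete, the following Python sketch mimics one scheduling step of the central controller: it takes the conflict-free path set of the previous slot and the current virtual queue lengths, and randomly replaces part of the paths. The exact form of the replacement probability, and all function and variable names, are assumptions for illustration only; the abstract states merely that the replacement probability depends on the virtual queue lengths.

```python
import random
from typing import Dict, List, Tuple

Link = Tuple[str, str]
Path = List[Link]

def schedule_step(prev_paths: Dict[str, Path],
                  candidate_paths: Dict[str, List[Path]],
                  vq_len: Dict[str, float],
                  beta: float = 1.0) -> Dict[str, Path]:
    """One illustrative controller step (names and probability form are assumptions).

    prev_paths      -- conflict-free path set used in the previous slot, keyed by flow
    candidate_paths -- alternative candidate paths available to each flow
    vq_len          -- current virtual queue length reported for each flow
    beta            -- tuning constant in the assumed replacement probability
    """
    new_paths = dict(prev_paths)
    for flow, q in vq_len.items():
        # Assumed rule: keep the current path with probability q / (q + beta), so
        # heavily backlogged flows tend to retain their path; otherwise replace it
        # with a randomly chosen candidate. A full implementation would also keep
        # the resulting path set conflict-free, which this sketch omits.
        keep_prob = q / (q + beta)
        if random.random() >= keep_prob and candidate_paths.get(flow):
            new_paths[flow] = random.choice(candidate_paths[flow])
    return new_paths
```

In this sketch the controller would then feed the returned path set back to the servers and uplink switches as the transmission scheme for the current slot, matching the feedback step described in the abstract.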

Description

Technical Field

[0001] The present invention relates to the field of data center networks, and more specifically, to a data center scheduling system and method.

Background Technique

[0002] With the rapid development of emerging technologies such as cloud computing, big data, and virtualization, data center traffic and bandwidth have grown exponentially. Cisco's latest report shows that cloud data center traffic will exceed 14 ZB in 2020, an increase of 262% compared to 2015. Behind the explosive growth of data volume and computing volume is the rapid growth of data storage and computing costs, which promotes the evolution of data centers from server rooms to ultra-large-scale deployments, and the number of data centers with hundreds of thousands or even millions of servers continues to increase.

[0003] For ultra-large-scale data centers, it is necessary to provide efficient interconnection among huge server clusters. However, the traditional three-tier data center netwo...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC (8): H04L12/803, H04L12/813, H04L12/863, H04L47/20
CPC: H04L47/125, H04L47/20, H04L47/50
Inventors: 罗晶晶, 喻莉, 陈雅梅
Owner: HUAZHONG UNIV OF SCI & TECH