
Layered resource reservation system under cloud computing environment

A resource reservation technology for cloud computing environments, applied in transmission systems, electrical components, etc. It addresses problems such as resource layers that do not adequately support reservation, the resulting failure to provide QoS guarantees, and the failure to account for the dynamics of grid applications, with the effect of ensuring normal, stable operation and eliminating mutual interference between applications.

Publication Date: 2011-04-13 (status: Inactive)
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0004] As can be seen from the above, current research on resource reservation in cloud computing and grid environments focuses mainly on unified reservation models, scheduling algorithms, demand forecasting, simulation tools and analyses of logical-layer scheduling models. If these models and algorithms are to be used for actual resource reservation, they must rely on a resource manager in the resource layer that supports reservation; if the resource layer does not support reservation well, the upper-layer reservation models and algorithms cannot ultimately deliver the QoS guarantees that resource reservation is meant to provide. Some existing reservation support covers only a single type of computing resource and does not consider reservation across different resource types, so special applications that require higher execution privileges cannot use the reservations provided. At the same time, these models and algorithms do not consider the dynamics of grid applications: once a reservation is granted, the resources and usage time are fixed and cannot be scaled up or down according to the application's actual operating conditions.


Embodiment Construction

[0015] The multi-level resource reservation system in a cloud computing environment of the present invention is based on the Linux 2.6 cpuset facility: it dynamically configures user-process scheduling domains on the resource nodes and thereby realizes reservation of CPU computing resources. The present invention is described in further detail below with reference to the accompanying drawings.
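The patent only states that Linux 2.6 cpusets are used to build per-application CPU scheduling domains; the Python sketch below shows one way such a reservation could be realized through the cpuset filesystem interface. The mount point, file names and release behaviour are assumptions based on the standard cgroup-v1 cpuset controller, not details taken from the patent.

import os

CPUSET_ROOT = "/sys/fs/cgroup/cpuset"  # assumed mount point of the cgroup-v1 cpuset controller

def reserve_cpus(name, cpus, mems="0"):
    """Create a cpuset (a CPU scheduling domain) whose members are pinned to `cpus`.
    `cpus` uses the kernel's list syntax, e.g. "2-3" or "0,4"."""
    path = os.path.join(CPUSET_ROOT, name)
    os.makedirs(path, exist_ok=True)
    with open(os.path.join(path, "cpuset.cpus"), "w") as f:
        f.write(cpus)
    with open(os.path.join(path, "cpuset.mems"), "w") as f:  # memory nodes must be set before tasks join
        f.write(mems)
    return path

def attach_process(cpuset_path, pid):
    """Move an application process into the reserved scheduling domain."""
    with open(os.path.join(cpuset_path, "tasks"), "w") as f:
        f.write(str(pid))

def release(cpuset_path):
    """Tear the reservation down. Member tasks must first be migrated back to the
    parent cpuset (omitted here); the empty directory can then be removed."""
    os.rmdir(cpuset_path)

if __name__ == "__main__":
    # Example (requires root): reserve CPUs 2-3 and place the current process in them.
    cs = reserve_cpus("reservation_demo", "2-3")
    attach_process(cs, os.getpid())

Running this requires root privileges; on cgroup-v2 systems the controller is mounted and named differently.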

[0016] As shown in figure 1, the system of the present invention is composed of a central reservation server, area reservation servers and multi-level resource pools. The system covers a plurality of areas; each area has an independent area reservation server, which is managed from above by the central reservation server and distributes resources downward to its multi-level resource pool. The central reservation server is divided into two layers: the reservation request response layer and the multi-area central reservation coordination layer.
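As an illustration of the three-tier structure described above, here is a hypothetical Python sketch of the roles of the central reservation server, the area reservation servers and the nested resource pools. All class and method names are invented for illustration; the patent describes the layers and their responsibilities but not a concrete API.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ComputeNode:
    """A schedulable node inside a multi-level resource pool."""
    hostname: str
    total_cpus: int
    reserved_cpus: int = 0

@dataclass
class ResourcePool:
    """Multi-level resource pool: pools may nest to form levels."""
    name: str
    nodes: List[ComputeNode] = field(default_factory=list)
    sub_pools: List["ResourcePool"] = field(default_factory=list)

@dataclass
class AreaReservationServer:
    """One per area: a logical scheduling layer plus a resource allocation layer."""
    area: str
    pool: ResourcePool

    def schedule(self, cpus_needed):
        """Logical scheduling layer: pick a node with enough free CPUs."""
        for node in self._walk(self.pool):
            if node.total_cpus - node.reserved_cpus >= cpus_needed:
                return node
        raise RuntimeError("no node in this area can satisfy the sub-request")

    def allocate(self, node, cpus_needed):
        """Resource allocation layer: commit the reservation on the chosen node."""
        node.reserved_cpus += cpus_needed

    def _walk(self, pool):
        yield from pool.nodes
        for sub in pool.sub_pools:
            yield from self._walk(sub)

@dataclass
class CentralReservationServer:
    """Reservation request response layer plus multi-area coordination layer."""
    areas: List[AreaReservationServer]

    def reserve(self, cpus_needed):
        """Forward the request as sub-requests to the areas and take the first fit."""
        for area in self.areas:
            try:
                node = area.schedule(cpus_needed)
                area.allocate(node, cpus_needed)
                return node
            except RuntimeError:
                continue
        raise RuntimeError("reservation request cannot be satisfied")

In this sketch a reservation is satisfied by the first area that can serve it; the patent itself does not prescribe a particular selection policy.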



Abstract

The invention discloses a layered resource reservation system for a cloud computing environment. The system consists of a central reservation server, area reservation servers and multi-level resource pools. The central reservation server comprises a reservation request response layer and a multi-area central reservation coordination layer: the reservation request response layer is responsible for issuing resource reservation requests and for accessing the reserved resources, while the multi-area central reservation coordination layer is responsible for receiving those requests. Each area reservation server is divided into a logical scheduling layer and a resource allocation layer: the logical scheduling layer receives resource reservation sub-requests and allocates resources logically, and the resource allocation layer executes the actual allocation and recovery of resources. Each multi-level resource pool comprises a number of schedulable computing nodes. The layered design adapts to the dynamic nature of the cloud computing platform and of cloud applications, eliminating the mutual interference that applications would otherwise suffer through resource competition and guaranteeing their normal, stable operation.
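To make the request flow in the abstract concrete, the following usage example continues the hypothetical sketch above: a reservation request enters the central server, is handed to an area's logical scheduling layer as a sub-request, and is finally committed by that area's resource allocation layer. The area names, node names and CPU counts are invented.

# Two areas, one node each (illustrative data only).
area_a = AreaReservationServer(
    area="area-A",
    pool=ResourcePool("pool-A", nodes=[ComputeNode("node-a1", total_cpus=8)]),
)
area_b = AreaReservationServer(
    area="area-B",
    pool=ResourcePool("pool-B", nodes=[ComputeNode("node-b1", total_cpus=16)]),
)
central = CentralReservationServer(areas=[area_a, area_b])

# The reservation request response layer receives the request; the multi-area
# coordination layer forwards a sub-request to each area in turn.
node = central.reserve(cpus_needed=4)
print(node.hostname, node.reserved_cpus)  # node-a1 4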

Description

Technical field

[0001] The invention belongs to the field of computer applications, and in particular relates to a layered resource reservation system in a cloud computing environment.

Background technique

[0002] With the continuous development of computer technology, more and more computing and storage resources are distributed around the world. The grid connects computers all over the world through the Internet and aggregates these distributed resources effectively to provide a base platform for scientific research and industrial production. Cloud computing is a new computing model proposed on the basis of grid computing and is a core technology of the next-generation network computing platform: it makes full use of the computing, storage, software and network resources of each computer to provide powerful processing capability, realizes comprehensive and transparent sharing of resources, and constitutes the infrastructure...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): H04L29/08
Inventor 金海吴松石宣化罗雅琴
Owner HUAZHONG UNIV OF SCI & TECH