
Cloud computing resource pre-distribution achievement method

An implementation method for pre-allocating cloud computing resources, applied in the field of cloud computing resource management, addressing problems such as large processing delays, failure to meet users' resource requirements, and reduced user satisfaction.

Inactive Publication Date: 2013-05-08
CSG EHV POWER TRANSMISSION +1

AI Technical Summary

Problems solved by technology

Traditional allocation methods allocate resources in real time based on users' actual requests. Such allocation has potential problems: if the entire system is busy or requests arrive in sudden bursts, processing delays become large; moreover, users' resource requirements may not be met, which greatly reduces user satisfaction.
Therefore, traditional allocation methods often require users to submit resource requests long in advance, which is inconvenient for users.
In addition, the purchase of physical resources relies mostly on the administrator's personal experience, and there is no dedicated mechanism to provide relevant reference information to the administrator.

Method used




Embodiment Construction

[0033] The content of the present invention will be described in further detail below in conjunction with the accompanying drawings and specific embodiments.

[0034] Example:

[0035] The present invention proposes a combined resource prediction method based on time series. After user request history records are obtained, the number of user requests per day is counted by virtual machine type, a time series is established for each virtual machine type, and a combined forecasting model is then used to predict the requests of the next stage. The implementation steps of the present invention are described in detail below in conjunction with the accompanying drawings.
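As a rough illustration of this counting step (not the patent's own implementation), the sketch below builds the day-by-VM-type request matrix from a raw request log; the record layout of (request_date, vm_type) tuples is an assumed, simplified schema.

```python
# Minimal sketch: build one daily time series per virtual machine type.
# The log format (one record per request) is assumed for illustration.
from collections import defaultdict
from datetime import date

def build_request_matrix(request_log):
    """request_log: iterable of (request_date: date, vm_type: str) tuples.

    Returns (days, vm_types, matrix) where matrix[i][j] is the number of
    requests for vm_types[j] on days[i], i.e. one time series per VM type.
    """
    counts = defaultdict(int)                      # (day, vm_type) -> count
    for request_date, vm_type in request_log:
        counts[(request_date, vm_type)] += 1

    days = sorted({d for d, _ in counts})
    vm_types = sorted({t for _, t in counts})
    matrix = [[counts[(d, t)] for t in vm_types] for d in days]
    return days, vm_types, matrix

# Example usage with a toy log:
log = [(date(2012, 9, 1), "small"), (date(2012, 9, 1), "small"),
       (date(2012, 9, 1), "large"), (date(2012, 9, 2), "small")]
days, vm_types, matrix = build_request_matrix(log)
print(vm_types)   # ['large', 'small']
print(matrix)     # [[1, 2], [0, 1]]
```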

[0036] According to the analysis of the problem, in order to establish the matrix of time and virtual machine type, the historical data must first be preprocessed and the model then trained. This processing flow can be divided into several steps: data acquisition, data cleaning, model training, data forecasting, and result processing. A sketch of the cleaning step follows.
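The abstract describes the data-cleaning step as filtering out requests from users whose total number of requests falls below a preset threshold. The following is a minimal sketch of such a filter under the assumption that each record carries a user identifier; the threshold value of 5 is an arbitrary illustrative choice.

```python
from collections import Counter

def clean_requests(request_log, min_requests=5):
    """request_log: iterable of (user_id, request_date, vm_type) tuples.

    Drops all requests from users who made fewer than `min_requests`
    requests in total, since such sparse users contribute noise rather
    than a usable trend to the per-day time series.
    """
    per_user = Counter(user_id for user_id, _, _ in request_log)
    return [rec for rec in request_log if per_user[rec[0]] >= min_requests]
```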



Abstract

The invention provides a method for realizing pre-allocation of cloud computing resources. The method analyzes the design and realization of a model in detail: by forecasting likely user requests, it helps an information technology (IT) administrator deploy various virtual machines in advance to satisfy unexpected requests, and at the same time forecasts likely physical resource demands for specific resource types so that the corresponding resources can be purchased in advance. The method comprises: data collection; data cleaning, which filters out requests from users whose number of requests is smaller than a preset number; model training, which builds a combined forecast model based on a matrix of time and virtual machine types, then continuously feeds in the cleaned data and trains the model over a time window until the window is filled and the model converges; data forecasting, which predicts the resource requests of a preset time period based on the trained combined forecast model; and result processing, in which, for a preset forecast time, the forecast results include the total required quantity of each type of virtual machine, the required quantity of virtual machines in the next cycle, and the required quantity of a specific physical resource.
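The abstract does not specify which individual forecasters make up the combined model. Purely as an illustrative sketch, the code below combines a moving average and simple exponential smoothing with inverse-error weights computed over a sliding time window, applied to one VM type's daily request series; the window length and smoothing factor are assumed values, not parameters taken from the patent.

```python
# Illustrative "combined" forecast for one VM type's daily request counts.
def moving_average(series, window=7):
    w = series[-window:] if len(series) >= window else series
    return sum(w) / len(w)

def exp_smoothing(series, alpha=0.3):
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def combined_forecast(series, window=7):
    """Predict the next value of `series` by error-weighted combination."""
    # Score each component by its absolute one-step error over the last window.
    err_ma = err_es = 1e-9
    for i in range(max(1, len(series) - window), len(series)):
        err_ma += abs(series[i] - moving_average(series[:i]))
        err_es += abs(series[i] - exp_smoothing(series[:i]))
    # Lower past error -> higher weight in the combination.
    w_ma, w_es = 1 / err_ma, 1 / err_es
    return (w_ma * moving_average(series) +
            w_es * exp_smoothing(series)) / (w_ma + w_es)

print(round(combined_forecast([3, 4, 4, 5, 6, 7, 8, 9]), 2))
```

Result processing, as described in the abstract, would then aggregate such per-type forecasts into the total demand per VM type, the demand for the next cycle, and, via each type's resource footprint, the demand for specific physical resources.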

Description

Technical field

[0001] The invention belongs to the field of cloud computing resource management, and in particular relates to a method and system for realizing pre-allocation of cloud computing resources.

Background technique

[0002] In the data center, the allocation of virtual machine resources is a basic problem for IT administrators maintaining cloud computing environments. Traditional allocation methods allocate resources in real time based on users' actual requests. Such allocation has potential problems: if the entire system is busy or requests arrive in sudden bursts, processing delays become large; moreover, users' resource requirements may not be met, which greatly reduces user satisfaction. Therefore, traditional allocation methods often require users to submit resource requests long in advance, which is inconvenient for users. In addition, the purchase of physical resources relies mostly on the administrator's personal experience, and there is no dedicated mechanism to provide relevant reference information to the administrator...

Claims


Application Information

IPC(8): G06F9/50; H04L29/08
Inventors: 王朝硕, 周震震, 朱永虎, 田应富, 高锡明, 曾春, 朱义, 郭涑炜, 邢春晓
Owner CSG EHV POWER TRANSMISSION