Revenue-driven large-scale processing task scheduling method in a cloud environment

A processing-task scheduling technology, applied in the field of distributed computing, which addresses shortcomings of existing schemes such as prolonged task scheduling length and non-optimal mapping schemes incurred when reducing resource leasing overhead, and achieves the effect of reducing resource leasing costs while meeting performance requirements

Status: Inactive | Publication Date: 2013-04-03
BEIJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

However, the pay-as-you-go billing model of cloud computing makes the resource leasing expense incurred by processing these tasks a factor that must be considered.
Document 3 (Jian Li, Sen Su, Xiang Cheng, Qingjia Huang, Zhongbao Zhang, "Cost-Conscious Scheduling for Large Graph Processing in the Cloud," in Proceedings of the 13th International Conference on High Performance Computing and Communications, Banff, Canada, Sep. 2-4, 2011, pp. 808-813) addresses this problem by establishing a scheduling model for large-scale graph data processing tasks in a cloud computing environment and designing a task scheduling algorithm that reduces cost. The following problems remain, however: the algorithm prolongs the scheduling length of the tasks while reducing resource leasing overhead, and the mapping scheme it obtains is not the optimal solution, so there is still considerable room for improvement in its solution quality.




Detailed Description of the Embodiments

[0042] The present invention is described in further detail below in conjunction with the accompanying drawings:

[0043] Targeting the powerful computing capacity and flexible pricing model of cloud computing, the present invention establishes a scheduling model for large-scale graph data processing tasks, designs a multi-objective optimization function over execution time and resource leasing cost according to Pareto optimality theory, and then proposes a revenue-driven large-scale processing task scheduling method for the cloud environment: a new large-scale graph processing task scheduling method based on particle swarm optimization (Large Graph Processing based on Particle Swarm Optimization in the Cloud, abbreviated LGPPSO).
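
The optimization function itself is not reproduced at this point in the available text. As a rough illustration only, the sketch below shows one common way to express the two objectives named here, execution time and resource leasing cost, as a Pareto-dominance test over candidate schedules; the Candidate, makespan, and leasing_cost names are hypothetical and not the patent's own notation.

```python
# Hypothetical sketch: Pareto dominance over the two objectives named in the
# patent (execution time and resource leasing cost). Names are illustrative.
from dataclasses import dataclass

@dataclass
class Candidate:
    makespan: float       # total execution time of the schedule
    leasing_cost: float   # total resource leasing cost of the schedule

def dominates(a: Candidate, b: Candidate) -> bool:
    """True if schedule a is no worse than b in both objectives
    and strictly better in at least one (Pareto dominance)."""
    no_worse = a.makespan <= b.makespan and a.leasing_cost <= b.leasing_cost
    strictly_better = a.makespan < b.makespan or a.leasing_cost < b.leasing_cost
    return no_worse and strictly_better

def pareto_front(candidates: list[Candidate]) -> list[Candidate]:
    """Keep only the non-dominated schedules."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]
```

Under this reading, a schedule is Pareto-optimal when no other candidate is at least as good in both objectives and strictly better in one.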

[0044] The formal description of the task scheduling problem for large-scale graph data processing is as follows:

[0045] (1) Cloud computing virtual resource billing model: The underlying provider of c...
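
The billing-model paragraph is truncated in the available text. Purely as an assumption about what a pay-as-you-go virtual-resource billing model typically looks like, the sketch below charges each leased virtual machine per started hour at a type-specific rate; the HOURLY_RATE values and the hourly granularity are illustrative placeholders, not figures from the patent.

```python
import math

# Illustrative per-hour rates for a few VM types; placeholder values,
# not figures from the patent.
HOURLY_RATE = {"small": 0.10, "medium": 0.20, "large": 0.40}

def leasing_cost(vm_type: str, busy_seconds: float) -> float:
    """Pay-as-you-go cost of one VM lease, assuming billing per started hour."""
    billed_hours = math.ceil(busy_seconds / 3600)
    return billed_hours * HOURLY_RATE[vm_type]
```

Whatever the exact granularity, the total leasing cost of a schedule would then be the sum of this quantity over all leased virtual machines, which is the cost term the multi-objective function above trades off against execution time.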



Abstract

The invention provides a resource-cost-driven scheduling method for large-scale graph data processing tasks in a cloud computing environment, and belongs to the field of distributed computing. The method comprises the following steps: 1) read in the large-scale graph data processing task graph, traverse it according to the hierarchical relationship of the graph-structured tasks, number the tasks by level, and compute the total number of tasks n; 2) read in the performance and pricing model of the virtual machines available in the current cloud computing environment; 3) initialize the encodings of m particles as well as their positions Xi and flight velocities Vi, and set the maximum number of iterations to T; 4) compute the fitness function value f(Xi) of each particle from its current encoding, using the fitness function given below; 5) perform velocity updates and position updates according to the fitness function values of the particles.
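
Read as pseudocode, steps 3 to 5 of the abstract describe a discrete particle swarm search over task-to-virtual-machine assignments. The sketch below follows that outline under stated assumptions: each particle position Xi is taken to encode an assignment of the n tasks to virtual machine indices, the fitness function is passed in as a black box (the abstract only says it is computed from the particle's current encoding), and the velocity and position updates use the conventional PSO formulas with inertia and acceleration coefficients, which the abstract does not spell out. The function name lgppso_sketch and all parameter defaults are hypothetical.

```python
import random

def lgppso_sketch(n_tasks, n_vms, fitness, m=30, T=100, w=0.7, c1=1.5, c2=1.5):
    """Hypothetical sketch of the PSO loop in steps 3-5 of the abstract.
    `fitness` is assumed to score a task->VM assignment (lower is better),
    e.g. by trading off execution time against leasing cost."""
    # Step 3: initialise m particles (positions X_i and velocities V_i).
    X = [[random.randrange(n_vms) for _ in range(n_tasks)] for _ in range(m)]
    V = [[0.0] * n_tasks for _ in range(m)]
    pbest = [x[:] for x in X]                  # personal best positions
    pbest_f = [fitness(x) for x in X]          # and their fitness values
    g = min(range(m), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]   # global best

    for _ in range(T):                         # at most T iterations (step 3)
        for i in range(m):
            # Step 5: standard velocity and position updates, with the
            # continuous position rounded back onto a valid VM index.
            for d in range(n_tasks):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = int(round(X[i][d] + V[i][d])) % n_vms
            # Step 4: re-evaluate fitness and update personal/global bests.
            f = fitness(X[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = X[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = X[i][:], f
    return gbest, gbest_f
```

Steps 1 and 2 of the abstract (reading in the task graph and the virtual machines' performance and pricing) would supply n_tasks, n_vms, and the data the fitness function needs to estimate execution time and leasing cost for a given assignment.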

Description

Technical Field
[0001] The invention belongs to the field of distributed computing, and in particular relates to a revenue-driven scheduling method for large-scale processing tasks in a cloud environment.
Background Technique
[0002] In recent years, with the popularization of the Internet and the spread of Web 2.0 technology, many applications involve large-scale graph data processing, such as traffic route maps, scientific literature citation graphs, and social networks (see G. Malewicz, M.H. Austern, A.J. Bik, J.C. Dehnert, I. Horn, N. Leiser, and G. Czajkowski, "Pregel: a system for large-scale graph processing," SIGMOD '10, pp. 135-146, 2010, and R. Chen, X. Weng, B. He, and M. Yang, "Large graph processing in the cloud," SIGMOD '10, pp. 1123-1126, 2010). Owing to ever-increasing graph sizes, the demand for computing power far exceeds the processing capacity of local data centers. At this point it becomes necessary to continuously increase the infrastructure investmen...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/46; G06F9/455
Inventors: 苏森 (Su Sen), 双锴 (Shuang Kai), 李健 (Li Jian), 徐鹏 (Xu Peng), 王玉龙 (Wang Yulong)
Owner: BEIJING UNIV OF POSTS & TELECOMM