Task hierarchy scheduling method and device based on execution time prediction

A technology relating to execution time prediction and task scheduling, applicable to multiprogramming devices, program startup/switching, program control design, etc., addressing problems such as load balancing of computing resources

Pending Publication Date: 2021-12-10
物产中大公用环境投资有限公司

AI Technical Summary

Problems solved by technology

[0005] The present invention provides a task-level scheduling method based on execution time prediction, which aims to solve the existing ...



Examples


Embodiment 1

[0050] As shown in Figure 1, a task-level scheduling method based on execution time prediction includes the following steps:

[0051] S110. Create a task scheduling model, where the task scheduling model includes a leaf queue, and the leaf queue includes a plurality of task sets;

[0052] S120. Obtain the feature vector of each task set, and predict the execution time required for each task set according to the feature vector and the pre-built time prediction model, so as to obtain the execution time of each sub-queue;

[0053] S130. Score each of the sub-queues according to a preset scoring mechanism, and select a sub-queue with a high score and a task set with a short execution time in the sub-queue for scheduling.

[0054] According to Embodiment 1, it can be seen that a multi-layer scheduling model is established according to the queue priority and queue resource limit set by the user. The whole model consists of multiple non-leaf queues and leaf queues. The task set submi...
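As a rough illustration of this multi-layer model, the sketch below (Python) represents a queue hierarchy whose leaf queues hold task sets and aggregates predicted completion times from the bottom up. The class and function names (TaskSet, Queue, predict_time, queue_predicted_time) and the use of a simple sum for the bottom-up aggregation are illustrative assumptions; the patent does not specify these details.

```python
# Minimal sketch of the hierarchical scheduling model of Embodiment 1.
# All names and the aggregation rule are assumptions for illustration.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class TaskSet:
    name: str
    feature_vector: List[float]            # features fed to the time prediction model
    predicted_time: Optional[float] = None


@dataclass
class Queue:
    name: str
    priority: float                         # user-configured queue priority
    resource_limit: float                   # user-configured queue resource limit
    children: List["Queue"] = field(default_factory=list)
    task_sets: List[TaskSet] = field(default_factory=list)  # non-empty only for leaf queues

    @property
    def is_leaf(self) -> bool:
        return not self.children


def predict_time(task_set: TaskSet, model) -> float:
    """Predict the execution time of one task set from its feature vector,
    using a pre-built regression model (e.g. the stacked model of Embodiment 2)."""
    task_set.predicted_time = float(model.predict([task_set.feature_vector])[0])
    return task_set.predicted_time


def queue_predicted_time(queue: Queue, model) -> float:
    """Bottom-up aggregation: a leaf queue's predicted completion time is taken
    here as the sum over its task sets, and a non-leaf queue's as the sum over
    its children (the actual aggregation rule is not specified in the text)."""
    if queue.is_leaf:
        return sum(predict_time(ts, model) for ts in queue.task_sets)
    return sum(queue_predicted_time(child, model) for child in queue.children)
```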

Embodiment 2

[0058] As shown in Figure 2, a task-level scheduling method based on execution time prediction includes:

[0059] S210. Create a task scheduling model, where the task scheduling model includes a leaf queue, and the leaf queue includes multiple task sets;

[0060] S220. Acquire the feature vector of each task set, and predict the execution time required for each task set according to the feature vector and the pre-built time prediction model, so as to obtain the execution time of each leaf queue;

[0061] S230. Combine multiple base learners into a new base learner according to an ensemble learning method;

[0062] S240. Use a regression algorithm, taking the output of the new base learner as the input of a secondary learner, to construct a time prediction model, where the ensemble learning includes the new base learner and the secondary learner;

[0063] S250. Score each sub-queue according to a preset scoring mechanism, and select a sub-queue with a high score and a...
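Steps S230 and S240 describe what is commonly called a stacking ensemble: several base learners are combined, and a regression-based secondary (meta) learner is fitted on their outputs to form the time prediction model. Below is a minimal sketch using scikit-learn; the choice of random forest and gradient boosting as base learners and Ridge regression as the secondary learner is an assumption, since the patent does not name specific algorithms.

```python
# Hedged sketch of the stacked time prediction model (Embodiment 2).
# Base learners and the secondary learner are assumed choices.
from sklearn.ensemble import (GradientBoostingRegressor, RandomForestRegressor,
                              StackingRegressor)
from sklearn.linear_model import Ridge


def build_time_prediction_model() -> StackingRegressor:
    base_learners = [
        ("rf", RandomForestRegressor(n_estimators=100)),
        ("gbr", GradientBoostingRegressor()),
    ]
    # The secondary learner regresses on the base learners' predictions.
    return StackingRegressor(estimators=base_learners, final_estimator=Ridge())


# Usage: X holds task-set feature vectors, y the observed execution times.
# model = build_time_prediction_model()
# model.fit(X_train, y_train)
# predicted_seconds = model.predict(X_new)
```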

Embodiment 3

[0067] As shown in Figure 3, a task-level scheduling method based on execution time prediction includes:

[0068] S310. The task scheduling model further includes a root queue and non-leaf queues, where the root queue, the non-leaf queues, and the leaf queues form a tree structure; each sub-queue is scored according to a preset scoring mechanism and, starting from the root queue, the tree is traversed from top to bottom to select the sub-queues with high scores;

[0069] S320. Select a task set with the shortest execution time from the sub-queue for scheduling;

[0070] S330. The task set further includes a plurality of tasks, and the execution time of each task on a given computing node is predicted according to the time prediction model, where a node is the carrier on which tasks are processed;

[0071] S340. Set the objective function min(y + load). When the objective function converges, select the nearest node and perform task scheduling according to the network...
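A hedged sketch of the selection logic in Embodiment 3, reusing the Queue/TaskSet helpers from the sketch under Embodiment 1: the queue tree is walked from the root downward, following the highest-scoring child until a leaf queue is reached, the task set with the shortest predicted execution time is chosen from that leaf, and a node is picked by minimizing y + load. The concrete scoring formula and the node interface (predict_task_time, load) are assumptions for illustration, and the network-distance criterion mentioned in S340 is omitted here.

```python
# Illustrative top-down selection (Embodiment 3); the scoring formula and the
# node interface are assumptions, not the patent's exact definitions.

def score(queue, predicted_time: float) -> float:
    # Assumed scoring: prefer higher priority and shorter predicted completion time.
    return queue.priority / (1.0 + predicted_time)


def select_leaf_queue(root, model):
    """Starting from the root queue, descend into the highest-scoring child
    at each level until a leaf queue is reached."""
    queue = root
    while not queue.is_leaf:
        queue = max(queue.children,
                    key=lambda q: score(q, queue_predicted_time(q, model)))
    return queue


def select_task_and_node(leaf, nodes, model):
    """Pick the task set with the shortest predicted execution time, then the
    node minimizing y + load, where y is the predicted execution time of that
    task set on the node and load is the node's current load."""
    task_set = min(leaf.task_sets, key=lambda ts: predict_time(ts, model))
    node = min(nodes, key=lambda n: n.predict_task_time(task_set) + n.load)
    return task_set, node
```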



Abstract

The invention discloses a task hierarchy scheduling method and device based on execution time prediction. The method comprises the following steps: building a multi-layer scheduling model according to the queue priority and queue resource limits set by the user; when a computing node has idle resources, obtaining the features of each task set in each leaf queue and predicting the completion time of the current task set under the current system load and leaf-queue resource limits; calculating the predicted completion time of each queue from bottom to top according to the hierarchical model; then, starting from the root queue, scoring the sub-queues from top to bottom according to the formulated scoring mechanism, sorting them by score, selecting the top-ranked queue, and repeating this operation until a leaf queue is reached; and, within the selected leaf queue, selecting the task set with the relatively shortest predicted completion time for scheduling. The selected task set comprises a plurality of tasks, and the time required to complete each task on each node is predicted.

Description

Technical field

[0001] The present invention relates to the technical field of big data processing task scheduling, and in particular to a task-level scheduling method and device based on execution time prediction.

Background art

[0002] Every company that stores user data needs to analyze the stored data to make decisions that determine the direction of the business. With small amounts of data, traditional data analysis tools can do the job. In recent years, however, as data of all kinds has grown, once the data volume reaches the TB and PB levels the processing capacity of traditional data processing tools falls short. At this point companies turn to big data analysis technology to process the data; that is, big data tasks of different types and numbers, such as Spark tasks, are run on a computing cluster to process user data. When cluster resources are sufficient and the number of computing tasks is small, Spark's original de...


Application Information

IPC(8): G06F9/48, G06N20/00
CPC: G06F9/4881, G06F9/485, G06N20/00
Inventor: 陈健, 陈天祥
Owner: 物产中大公用环境投资有限公司