
A fine-grained task scheduling method in cloud environment

A fine-grained task scheduling technology, applied in multiprogramming devices, program control design, instruments, etc., to achieve the effect of improving throughput and solving high-latency problems.

Active Publication Date: 2020-04-24
SOUTHEAST UNIV
Cites: 3 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0008] The technical problem to be solved by the present invention is to provide a fine-grained task scheduling method in a cloud environment that can effectively solve the high-delay problem of centralized scheduling of fine-grained tasks and improve throughput.




Embodiment Construction

[0029] As shown in Figure 1 and Figure 2, a fine-grained task scheduling method in a cloud environment includes the following steps:

[0030] (1) Divide the job into fine-grained tasks in a certain way, judge the priority and resource constraints of the fine-grained tasks, and schedule the tasks to different machines, and to different queues on those machines, according to priority and whether resources are limited. Each job submitted by a user is assigned a scheduler, which marks the job's architecture type and its priority. The job is divided into stages according to execution order, forming a directed acyclic graph of fine-grained tasks; each stage contains a task set of several tasks.
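The decomposition in step (1) can be sketched roughly as follows. The class names, the stage representation, and the per-task resource flag are illustrative assumptions; the patent does not fix concrete data structures, only that a job is split in execution order into a DAG of stages, each holding a set of fine-grained tasks that carry the job's marked architecture type and priority.

```python
from dataclasses import dataclass

@dataclass
class Task:
    task_id: str
    framework: str           # architecture type inherited from the job
    stage: int               # position in the job's DAG of stages
    priority: int            # inherited from the job's marked priority
    resource_limited: bool   # whether the task has a resource constraint

@dataclass
class Job:
    job_id: str
    framework: str           # e.g. "spark" or "mapreduce"
    priority: int
    stages: list             # stages in execution order; each is a list of payloads

def split_into_fine_grained_tasks(job):
    """Divide a job, stage by stage in execution order, into fine-grained
    tasks (a DAG: stage i must finish before stage i+1 starts)."""
    tasks = []
    for stage_idx, payloads in enumerate(job.stages):
        for i, _payload in enumerate(payloads):
            tasks.append(Task(
                task_id=f"{job.job_id}-s{stage_idx}-t{i}",
                framework=job.framework,
                stage=stage_idx,
                priority=job.priority,
                resource_limited=False,  # judged per task in a real scheduler
            ))
    return tasks
```

A scheduler would then dispatch these tasks to machines and queues using the (priority, resource-limited) pair, which this sketch leaves out.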

[0031] (2) Executors for the different architectures are preset on each machine; the preset architectures are the data-processing model of Spark and the data-processing model of MapReduce. After a machine receives a task, it distributes the task to the queue whose framework matches the task's architecture, where the task queues up and waits to be executed by the corresponding executor.
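The machine-side routing in step (2) might look like this minimal sketch: one FIFO queue per preset framework executor, with each task routed to the queue matching its architecture type. The FIFO policy and the executor loop are assumptions, not the patent's exact mechanism.

```python
from collections import deque

class Machine:
    """A worker machine with one queue per preset framework executor
    (Spark and MapReduce, as named in the patent)."""
    def __init__(self):
        self.queues = {"spark": deque(), "mapreduce": deque()}

    def receive(self, task, framework):
        # Route the task to the queue matching its architecture type;
        # it waits there until the corresponding executor picks it up.
        if framework not in self.queues:
            raise ValueError(f"no executor preset for framework {framework!r}")
        self.queues[framework].append(task)

    def run_executor(self, framework):
        # The framework's executor drains its own queue in arrival order.
        done = []
        queue = self.queues[framework]
        while queue:
            done.append(queue.popleft())
        return done
```

Because each machine queues and executes tasks locally, no central scheduler sits on the critical path of every fine-grained task, which is the decentralization the abstract credits with avoiding high latency.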



Abstract

The invention discloses a fine-grained task scheduling method in a cloud environment. The method comprises the following steps: 1) dividing jobs into fine-grained tasks according to a certain mode, judging the priorities and resource-limitation situations of the fine-grained tasks, and scheduling the tasks to different machines, and to different queues within the machines, according to the priorities and whether resources are limited; and 2) presetting different framework executors on each machine; after a machine receives a task, it distributes the task to the queue whose framework matches the task's, where the task queues up and waits to be executed by the executor. The method provides a decentralized scheduling approach for fine-grained tasks that effectively solves the problem of high delay when fine-grained tasks are scheduled in a centralized manner, avoids long-tail phenomena, and accordingly improves throughput.

Description

Technical field

[0001] The invention relates to the field of cloud computing resource allocation/scheduling, and in particular to a fine-grained task scheduling method in a cloud environment.

Background technique

[0002] Large-scale data analysis frameworks are increasingly biased toward shorter task execution times and higher parallelism in order to provide lower latency. Some high-performance applications need internal high-throughput services that satisfy thousands of user requests per second to optimize user experience, so responding to these requests with low latency is very important. For example, user-facing services can run more complex parallel computation, language translation, highly personalized search, etc.

[0003] Many data analysis frameworks already exist to analyze big data, such as Dremel, Impala, and Spark; they all continue to reduce response times, which can reach the second level.

[0004] Jobs composed of many extremely short sub-second tasks face gr...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F9/48; G06F9/50
CPC: G06F9/4881; G06F9/5027; G06F9/5072; G06F2209/484; G06F2209/503
Inventors: 李小平 (Li Xiaoping), 倪春泉 (Ni Chunquan), 朱夏 (Zhu Xia), 胡苇 (Hu Wei), 陈龙 (Chen Long)
Owner: SOUTHEAST UNIV