
Massive working node-oriented preemptive task scheduling method and system

A working-node and task-scheduling technology, applied to program startup/switching, resource allocation, inter-program communication, and the like. It addresses problems such as high algorithm complexity and the difficulty of realizing such scheduling with existing computing resources, with the effects of optimizing system performance and improving scheduling efficiency.

Inactive Publication Date: 2018-11-06
SICHUAN FEIXUN INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0009] 3. Limited number of working nodes
Although the complexity of this scheduling algorithm reaches the O(N log N) level, it is still difficult to implement in practical applications with existing computing resources.
[0016] At present, no publicly disclosed scheduling scheme has been found that keeps algorithm complexity low while combining fair scheduling of massive working nodes with priority-based preemptive task scheduling.

Method used



Embodiment Construction

[0043] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0044] The present invention discloses a preemptive task scheduling method oriented to massive working nodes; an embodiment of the method is shown in Figure 1 and includes:

[0045] S101: acquire a to-be-allocated task from the to-be-allocated task queue with the currently highest priority;

[0046] S102: judge whether there is an idle working node in the current idle working node pool;

[0047] S103: when there is an idle working ... (the dispatch flow described here is sketched in the code below).
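To make the dispatch flow concrete, here is a minimal sketch in Python of steps S101-S103, together with the reclaim-and-requeue behaviour described in the Abstract section below. All names (Task, WorkerNode, PreemptiveScheduler, dispatch, the number of priority levels) are illustrative assumptions rather than identifiers from the patent; the sketch assumes that priority level 0 is the highest and that each pending-task queue is a FIFO.

```python
from collections import deque
from dataclasses import dataclass, field

NUM_PRIORITY_LEVELS = 8  # assumed number of priority levels; level 0 is treated as the highest


@dataclass
class Task:
    name: str
    priority: int  # 0 = highest priority (assumed convention)


@dataclass
class WorkerNode:
    node_id: str
    running: list = field(default_factory=list)  # tasks currently executing on this node


class PreemptiveScheduler:
    def __init__(self, nodes):
        # One FIFO per priority level; S101 pops from the highest-priority non-empty queue.
        self.pending = [deque() for _ in range(NUM_PRIORITY_LEVELS)]
        self.idle_pool = list(nodes)   # idle working node pool (checked in S102)
        self.busy_pool = []            # busy working node pool

    def submit(self, task: Task) -> None:
        self.pending[task.priority].append(task)

    def dispatch(self) -> None:
        # S101: take a to-be-allocated task from the highest-priority pending queue.
        task = next((q.popleft() for q in self.pending if q), None)
        if task is None:
            return  # nothing to allocate

        # S102/S103: if an idle working node exists, assign the task to it.
        if self.idle_pool:
            node = self.idle_pool.pop()
            node.running.append(task)
            self.busy_pool.append(node)
            return

        # Otherwise look for a busy node running at least one task whose priority
        # is lower (numerically larger) than the new task's priority.
        for node in self.busy_pool:
            victims = [t for t in node.running if t.priority > task.priority]
            if victims:
                victim = max(victims, key=lambda t: t.priority)  # reclaim the lowest-priority one
                node.running.remove(victim)
                node.running.append(task)
                # Promote the reclaimed task by one priority level and requeue it.
                victim.priority = max(0, victim.priority - 1)
                self.pending[victim.priority].append(victim)
                return

        # No node can be preempted: put the task back at the head of its queue.
        self.pending[task.priority].appendleft(task)
```

How the target busy node and the reclaimed task are chosen within the busy pool is not fixed by the excerpt above; the sketch simply scans the pool and reclaims the lowest-priority eligible task on the first matching node.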



Abstract

The invention discloses a preemptive task scheduling method and system oriented to massive working nodes. The method comprises the steps of: obtaining a to-be-assigned task from the current to-be-assigned task queue with the highest priority; judging whether an idle working node exists in the current idle working node pool; if yes, assigning the to-be-assigned task to a working node selected from the idle working node pool; if no, selecting a target busy working node from the busy working node pool, where the priority of at least one task being executed by the target busy working node is lower than the priority of the to-be-assigned task; reclaiming a task of that busy working node whose priority is lower than the priority of the to-be-assigned task and assigning the to-be-assigned task to that busy working node; and raising the priority of the reclaimed task by one level and placing it in the to-be-assigned task queue of the corresponding priority. The method realizes scheduling of massive working nodes together with task priorities, reduces scheduling complexity, and improves scheduling efficiency.
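The final step of the abstract, raising the reclaimed task's priority by one level before requeueing it, is small enough to show on its own. A minimal sketch, assuming tasks are plain dicts and that a smaller number means a higher priority (both assumptions, not details fixed by the patent):

```python
from collections import deque

# Pending queues indexed by priority level; level 0 is treated as the highest.
pending_queues = {level: deque() for level in range(8)}

def requeue_reclaimed(task: dict) -> None:
    """Promote a reclaimed task by one priority level and put it back into the
    pending queue of that level, so it is less likely to be preempted again."""
    task["priority"] = max(0, task["priority"] - 1)
    pending_queues[task["priority"]].append(task)

# Example: a task reclaimed while running at priority 3 re-enters the queues at priority 2.
requeue_reclaimed({"name": "t42", "priority": 3})
```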

Description

Technical field

[0001] The invention relates to the field of task scheduling, in particular to a preemptive task scheduling method and system for massive numbers of working nodes.

Background technique

[0002] With the continuous development of the sharing economy and cloud computing, future task scheduling is no longer scheduling for a limited set of working nodes inside an enterprise, but scheduling for massive numbers of platform working nodes (millions to tens of millions). Task requirements have also shifted from tasks that can be planned or coordinated in advance to rigid, bidding-driven priority requirements.

[0003] Existing task scheduling methods mainly cover the following two aspects:

[0004] Scheduling a large number of working nodes: currently, an LFU algorithm is generally used for load balancing and preemptive scheduling.

[0005] Implementations based on task priority scheduling: these generally adopt the Completely Fair Scheduler (CFS) or use a priority sorting structure, su...
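For contrast with the proposed method, the load-balancing approach mentioned in [0004] can be illustrated with an LFU-style node selection that always hands the next task to the least frequently used node. The heap-based bookkeeping below is an illustrative assumption; the patent does not spell out the exact LFU variant it refers to.

```python
import heapq
import itertools

class LFUNodePool:
    """Select the working node that has received the fewest tasks so far."""

    def __init__(self, node_ids):
        self._ties = itertools.count()  # tie-breaker so equal counts pop in insertion order
        # Heap entries are (use_count, tie_breaker, node_id).
        self._heap = [(0, next(self._ties), n) for n in node_ids]
        heapq.heapify(self._heap)

    def pick(self) -> str:
        count, _, node_id = heapq.heappop(self._heap)
        # Reinsert with an incremented count so heavily used nodes become less preferred.
        heapq.heappush(self._heap, (count + 1, next(self._ties), node_id))
        return node_id

pool = LFUNodePool(["node-a", "node-b", "node-c"])
print([pool.pick() for _ in range(5)])  # node-a, node-b, node-c, node-a, node-b
```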

Claims


Application Information

IPC (8): G06F9/48; G06F9/50; G06F9/54
CPC: G06F9/4881; G06F9/5027; G06F9/5072; G06F9/546; G06F2209/484; G06F2209/5011; G06F2209/548
Inventor: 王衡
Owner: SICHUAN FEIXUN INFORMATION TECH CO LTD