Dispatching management method for task queue priority of large data set

A task queue and scheduling management technology, applied in the field of task queue priority scheduling management, which solves the problems of under-utilized worker nodes, reduced queue performance, and high load, and achieves the effects of preventing resource waste, using computing resources efficiently, and keeping the architecture simple.

Inactive Publication Date: 2016-04-20
HYLANDA INFORMATION TECH

AI Technical Summary

Problems solved by technology

[0003] At present, the problems faced in this situation are: with an ordinary queue, adjusting task priorities inside a list-type queue greatly reduces the queue's performance; with a simple multi-queue setup, several queues with different priorities must be created and priority scheduling is achieved by moving a task to another queue, but when the high-priority queue backs up, the other priority queues are completely blocked, so many worker nodes are left idle while some nodes become overloaded.

Method used

Detailed Description of the Embodiments

[0022] Hereinafter, the present invention will be described in detail with reference to the drawings and specific embodiments.

[0023] As shown in Figures 1 to 3, in the task queue priority scheduling management method for large data sets of the present invention, each node of the task acquisition end runs a cyclic worker thread for acquiring tasks, and the cyclic thread performs the following steps (a code sketch of this loop follows the list):

[0024] A. Start the thread and obtain the task queues, namely GetQueue in Figure 2;

[0025] B. Run different algorithms through the execution engine to calculate the number of tasks to acquire from each queue;

[0026] C. Acquire tasks from the queue server (QueueServer) corresponding to each task queue, namely getTask;

[0027] D. Execute the task, namely execute;

[0028] E. After the task is completed, judge whether the thread has been stopped; if not, loop back to step A to obtain the task queues again.
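The patent does not publish source code for this loop, so the following is a minimal Python sketch of steps A through E under assumed names: Worker, queue_server, get_queues, get_tasks and fetch_policy are hypothetical stand-ins for the node's cyclic thread, the QueueServer of step C, and the execution-engine algorithm of step B.

import threading
import time


class Worker(threading.Thread):
    """Cyclic task-acquisition thread run on each node (steps A-E)."""

    def __init__(self, queue_server, fetch_policy, stop_event):
        super().__init__(daemon=True)
        self.queue_server = queue_server  # hypothetical stand-in for the QueueServer (step C)
        self.fetch_policy = fetch_policy  # hypothetical stand-in for the execution-engine algorithm (step B)
        self.stop_event = stop_event

    def run(self):
        while not self.stop_event.is_set():          # E: repeat until the thread is stopped
            queues = self.queue_server.get_queues()  # A: GetQueue - obtain the task queues
            counts = self.fetch_policy(queues)       # B: number of tasks to take from each queue
            for queue_name, n in counts.items():
                for task in self.queue_server.get_tasks(queue_name, n):  # C: getTask
                    task()                           # D: execute the task
            time.sleep(0.1)                          # brief pause before the next cycle

A worker on each node would be started with a concrete queue_server client and a fetch_policy callable, and stopped by setting stop_event.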

[0029] The priority of tasks is preset in the task queue. The priority of the queue is c...


Abstract

The invention discloses a dispatching management method for task queue priority of a large data set. According to the dispatching management method, a cyclic thread for task acquisition runs on each node of the task acquisition end, and tasks are acquired from different queues according to the priority of the queues: tasks are preferentially acquired from the high-priority queues, while it is also ensured that low-priority tasks are not blocked by high-priority tasks. The queue service can support multiple queues of multiple types, the priority of each queue can be set, and the average running time of a queue can be obtained from its run records. The number of tasks acquired from each queue is dynamically adjusted according to multiple conditions including the task priority, the queue length, the average running time of each queue, and the maximum number of tasks acquired each time, so that tasks are dispatched reasonably and resource waste is prevented.
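The abstract names the inputs to the per-queue fetch-count calculation (queue priority, queue length, average running time of each queue, and the maximum number of tasks acquired each time) but not an exact formula. The Python sketch below is only an illustrative weighting built from those inputs; allocate_fetch_counts and its dictionary keys are assumptions, not the patented algorithm.

def allocate_fetch_counts(queues, max_tasks_per_fetch):
    """queues: list of dicts with 'name', 'priority', 'length' and 'avg_runtime' keys."""
    # Weight grows with priority and backlog and shrinks with average running time,
    # so low-priority queues still receive a share and are not starved.
    weights = {
        q["name"]: q["priority"] * q["length"] / max(q["avg_runtime"], 1e-6)
        for q in queues
        if q["length"] > 0
    }
    total = sum(weights.values())
    if total == 0:
        return {}
    return {
        name: min(round(max_tasks_per_fetch * w / total), max_tasks_per_fetch)
        for name, w in weights.items()
    }


# Example: the high-priority queue receives more slots, but the low-priority queue is not blocked.
print(allocate_fetch_counts(
    [{"name": "high", "priority": 3, "length": 500, "avg_runtime": 2.0},
     {"name": "low", "priority": 1, "length": 200, "avg_runtime": 1.0}],
    max_tasks_per_fetch=10,
))  # -> {'high': 8, 'low': 2}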

Description

Technical field

[0001] The invention relates to the technical field of big data processing, and in particular to a scheduling management method for task queue priorities of big data sets.

Background technique

[0002] With the advent of the big data era, big data has developed its own unique architecture and has directly promoted the development of storage, network and computing software technologies. As the smallest unit of data processing, tasks have grown explosively in number, and likewise the task queues that carry them face the problem of scheduling by task priority.

[0003] At present, the problems faced in this situation are: with an ordinary queue, adjusting task priorities inside a list-type queue greatly reduces the queue's performance; with a simple multi-queue setup, several queues with different priorities must be created and priority scheduling is achieved by moving a task to another queue, but when high-priority queues accumulat...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/48, G06F9/50
CPC: G06F9/4881, G06F9/5038, G06F2209/5018
Inventor: 于鑫, 周祖胜
Owner: HYLANDA INFORMATION TECH