Resource management and job scheduling method and system combining pull mode and push mode

A technology for resource management and job scheduling, applied in resource allocation, electrical digital data processing, multi-programming devices, etc. It addresses problems such as reduced system operation efficiency and scalability, performance bottlenecks at the central scheduler, and increasing heterogeneity of system resources, and achieves the effects of improved resource utilization, a reduced bottleneck effect, and better scalability.

Active Publication Date: 2022-04-26
SUN YAT SEN UNIV
Cites: 1 | Cited by: 0

AI Technical Summary

Problems solved by technology

In a centralized architecture, the maintenance of job load information, the management of job queues, the maintenance of system resource information, and the scheduling and allocation of jobs to idle resources are all undertaken by a specific central server or daemon process. This architecture carries a serious hidden danger: the central scheduler can easily become the performance bottleneck and single point of failure of the entire system.
This is especially true as the system scale expands, workloads become more and more complex, and the heterogeneity of system resources further increases. If the centralized architecture continues to be used and the master node alone completes all scheduling tasks, the operational efficiency and scalability of the entire system will be seriously affected.



Embodiment Construction

[0043] As shown in Figure 1, the implementation steps of the resource management and job scheduling method combining the Pull mode and the Push mode in this embodiment include:

[0044] 1) Receive jobs;

[0045] 2) Analyze or identify the job and determine whether its type is a high-performance computing job or a big data processing job. It should be noted that different types of jobs can be stored together or separately as needed, and the storage form, such as a queue or a linked list, can also be chosen as needed;

[0046] 3) Schedule the different types of jobs. The high-performance computing jobs obtained by scheduling are distributed in Push mode: computing nodes are assigned to the high-performance computing jobs, and the jobs are pushed to the assigned computing nodes for execution. The big data processing jobs obtained by scheduling are distributed in Pull mode: wait for the job request... (a minimal sketch of these steps follows this list)
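The sketch below illustrates steps 1) to 3) with a simplified in-memory scheduler. It is only an interpretation of the patent text under stated assumptions: the names Job, HybridScheduler, push_to_node and the node-selection logic are hypothetical and are not taken from the patent.

```python
from collections import deque
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Job:
    job_id: int
    kind: str                      # "hpc" or "bigdata", decided in step 2)
    payload: dict = field(default_factory=dict)

def push_to_node(node: str, job: Job) -> None:
    # Hypothetical transport call standing in for the real Push-mode delivery.
    print(f"pushed job {job.job_id} to node {node}")

class HybridScheduler:
    def __init__(self) -> None:
        # Step 2): different job types may be stored separately, e.g. in queues.
        self.hpc_queue = deque()
        self.bigdata_queue = deque()

    def receive(self, job: Job) -> None:
        # Steps 1)-2): receive the job and store it according to its type.
        if job.kind == "hpc":
            self.hpc_queue.append(job)
        else:
            self.bigdata_queue.append(job)

    def dispatch_hpc(self, idle_nodes: list) -> None:
        # Step 3), Push mode: the scheduler assigns a computing node and
        # pushes the high-performance computing job to it for execution.
        while self.hpc_queue and idle_nodes:
            job = self.hpc_queue.popleft()
            node = idle_nodes.pop()      # simple node selection for the sketch
            push_to_node(node, job)

    def on_job_request(self, node: str) -> Optional[Job]:
        # Step 3), Pull mode: a big data processing job is handed out only
        # when a computing node actively sends a job request.
        if self.bigdata_queue:
            return self.bigdata_queue.popleft()
        return None

if __name__ == "__main__":
    sched = HybridScheduler()
    sched.receive(Job(1, "hpc"))
    sched.receive(Job(2, "bigdata"))
    sched.dispatch_hpc(["node-a"])           # Push mode: master-initiated
    job = sched.on_job_request("node-b")     # Pull mode: node-initiated
```

In this arrangement the master only initiates transfers for high-performance computing jobs, while big data processing jobs are handed out on demand, which is how the method disperses the workload otherwise concentrated on the master node.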



Abstract

The invention discloses a resource management and job scheduling method, system and medium combining a Pull mode and a Push mode. The invention includes analyzing or identifying the type of each job and scheduling jobs by type. High-performance computing jobs are distributed in Push mode: computing nodes are allocated for the high-performance computing job, and the job is pushed to the assigned computing nodes for execution. Big data processing jobs are distributed in Pull mode: the system waits for job requests from computing nodes, and when a computing node actively sends a job request, a big data processing job is sent to that node for execution. The invention disperses the workload carried by the master node in traditional systems, reduces the resulting bottleneck effect, and improves the resource utilization rate of the system. The invention has the advantages of good versatility, high resource utilization, high system throughput and good scalability.
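To complement the abstract, the sketch below shows the computing-node side of the Pull mode under the same assumptions as the earlier scheduler sketch; the request_job and run_locally helpers are hypothetical placeholders, not part of the patent text.

```python
import time

def request_job(scheduler_addr: str, node_id: str):
    # Hypothetical helper standing in for the node-initiated job request to
    # the scheduler; a real system would use RPC or a message queue here.
    return None

def run_locally(job) -> None:
    # Hypothetical helper: execute the received big data processing job
    # on this computing node.
    print(f"running job {job}")

def pull_worker_loop(scheduler_addr: str, node_id: str, idle_wait: float = 1.0) -> None:
    """Pull-mode loop: the node actively asks for work instead of waiting to be pushed to."""
    while True:
        job = request_job(scheduler_addr, node_id)
        if job is not None:
            run_locally(job)
        else:
            time.sleep(idle_wait)   # no job available: back off before asking again
```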

Description

Technical field

[0001] The invention relates to the field of resource management and job scheduling for large-scale computer systems, and in particular to a resource management and job scheduling method, system and medium combining a Pull mode and a Push mode.

Background technique

[0002] Resource management and job scheduling have always been a challenging problem in large-scale computing systems. The scale of computing systems keeps growing; for example, the Sunway TaihuLight supercomputer consists of 40,000 computing nodes, and its number of processor cores reaches tens of millions. Computing systems also increasingly show heterogeneity; for example, each node of the Summit supercomputer, currently the fastest, contains 6 GPUs. Since large-scale computing has become the basic means of promoting technological progress in various industries, a large number of jobs with different characteristics are submitted to the computing system, leading to disorderly competition for resources. In the above ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/48; G06F9/50
CPC: G06F9/4881; G06F9/5016; G06F9/5027; G06F2209/501; G06F9/505
Inventors: 陈志广, 卢宇彤
Owner: SUN YAT SEN UNIV