
A file cache-based scheduling method, device and computing device

A file-cache-based scheduling technology, applied in computing, multi-programming devices, program control design, etc., which addresses problems such as performance loss caused by migrating processes across NUMA nodes, and achieves the effect of meeting performance requirements in file-cache-heavy scenarios.

Active Publication Date: 2022-04-12
UNIONTECH SOFTWARE TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] In related technologies, during load balancing, the busiest subdomain in the current domain is found first, then the busiest CPU within that subdomain, and some processes are migrated to a currently idle CPU based on the calculated load in order to achieve balance. However, if the current load-balancing domain includes CPUs on remote NUMA nodes, processes may be migrated across nodes, resulting in performance loss.
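The related-art flow described in [0003] can be sketched roughly as follows. This is a simplified illustration, not the actual kernel implementation: the domain layout, load values, and function names are assumptions made for the example.

```python
# Simplified sketch of conventional load balancing as described above:
# find the busiest group in the domain, then the busiest CPU inside it.
# Note that the result ignores NUMA topology, so a subsequent migration
# to an idle CPU may cross nodes and lose file-cache locality.

def busiest_cpu(domain):
    """domain: {group_name: {cpu_id: load}} -> (cpu_id, load) of the busiest CPU."""
    # Step 1: the busiest subdomain is the group with the highest total load.
    busiest_group = max(domain.values(), key=lambda cpus: sum(cpus.values()))
    # Step 2: within that group, pick the single most loaded CPU.
    cpu = max(busiest_group, key=busiest_group.get)
    return cpu, busiest_group[cpu]

# Hypothetical two-node system: CPUs 0-1 on node0, CPUs 2-3 on node1.
domain = {
    "node0": {0: 30, 1: 90},
    "node1": {2: 10, 3: 20},
}

src, load = busiest_cpu(domain)
print(src, load)  # CPU 1 is busiest; migrating to idle CPU 2 would cross nodes
```

The sketch makes the problem visible: CPU 1 (node0) is chosen as the migration source while the idlest CPU (2) sits on node1, so a purely load-based decision moves the process away from its local memory and file cache.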




Embodiment Construction

[0026] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the disclosure will be thorough and complete, and will fully convey its scope to those skilled in the art.

[0027] The file-cache-based scheduling method of the present invention is executed in a computing device. The computing device can be any device with storage and computing capabilities: it may be implemented as a server or workstation, as a personal computer such as a desktop or notebook computer, or as a terminal device such as a mobile phone, tablet computer, smart wearable device, or IoT device,...


Abstract

The invention discloses a file-cache-based scheduling method, device and computing device. The method includes the steps of: among the CPUs of all nodes in a multi-core system, determining at least a process CPU and a migration target CPU based on CPU load; when the process CPU and the migration target CPU do not satisfy the load-balancing condition, traversing all processes on the process CPU and determining the process to be migrated according to the size of the file cache accessed by each process; calculating the distance between the node corresponding to the file cache accessed by the process to be migrated and the node where the migration target CPU is located; and migrating the process to be migrated to the migration target CPU when that distance is less than or equal to a distance threshold. The file-cache-based scheduling method of the present invention enables a process to keep fast access to its file cache, meeting performance requirements in scenarios with heavy file-cache usage.
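The steps in the abstract can be sketched as follows. This is a minimal illustration of the selection and distance-threshold logic only; the node-distance table, the threshold value, the process fields, and the function names are all assumptions made for the example, not taken from the patent.

```python
# Hypothetical sketch of the file-cache-aware migration decision:
# pick the process with the largest file-cache footprint, then allow
# migration only if the node holding its file cache is within a
# distance threshold of the migration target CPU's node.

# Assumed NUMA distance table for a two-node system (local=10, remote=20),
# in the style of an ACPI SLIT matrix.
NODE_DISTANCE = {
    (0, 0): 10, (0, 1): 20,
    (1, 0): 20, (1, 1): 10,
}
DISTANCE_THRESHOLD = 10  # illustrative: only "local enough" migrations pass

def pick_process_to_migrate(processes):
    """Choose the process accessing the largest file cache."""
    return max(processes, key=lambda p: p["file_cache_bytes"])

def should_migrate(proc, target_node):
    """Allow migration only if the process's file-cache node is within
    the distance threshold of the migration target CPU's node."""
    return NODE_DISTANCE[(proc["cache_node"], target_node)] <= DISTANCE_THRESHOLD

# Example: two processes on an overloaded CPU whose caches live on
# different nodes; the migration target CPU sits on node 1.
procs = [
    {"pid": 1, "file_cache_bytes": 4 << 20,  "cache_node": 0},
    {"pid": 2, "file_cache_bytes": 64 << 20, "cache_node": 1},
]
candidate = pick_process_to_migrate(procs)       # pid 2: largest file cache
print(should_migrate(candidate, target_node=1))  # True: its cache is local to node 1
```

The design intent the abstract describes is visible here: the process that depends most on the file cache is the one considered for migration, and the distance check prevents moving it far from the node where that cache resides.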

Description

Technical Field
[0001] The invention relates to the technical field of data scheduling, and in particular to a file-cache-based scheduling method, device and computing device.
Background Art
[0002] In a multi-core system, in order to make better use of multi-CPU parallelism, the scheduler distributes processes evenly across the CPUs. However, as technology has developed, the number of CPUs per machine keeps increasing, so the NUMA (Non-Uniform Memory Access) architecture was introduced for management. As shown in Figure 1, a NUMA system is composed of multiple CPU nodes. The entire memory system can be used as a whole and accessed by any processor, but a processor accesses its local memory node with lower latency and higher bandwidth, while access to remote memory nodes is slower. [0003] In related technologies, during load balancing, the busiest subdomain will be found in the...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F16/172, G06F16/182, G06F9/50
CPC: G06F16/172, G06F16/182, G06F9/505, G06F9/5088
Inventors: 胡翔, 周鹏, 叶中玉, 余昇锦
Owner UNIONTECH SOFTWARE TECH CO LTD