
A Data-intensive Process Scheduling Method for Memory Access

A data-intensive process scheduling technology, applied to multi-program devices and to program startup/switching. It addresses the insufficiently considered problem of memory bus contention, avoids long waits for memory access, makes full use of system resources, and prevents process starvation.

Inactive Publication Date: 2019-01-11
CHONGQING UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information


Problems solved by technology

[0004] Current research work mainly focuses on how to make full use of the memory file system, but the problem of memory bus contention caused by using the memory file system has not been fully considered.



Examples


Embodiment

[0057] In the embodiment, it is assumed that the total system bandwidth is 15, the system has 4 CPUs, and the waiting time slice threshold is 2.

[0058] Table 1 is a list of the processes in a working set.

[0059] Table 1. Bandwidth requirements of each process in each time slice

[0060]

[0061] There are 5 processes in Table 1. The execution time of each process and its bandwidth requirement in each time slice are shown in the table. For example, process 1 needs to execute two time slices: its bandwidth requirement is 8 in the first time slice and 7 in the second. The other processes follow the same pattern.
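As a sketch of the data model implied by the paragraph above, each process carries a per-time-slice bandwidth demand. The class and field names here are my own, not from the patent, and only process 1's demands (8, then 7) are recoverable from this excerpt:

```python
from dataclasses import dataclass


@dataclass
class Process:
    # One process in a working set; names are illustrative,
    # not taken from the patent text.
    pid: int
    bandwidth: list  # bandwidth demand for each remaining time slice
    waited: int = 0  # consecutive time slices spent waiting

    @property
    def remaining(self) -> int:
        # Remaining execution time slices.
        return len(self.bandwidth)


# Process 1 from the text: two time slices, demanding bandwidth 8, then 7.
p1 = Process(pid=1, bandwidth=[8, 7])
```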

[0062] The constructed ready queue and the set priority waiting queue are shown in Table 2.

[0063] Table 2

[0064]

[0065] In Table 2, all processes are sorted from large to small according to the remaining execution time slices. At this time, the waiting time slices of all processes are 0, and the priority waiting queue is empty....
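A minimal sketch of how one time slice might be filled under the embodiment's limits (total bandwidth 15, 4 CPUs): priority waiters are considered first, then the ready queue ordered by remaining execution time slices from large to small, as described for Table 2. The function name, the dict fields, and the greedy admission rule are assumptions of mine; the sample demands are hypothetical, since the real Table 1 values are not in this excerpt:

```python
def select_for_slice(priority, ready, cap=15, cpus=4):
    # Priority waiters go first; the ready queue is ordered by
    # remaining execution time slices, from large to small.
    ordered = priority + sorted(ready, key=lambda p: p["remaining"], reverse=True)
    chosen, used = [], 0
    for p in ordered:
        # Admit a process only if a CPU is free and its bandwidth
        # demand still fits under the total-bandwidth cap.
        if len(chosen) < cpus and used + p["bw"] <= cap:
            chosen.append(p)
            used += p["bw"]
    return chosen, used


# Hypothetical demands for three processes:
ready = [{"pid": 1, "bw": 8, "remaining": 2},
         {"pid": 2, "bw": 7, "remaining": 3},
         {"pid": 3, "bw": 5, "remaining": 1}]
chosen, used = select_for_slice([], ready)
# Processes 2 and 1 fill the bandwidth cap exactly (7 + 8 = 15);
# process 3 must wait for the next time slice.
```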



Abstract

The invention discloses a data-intensive process scheduling method for memory access, comprising the following steps: step 1, construct a ready queue; step 2, construct a scheduling queue; step 3, construct a priority waiting queue; step 4, after a time slice is executed, judge whether each process has finished: if it has, remove it from all queues; otherwise, return it to the ready queue to await the next scheduling; step 5, judge whether all processes have finished: if so, the current working set is complete; otherwise, continue with the next time slice. The method provided by the invention has the following advantages: system management is optimized, system resources are fully utilized, and long waits caused by memory bus contention are avoided.
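The five steps of the abstract can be sketched as a single scheduling loop. This is a hedged reconstruction under assumptions of mine, not the patent's actual algorithm: each process is a dict with "bw" (its per-slice bandwidth demands) and "waited"; the greedy admission rule and the default parameters (bandwidth cap 15, 4 CPUs, waiting threshold 2, taken from the embodiment) are illustrative:

```python
def run_working_set(processes, cap=15, cpus=4, threshold=2):
    # Step 1: ready queue, sorted by remaining time slices, largest first.
    ready = sorted(processes, key=lambda p: len(p["bw"]), reverse=True)
    elapsed = 0
    while ready:  # Step 5: repeat until every process has finished.
        # Step 3: processes that have waited past the threshold get priority.
        priority = [p for p in ready if p["waited"] >= threshold]
        others = [p for p in ready if p["waited"] < threshold]
        # Step 2: build the scheduling queue for this time slice.
        chosen, used = [], 0
        for p in priority + others:
            if len(chosen) < cpus and used + p["bw"][0] <= cap:
                chosen.append(p)
                used += p["bw"][0]
        chosen_ids = {id(p) for p in chosen}
        # Step 4: execute one time slice.
        for p in ready:
            if id(p) in chosen_ids:
                p["bw"].pop(0)    # this slice's demand has been served
                p["waited"] = 0
            else:
                p["waited"] += 1  # still waiting for bandwidth or a CPU
        # Finished processes leave all queues; the rest are re-sorted.
        ready = sorted([p for p in ready if p["bw"]],
                       key=lambda p: len(p["bw"]), reverse=True)
        elapsed += 1
    return elapsed


# Toy working set (not Table 1, whose values are not in this excerpt):
procs = [{"bw": [8, 7], "waited": 0},
         {"bw": [5], "waited": 0},
         {"bw": [3], "waited": 0}]
```

With this toy set, the third process is squeezed out of the first slice (8 + 5 = 13 admitted, 13 + 3 > 15 would not exceed the cap, but 8 + 5 + 3 = 16 does), so the whole set completes in two time slices.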

Description

Technical field

[0001] The invention belongs to the technical field of computer memory access, and in particular relates to a data-intensive process scheduling method.

Background technique

[0002] In high-performance computing and big data applications, moving datasets into memory for high-speed file access is a technical trend. To take full advantage of the storage system, many in-memory file systems and in-memory databases use the memory bus to provide fast file reads and writes. Because these systems are built on non-volatile memory (NVM) or DRAM directly connected to the memory bus, they offer a very large performance improvement over traditional block-device-based data I/O, which is very beneficial for processes that read and write large amounts of data. However, since all file accesses go through the memory bus, when a large number of data-intensive processes read and write to the memory file system at the sa...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/48
Inventors: 沙行勉, 吴林, 诸葛晴风
Owner: CHONGQING UNIV