
Heterogeneous multi-core system-oriented process scheduling method

A process scheduling method for heterogeneous multi-core systems, applicable to multiprogramming and resource allocation, which addresses the problems of slow response and growing total round time and achieves the effect of load balancing.

Status: Inactive | Publication Date: 2007-08-08
ZHEJIANG UNIV


Problems solved by technology

On the other hand, the length of the time slice is fixed, so as the number of processes in the ready queue increases, the total time for one round increases; that is, the response to each individual process slows down.
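As a minimal illustration of this effect (the quantum length and process counts below are assumed values, not taken from the patent), the worst-case wait before a process receives its next time slice grows linearly with the number of ready processes:

```c
#include <stdio.h>

/* Worst-case time a process waits for its next turn under round-robin:
 * every other ready process runs one full quantum first. */
static double worst_case_wait_ms(int ready_processes, double quantum_ms)
{
    return (ready_processes - 1) * quantum_ms;
}

int main(void)
{
    const double quantum_ms = 10.0;   /* fixed time slice (illustrative) */
    for (int n = 2; n <= 64; n *= 2)
        printf("%2d ready processes -> up to %6.1f ms before next slice\n",
               n, worst_case_wait_ms(n, quantum_ms));
    return 0;
}
```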



Embodiment Construction

[0028] 1) Implementation of process allocation strategy

[0029] In the heterogeneous multi-core system, not all processing cores are identical: the system consists of one main general-purpose processing core and several identical auxiliary processing cores. All cores perform data operations in the same way, that is, they access main memory and I/O devices identically. The main processing core is handled separately, while the multiple auxiliary processing cores are treated as a pool of processing cores.

[0030] A dynamic allocation strategy is adopted: the operating system maintains a common ready queue shared by all processing cores, and each ready process carries a flag marking whether it runs on the main processing core or on an auxiliary processing core. When a processing core becomes idle, a ready process is selected from the queue to run on that core. The figure outlines this ready queue model.
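A minimal sketch of such a shared ready queue, assuming a per-process flag and a simple linked list; the type and function names (process, pick_ready) are illustrative and not from the patent, and locking is omitted for brevity:

```c
#include <stddef.h>

/* Which kind of core a ready process is intended to run on. */
typedef enum { CORE_MAIN, CORE_AUX } core_kind;

typedef struct process {
    int             pid;
    core_kind       target;   /* flag: main core or auxiliary core */
    struct process *next;
} process;

/* Single ready queue shared by all cores (synchronization omitted). */
static process *ready_head;

/* When a core of the given kind goes idle, pick the first ready process
 * whose flag matches that kind of core. Returns NULL if none is ready. */
process *pick_ready(core_kind idle_core)
{
    process **pp = &ready_head;
    while (*pp) {
        if ((*pp)->target == idle_core) {
            process *p = *pp;
            *pp = p->next;     /* unlink from the shared queue */
            p->next = NULL;
            return p;
        }
        pp = &(*pp)->next;
    }
    return NULL;
}
```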

[00...



Abstract

The invention discloses a process scheduling method for heterogeneous multi-core systems. All processing cores share a single ready queue, and each ready process carries a flag indicating whether it runs on the main core or on an auxiliary core. The operating system uses a client-manager structure: the operating system kernel runs on the main processing core, while the other cores run the application processes assigned to them. When an application requests an operating system service, the request is forwarded to the service process on the main core. The process scheduler combines a split-load scheduling algorithm with a processor allocation algorithm to achieve reasonable process scheduling.
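A minimal sketch of the request-forwarding idea described above, assuming a simple in-memory mailbox standing in for the real inter-core channel; all names here are illustrative and not from the patent:

```c
#include <stdio.h>

/* A service request raised by a process on an auxiliary core. */
typedef struct {
    int sender_pid;   /* process that needs the OS service     */
    int service_id;   /* e.g. file I/O, memory allocation, ... */
} os_request;

/* One-slot "mailbox" standing in for the real inter-core channel. */
static os_request mailbox;

/* Auxiliary-core side: package the request and forward it. */
static void request_service(int pid, int service_id)
{
    mailbox = (os_request){ .sender_pid = pid, .service_id = service_id };
}

/* Main-core side: the service process picks up and handles the request. */
static void handle_request(void)
{
    printf("main core: servicing request %d from pid %d\n",
           mailbox.service_id, mailbox.sender_pid);
}

int main(void)
{
    request_service(42, 7);   /* application on an auxiliary core */
    handle_request();         /* service process on the main core */
    return 0;
}
```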

Description

Technical field
[0001] The invention relates to the field of computer operating systems, in particular to a process scheduling method for heterogeneous multi-core systems.
Background technique
[0002] In an operating system, process scheduling is responsible for dynamically assigning processors to processes; it is therefore also called processor scheduling or low-level scheduling. The program that implements process scheduling in the operating system is called the process scheduler, or dispatcher.
[0003] There are many processor scheduling strategies; several are introduced below:
[0004] 1. First-come, first-served algorithm
[0005] The first-come, first-served algorithm allocates processors according to the order in which processes enter the ready queue: the process that entered the ready queue first is selected first. Once a running process occupies the processor, it continues to run until it finishes or blocks. This algorithm is easy to implement, ...
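A minimal sketch of the first-come, first-served selection described above, assuming a small fixed-size FIFO ready queue; the names and sizes are illustrative, not from the patent:

```c
#include <stdio.h>

#define MAX_READY 16

/* FIFO ready queue: process IDs are stored in arrival order. */
static int ready[MAX_READY];
static int head, tail;

static void enqueue(int pid) { ready[tail++ % MAX_READY] = pid; }

/* FCFS: always dispatch the process that arrived earliest. */
static int dispatch(void) { return ready[head++ % MAX_READY]; }

int main(void)
{
    enqueue(3); enqueue(7); enqueue(1);       /* arrival order: 3, 7, 1 */
    printf("dispatched: %d\n", dispatch());   /* 3 runs first           */
    printf("dispatched: %d\n", dispatch());   /* then 7                 */
    printf("dispatched: %d\n", dispatch());   /* then 1                 */
    return 0;
}
```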

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50
Inventor: 陈天洲, 黄振宝
Owner: ZHEJIANG UNIV