Scheduling and management of compute tasks with different execution priority levels

A compute-task scheduling and management technology, applied in the field of scheduling and managing compute tasks, that addresses problems such as delays in the execution of compute tasks.

Publication Date: 2013-04-03 (status: Inactive)
NVIDIA CORP
Cites: 5, Cited by: 18

AI Technical Summary

Problems solved by technology

During execution of a compute task, interaction between the driver and the multiple processors is required so that the driver can schedule the compute task, and this interaction may delay execution of the compute task.




Detailed Description of the Embodiments

[0018] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without one or more of these specific details. In other instances, well-known features have not been described in order not to obscure the present invention.

[0019] System Overview

[0020] Figure 1 is a block diagram illustrating a computer system 100 configured to implement one or more aspects of the present invention. Computer system 100 includes a central processing unit (CPU) 102 and system memory 104 communicating via an interconnection path that may include a memory bridge 105. The memory bridge 105 may be, for example, a north bridge chip, connected to an I/O (input/output) bridge 107 via a bus or other communication path 106 (e.g., a HyperTransport link). I/O bridge 107, which may be, for example, a south bridge chip, ...



Abstract

One embodiment of the present invention sets forth a technique for dynamically scheduling and managing compute tasks with different execution priority levels. The scheduling circuitry organizes the compute tasks into groups based on priority level. The compute tasks may then be selected for execution using different scheduling schemes, such as round-robin, priority, and partitioned priority. Each group is maintained as a linked list of pointers to compute tasks that are encoded as queue metadata (QMD) stored in memory. A QMD encapsulates the state needed to execute a compute task. When a task is selected for execution by the scheduling circuitry, its QMD is removed from the group and transferred to a table of active compute tasks. Compute tasks are then selected from the active task table for execution by a streaming multiprocessor.
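As an illustration of the data structures described in the abstract, the following C++ sketch models per-priority linked lists of pointers to queue metadata (QMD) and a table of active compute tasks. It is a minimal sketch under stated assumptions, not NVIDIA's actual hardware or driver implementation; the names QMD, TaskScheduler, addTask, activateNext, and the lower-value-is-higher-priority convention are all hypothetical.

    #include <cstddef>
    #include <cstdint>
    #include <list>
    #include <vector>

    // Queue metadata (QMD): encapsulates the state needed to execute a compute task.
    struct QMD {
        std::uint32_t taskId;
        std::uint32_t priority;   // assumption: lower value = higher priority
        // ... program address, grid dimensions, and other execution state ...
    };

    class TaskScheduler {
    public:
        TaskScheduler(std::size_t numPriorities, std::size_t activeSlots)
            : groups_(numPriorities), active_(activeSlots, nullptr) {}

        // Append a pointer to a QMD onto the linked list for its priority group.
        void addTask(QMD* qmd) { groups_[qmd->priority].push_back(qmd); }

        // Strict-priority scheme: take the head of the highest-priority non-empty
        // group and move it into a free slot of the active compute task table.
        bool activateNext() {
            for (auto& group : groups_) {
                if (group.empty()) continue;
                for (QMD*& slot : active_) {
                    if (slot == nullptr) {
                        slot = group.front();
                        group.pop_front();
                        return true;
                    }
                }
                return false;   // no free slot in the active task table
            }
            return false;       // no pending tasks in any group
        }

    private:
        std::vector<std::list<QMD*>> groups_;  // one linked list per priority level
        std::vector<QMD*> active_;             // table of active compute tasks
    };

Tasks in the active table would then be dispatched to streaming multiprocessors; a round-robin or partitioned-priority variant would change only the order in which the groups are visited.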

Description

Technical Field

[0001] The present invention relates generally to the execution of compute tasks and, more particularly, to the scheduling and management of compute tasks with different priorities.

Background

[0002] Conventional scheduling for executing compute tasks in a multiprocessor system relies on an application or driver to prioritize each compute task. During execution of a compute task, interaction between the driver and the multiple processors is required so that the driver can schedule the compute task, and this interaction may delay execution of the compute task.

[0003] Accordingly, what is needed in the art are systems and methods for dynamically scheduling compute tasks for execution based on the processing resources and priorities of the available compute tasks. Importantly, the scheduling mechanism should not depend on or require software or driver interaction.

Summary of the Invention

[0004] Systems and methods for dynamicall...
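The background above calls for selection that proceeds without driver interaction whenever processing resources become available. The sketch below, again hypothetical and in the same illustrative C++ style, shows a round-robin walk across priority groups (one of the schemes named in the abstract); Task, RoundRobinSelector, enqueue, and selectNext are invented names, and no claim is made that this mirrors the patented circuitry.

    #include <cstddef>
    #include <deque>
    #include <vector>

    struct Task { int id; int priority; };  // illustrative stand-in for a QMD reference

    class RoundRobinSelector {
    public:
        explicit RoundRobinSelector(std::size_t numPriorities)
            : groups_(numPriorities), next_(0) {}

        void enqueue(const Task& t) {
            groups_[static_cast<std::size_t>(t.priority)].push_back(t);
        }

        // Called by the scheduling circuitry itself when a streaming multiprocessor
        // slot becomes free; no round trip to the driver is needed between selections.
        bool selectNext(Task& out) {
            for (std::size_t i = 0; i < groups_.size(); ++i) {
                std::size_t g = (next_ + i) % groups_.size();
                if (!groups_[g].empty()) {
                    out = groups_[g].front();
                    groups_[g].pop_front();
                    next_ = (g + 1) % groups_.size();  // resume the walk after this group
                    return true;
                }
            }
            return false;  // nothing pending
        }

    private:
        std::vector<std::deque<Task>> groups_;  // pending tasks per priority level
        std::size_t next_;                      // next group to visit in the round-robin walk
    };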


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/46, G06F13/18
CPC: G06F9/461, G06F9/4881
Inventors: Timothy John Purcell, Lacky V. Shah, Jerome F. Duluk, Jr.
Owner: NVIDIA CORP