
A fine-grained approach to GPU resource management for mixed workloads

A resource management technology for mixed workloads, applied in the field of fine-grained GPU resource management for mixed workloads, which addresses problems such as poor practicability, low operability, and the inability to guarantee that online tasks obtain needed resources in time.

Inactive Publication Date: 2020-07-10
凯习(北京)信息科技有限公司

AI Technical Summary

Problems solved by technology

GPU preemption-based methods can reduce task waiting time to some extent, but the time overhead of preemption is tied to the execution time of the running GPU kernel.
[0014] In summary, hardware-based methods need to modify the GPU hardware architecture to address the performance problems that arise under mixed workloads, which makes them hard to apply and of poor practicality on existing GPU devices; software-based methods can try to run online tasks with higher priority, but they cannot guarantee that the corresponding resources are obtained in time when an online task needs additional resources.
Therefore, a fine-grained GPU resource management method is needed that effectively controls the use of GPU resources by different types of tasks under mixed workloads, and in particular supports task resource quotas and online resource adjustment, so as to meet quality-of-service requirements.

Method used



Embodiment Construction

[0086] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict with one another.

[0087] The basic idea of the present invention is to design, at the software level, a resource management method for mixed workloads based on resource quotas and reservation on top of the existing MPS mechanism. When online tasks and offline tasks run together on the GPU, in order to ensure that online tasks can be processed in a timely manner, the resource quota mechanism is used to...
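The embodiment text above is truncated, so the fragment below is only a minimal sketch of how a software-level quota could be imposed on top of the stock MPS mechanism the paragraph refers to: each client process is started with the CUDA MPS environment variable CUDA_MPS_ACTIVE_THREAD_PERCENTAGE, which caps its share of SM execution resources. The script names and percentages are hypothetical, and this is not the patent's own implementation.

```python
import os
import subprocess

def launch_with_sm_quota(cmd, active_thread_pct):
    """Launch a GPU task as an MPS client with a capped SM thread share.

    Assumes an nvidia-cuda-mps-control daemon is already running for the
    target GPU; CUDA_MPS_ACTIVE_THREAD_PERCENTAGE is read by the CUDA driver
    when the client process creates its context (Volta-class MPS and later).
    """
    env = dict(os.environ)
    # Cap the portion of SM execution resources this client may occupy.
    env["CUDA_MPS_ACTIVE_THREAD_PERCENTAGE"] = str(active_thread_pct)
    return subprocess.Popen(cmd, env=env)

# Hypothetical mixed-workload launch: the online (latency-critical) task keeps
# the larger share, while the offline (batch) task is confined to a smaller one.
online = launch_with_sm_quota(["python", "online_inference.py"], 70)
offline = launch_with_sm_quota(["python", "offline_batch.py"], 30)
```

Because this environment variable is read when the client creates its CUDA context, a cap set this way cannot be changed without restarting the client; the quota-and-reservation scheme described in the patent is instead aimed at adjusting such shares online while tasks are running.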



Abstract

The invention discloses a fine-grained GPU resource management method for mixed workloads. A capacity-based streaming multiprocessor abstraction, CapSM, is proposed and serves as the basic unit of resource management. When mixed workloads (online tasks and offline tasks) share GPU resources, the use of GPU resources by different types of tasks is managed at fine granularity, with support for task resource allocation and online resource adjustment, so that the quality of service of online tasks is guaranteed even while GPU resources are shared. The resources allocated to a task are determined by the task type, its resource request, and the current GPU resource state of the system. When resources are sufficient, the GPU resource demands of offline tasks can also be satisfied; when GPU resources are insufficient, the resource usage of offline tasks is dynamically adjusted so that the resource demands of online tasks are met first. As a result, when mixed workloads run concurrently, the performance of online tasks is guaranteed while GPU resources remain fully utilized.
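The abstract stops short of a concrete allocation algorithm. As a rough, assumption-laden illustration of the policy it describes, the Python sketch below models CapSM units as a simple pool in which offline tasks only receive quota from free capacity, while online tasks may reclaim quota from offline tasks when free capacity is insufficient; the names (CapSMPool, allocate) are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class CapSMPool:
    """Toy model of GPU capacity expressed in CapSM units (hypothetical)."""
    total: int                                    # total CapSM units on the GPU
    online: dict = field(default_factory=dict)    # task -> units held by online tasks
    offline: dict = field(default_factory=dict)   # task -> units held by offline tasks

    @property
    def free(self):
        return self.total - sum(self.online.values()) - sum(self.offline.values())

    def allocate(self, task, units, is_online):
        if is_online:
            # Reclaim quota from offline tasks when free capacity is insufficient,
            # so the online task's demand is satisfied first.
            deficit = units - self.free
            for t in list(self.offline):
                if deficit <= 0:
                    break
                taken = min(self.offline[t], deficit)
                self.offline[t] -= taken
                deficit -= taken
            if units > self.free:
                raise RuntimeError("not enough CapSM units even after reclaiming")
            self.online[task] = self.online.get(task, 0) + units
            return units
        else:
            # Offline tasks only ever receive quota from currently free capacity.
            granted = min(units, self.free)
            if granted:
                self.offline[task] = self.offline.get(task, 0) + granted
            return granted

pool = CapSMPool(total=16)
pool.allocate("batch-job", 12, is_online=False)   # offline task takes spare capacity
pool.allocate("inference", 8, is_online=True)     # online task reclaims 4 units from it
```

In this toy model the reclaimed quota is simply subtracted from the offline tasks' bookkeeping; the actual method would also have to enforce the adjusted quotas on the running tasks, which this sketch does not attempt to capture.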

Description

Technical Field

[0001] The invention relates to the field of resource management and task scheduling in heterogeneous computing, and in particular to a fine-grained GPU resource management method for mixed workloads.

Background Technique

[0002] The Graphics Processing Unit (hereinafter referred to as GPU) has gradually become an indispensable part of high-performance computing, cloud computing and data centers thanks to its powerful peak computing capability, and the use of GPUs to accelerate key services is being adopted by more and more institutions and organizations. To improve GPU utilization, infrastructure providers usually allow multiple tasks of different types (online tasks and offline tasks) to share GPU resources, that is, they adopt a mixed-workload operating mode. However, when mixed workloads share the GPU, the performance of online tasks suffers severe interference because multiple tasks compete for GPU resources. The fundament...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/50
CPC: G06F9/5022; G06F9/505
Inventor: 杨海龙, 禹超, 白跃彬, 栾钟治, 顾育豪
Owner: 凯习(北京)信息科技有限公司