
Resource sharing method and device

A resource-sharing technology in the fields of resource allocation, multiprogramming devices, and program control design. It addresses the problems of wasted resources, long-term occupation of GPU resources, and the inability to adaptively manage shared or exclusive use of a GPU, with the effect of improving resource-use efficiency.

Pending Publication Date: 2021-03-26
ZTE CORP

AI Technical Summary

Problems solved by technology

However, starting MPS directly in static mode restricts the GPU to shared use only.
In that case, even when no job is using the shared GPU, the MPS server still occupies GPU resources for long periods, wasting them; moreover, shared versus exclusive use of the GPU cannot be managed adaptively.
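The adaptive behaviour contrasted with static mode above can be sketched in a few lines. This is a hypothetical illustration, not code from the patent: a small state machine starts the MPS server only when the first sharing job arrives and stops it when the last one leaves, so an idle GPU is not held by MPS.

```python
class GpuMpsManager:
    """Toy model of adaptive MPS management: the MPS server runs only
    while at least one job shares the GPU (hypothetical sketch)."""

    def __init__(self):
        self.shared_jobs = set()
        self.mps_running = False  # static mode would pin this to True forever

    def add_shared_job(self, job_id):
        if not self.shared_jobs:          # first sharing job arrives:
            self.mps_running = True       # start the MPS server on demand
        self.shared_jobs.add(job_id)

    def remove_job(self, job_id):
        self.shared_jobs.discard(job_id)
        if not self.shared_jobs:          # last sharing job left:
            self.mps_running = False      # release the GPU for exclusive use


mgr = GpuMpsManager()
mgr.add_shared_job("job-a")
mgr.add_shared_job("job-b")
mgr.remove_job("job-a")
assert mgr.mps_running            # job-b still shares the GPU
mgr.remove_job("job-b")
assert not mgr.mps_running        # GPU no longer occupied by MPS
```

In static mode the flag would stay set regardless of the job set, which is exactly the long-term occupation the patent describes.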



Embodiment Construction

[0015] To make the purpose, technical solution, and advantages of this application clearer, its embodiments are described in detail below with reference to the accompanying drawings. It should be noted that, where no conflict arises, the embodiments of this application and the features within them may be combined with one another in any way.

[0016] The steps shown in the flowcharts of the figures may be performed in a computer system, for example as a set of computer-executable instructions. Also, although a logical order is shown in the flowcharts, in some cases the steps may be performed in an order different from the one shown or described here.

[0017] The embodiments of the present application provide a resource sharing method and device. Building on MPS (Multi-Process Service) technology, an MPS Server Pod is managed dynamically to support GPU MPS between Kubernetes con...
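The "dynamic management of MPS Server Pod" described in this paragraph can be read as a reconcile step: from the containers or Pods that request GPU sharing, derive the set of (node, GPU) pairs that need an MPS Server Pod, then create or delete server pods to match. The sketch below is a hypothetical illustration of that idea; the function and parameter names are not from the patent.

```python
def reconcile(shared_gpu_requests, running_mps_servers):
    """Given (node, gpu_id) pairs requested for shared use and the set of
    MPS Server Pods currently running, return the servers to create and
    the idle servers to delete (hypothetical sketch, not the patent's code)."""
    desired = set(shared_gpu_requests)    # GPUs that must offer MPS sharing
    running = set(running_mps_servers)    # GPUs that already have a server
    to_create = desired - running         # sharing requested, no server yet
    to_delete = running - desired         # server idle: free the GPU
    return to_create, to_delete


create, delete = reconcile(
    shared_gpu_requests=[("node1", 0), ("node2", 1)],
    running_mps_servers=[("node1", 0), ("node1", 1)],
)
assert create == {("node2", 1)}   # start a server for the new sharing request
assert delete == {("node1", 1)}   # stop the server nothing is using
```

Deleting servers in `to_delete` is what avoids the long-term GPU occupation that a statically started MPS server would cause.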



Abstract

A resource sharing method is applied to a Kubernetes cluster and comprises the step of starting an MPS Server Pod corresponding to a GPU on any node of the Kubernetes cluster, according to the sharing-use requirement of a container or Pod for that node's GPU. The invention further provides a resource sharing device. With the method and device, the MPS Server Pod can be started dynamically in the Kubernetes cluster so that the GPU can be shared by Kubernetes containers or Pods, improving the use efficiency of cluster resources.
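To make the "MPS Server Pod corresponding to a GPU" concrete, a manifest along the following lines could back such a server. This is a hypothetical illustration only; the patent does not publish a manifest, and the naming, image, and `hostIPC` choice are assumptions based on how NVIDIA MPS is commonly run in containers.

```yaml
# Hypothetical illustration -- not from the patent text.
apiVersion: v1
kind: Pod
metadata:
  name: mps-server-node1-gpu0   # one server per shared GPU (assumed naming)
spec:
  nodeName: node1               # pin to the node whose GPU is shared
  hostIPC: true                 # MPS clients reach the server over IPC
  containers:
  - name: mps-server
    image: nvidia/cuda:12.2.0-base-ubuntu22.04
    command: ["nvidia-cuda-mps-control", "-f"]  # run the MPS daemon in foreground
    resources:
      limits:
        nvidia.com/gpu: 1       # the GPU to be shared via MPS
```

Starting and deleting such a Pod on demand is what lets the cluster switch a GPU between shared and exclusive use.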

Description

Technical field

[0001] The embodiments of the present application relate to, but are not limited to, the field of computer application technology, and in particular to a resource sharing method and device.

Background technique

[0002] Owing to its powerful computing capability, the Graphics Processing Unit (GPU) is widely used in deep learning and high-performance computing, but the requirements for using the GPU vary across scenarios. Taking deep learning as an example: in the training scenario, multiple GPUs need to be aggregated to provide greater computing power and accelerate model training; in inference, however, it is preferable for multiple applications to share the same GPU, so that the computing power a single application leaves unused can be shared. At the same time, with the popularity of container technology, more and more applications use containers and container clouds as tools for orchestrating and scheduling applications and for...

Claims


Application Information

IPC(8): G06F9/50
CPC: G06F9/5027; G06F9/5061; G06F9/50
Inventor: 唐波, 王科文
Owner: ZTE CORP