
Method and apparatus for scheduling GPU to perform batch operation

A technology for batching calculation tasks, applied in the field of GPU parallel computing, which addresses problems such as increased program complexity and difficulty, and achieves the effect of improved memory access performance

Inactive Publication Date: 2016-01-06
成都卫士通信息产业股份有限公司

Problems solved by technology

Even if multiple threads are used to call the GPU, each thread's task causes the GPU to schedule only one GPU CORE to participate in the calculation. Compared with submitting tasks to the GPU in batches so that every GPU core participates, the performance gap can be more than a thousandfold.
[0005] Applications based on the CPU architecture usually use multiple processes or threads to handle multiple tasks. After receiving a task, a thread calls the CPU to perform the calculation immediately, rather than caching tasks and executing them in batches, because on a CPU batching brings no performance improvement while greatly increasing program complexity.
GPU usage is different: computing tasks must be submitted to the GPU in batches in order to exploit its full performance. However, it is quite difficult to transform an existing application into a mode that caches tasks and submits them to the GPU in batches.
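The batching idea above can be illustrated with a minimal sketch. The `Batcher` class and the doubling "computation" are hypothetical stand-ins, not the patent's implementation; the point is only that caching tasks turns many per-task submissions into a few batched ones.

```python
# Hypothetical sketch: cache tasks and submit them in batches, so many
# tasks cost one device submission instead of one submission each.
class Batcher:
    def __init__(self, batch_size):
        self.batch_size = batch_size
        self.pending = []      # tasks cached since the last submission
        self.results = []
        self.submissions = 0   # how many batched "GPU launches" occurred

    def submit(self, task):
        self.pending.append(task)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.pending:
            return
        # In a real system this would be a single GPU kernel launch
        # covering every cached task; doubling is a stand-in computation.
        self.submissions += 1
        self.results.extend(t * 2 for t in self.pending)
        self.pending.clear()

b = Batcher(batch_size=4)
for t in range(8):
    b.submit(t)
print(b.submissions)  # 2 submissions for 8 tasks
```

Eight tasks reach the "device" in two submissions rather than eight, which is the overhead reduction the batching mode targets.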



Detailed Description of the Embodiments

[0028] All features disclosed in this specification, or steps in all methods or processes disclosed, may be combined in any manner, except for mutually exclusive features and / or steps.

[0029] Any feature disclosed in this specification (including any appended claims, abstract and drawings), unless expressly stated otherwise, may be replaced by alternative features which are equivalent or serve a similar purpose. That is, unless expressly stated otherwise, each feature is one example only of a series of equivalent or similar features.

[0030] Implementation steps of the present invention are as follows:

[0031] 1. Using the synchronous mode API:

[0032] a) Deploy the GPUs;

[0033] b) Deploy and run the GPU scheduling module;

[0034] c) The application module calls the synchronous mode API to perform calculation tasks.
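A minimal sketch of what step c) might look like in synchronous mode, assuming a scheduler thread and a per-task reply queue. All names here (`compute_sync`, `task_q`) are illustrative, and adding one stands in for the GPU computation; this is not the patent's actual API.

```python
import queue
import threading

# Hypothetical sketch of a synchronous-mode call: the caller blocks on a
# per-task reply queue until the scheduling module returns the result.
task_q = queue.Queue()

def scheduler():
    while True:
        data, reply = task_q.get()
        reply.put(data + 1)      # stand-in for a GPU computation

def compute_sync(data):
    reply = queue.Queue(maxsize=1)
    task_q.put((data, reply))
    return reply.get()           # block here until the result arrives

threading.Thread(target=scheduler, daemon=True).start()
print(compute_sync(41))  # -> 42
```

The caller's code stays sequential, which is why the synchronous mode requires the least change to an existing application.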

[0035] 2. Using the asynchronous mode API:

[0036] a) Deploy the GPUs;

[0037] b) Deploy and run the GPU scheduling module;

[0038] Transform the applica...
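By contrast, the asynchronous mode might look like the following sketch, in which the caller registers a callback and returns immediately. Again, `compute_async` and the add-one computation are hypothetical stand-ins, not the patent's API.

```python
import queue
import threading

# Hypothetical sketch of an asynchronous-mode call: the caller registers
# a callback and returns immediately; the scheduling module invokes the
# callback once the (simulated) GPU computation completes.
task_q = queue.Queue()
done = threading.Event()
results = []

def scheduler():
    while True:
        data, callback = task_q.get()
        callback(data + 1)            # stand-in for a GPU computation

def compute_async(data, callback):
    task_q.put((data, callback))      # returns without waiting

def on_done(result):
    results.append(result)
    done.set()

threading.Thread(target=scheduler, daemon=True).start()
compute_async(41, on_done)
done.wait(timeout=1)
print(results)  # [42]
```

The caller never blocks on the computation, which is why this mode requires transforming the application's control flow, as the step above begins to describe.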



Abstract

The present invention relates to the field of GPU parallel computing, in particular to a method and an apparatus for scheduling a GPU to perform batch operations. To address the problems in the prior art, the present invention provides such a method and apparatus. A separate GPU scheduling module is designed to coordinate calculation-task processing between the GPU and an API application module, so as to fully exploit the computing capability of the GPU: the API application module sends an operation task to the GPU scheduling module; the GPU scheduling module stores the calculation tasks received within a cycle in a cache; when the GPU completes the previous batch of calculation tasks, the GPU scheduling module submits the cached calculation tasks to the GPU in a batch; and the API application module then completes the subsequent operations of the calculation tasks in either synchronous or asynchronous mode.
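The scheduling module described in the abstract can be sketched as follows. This is an illustrative model only: it simulates the per-cycle caching with a timer, folds the "wait for the previous batch to finish" condition into the sequential loop, and uses doubling as a stand-in GPU computation; all names are hypothetical.

```python
import queue
import threading
import time

# Hypothetical sketch of the scheduling module: calculation tasks arriving
# within one cycle are cached, then submitted to the (simulated) GPU as a
# single batch, and results are handed back to the callers.
task_q = queue.Queue()

def gpu_batch_compute(batch):
    # Stand-in for one batched GPU launch covering every cached task.
    return [(data * 2, reply) for data, reply in batch]

def scheduling_module(cycle=0.05):
    while True:
        deadline = time.monotonic() + cycle
        batch = []                           # the cache for this cycle
        while True:
            remaining = deadline - time.monotonic()
            if remaining <= 0:
                break
            try:
                batch.append(task_q.get(timeout=remaining))
            except queue.Empty:
                break
        if batch:
            for result, reply in gpu_batch_compute(batch):
                reply.put(result)            # hand results back to callers

threading.Thread(target=scheduling_module, daemon=True).start()

replies = [queue.Queue(maxsize=1) for _ in range(4)]
for i, r in enumerate(replies):
    task_q.put((i, r))                       # tasks arrive within one cycle
print([r.get() for r in replies])  # [0, 2, 4, 6]
```

Several independent callers submit tasks as they arrive, yet the device sees them as one batch per cycle — the separation of concerns the abstract claims for the scheduling module.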

Description

Technical field

[0001] The invention relates to the field of GPU parallel computing, in particular to a method and a device for scheduling a GPU to perform batch computing.

Background technique

[0002] A GPU (Graphics Processing Unit) can be understood as a programmable graphics card used for processing graphics and images in a computer. In recent years the GPU has developed beyond graphics and image processing and has also been applied to the field of large-scale parallel computing; GPU parallel computing can improve the performance of suitable algorithms many times over.

[0003] A single GPU usually has hundreds or thousands of COREs (core computing units), far exceeding the number of CPU COREs, which makes GPUs well suited to compute-intensive tasks that can be highly parallelized. Compared with a CPU at the same price, a GPU's core count can be hundreds of times higher, and using...


Application Information

IPC(8): G06F9/50, G06F9/48
Inventor: 吴庆国 (Wu Qingguo)
Owner: 成都卫士通信息产业股份有限公司 (Chengdu Westone Information Industry Inc.)