
A load balancing method and apparatus based on CPU-GPU

A load-distribution and current-load technology in the computer field that addresses problems such as limited GPU memory capacity, unbalanced task distribution between the CPU and the GPU, and the inability to process large data sets, improving system performance and achieving the best overall performance.

Active Publication Date: 2019-01-15
EAST CHINA NORMAL UNIV

AI Technical Summary

Problems solved by technology

[0007] One purpose of this application is to provide a CPU-GPU-based load balancing method and device, to solve the problems in the prior art that the GPU memory capacity is limited, that the processing of large data sets cannot be completed in a single load, and that unbalanced task distribution between the CPU and the GPU leads to insufficient utilization of heterogeneous processor resources.

Detailed Description of the Embodiments

[0055] The application will be described in further detail below in conjunction with the accompanying drawings.

[0056] In a typical configuration of the present application, the terminal, the device serving the network, and the trusted party each include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.

[0057] Memory may include non-permanent storage in computer-readable media, in the form of random access memory (RAM) and/or non-volatile memory such as read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.

[0058] Computer-readable media, including both permanent and non-permanent, removable and non-removable media, can be implemented by any method or technology for the storage of information. Information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random acce...

Abstract

The object of the present application is to provide a CPU-GPU-based load balancing method and apparatus. A pipelined query execution model is constructed on the CPU-GPU heterogeneous database system, so that the CPU-GPU heterogeneous data analysis system can support query analysis in big-data scenarios. The total number of pipelines to be executed is determined; the pipelined query execution model is then started to allocate that number of pipelines to the CPU and the GPU, and the system execution time of every load distribution strategy is calculated from the execution time determined for a single pipeline on the CPU and on the GPU, respectively. Finally, the load distribution strategy with the minimum system execution time is selected as the optimal CPU-GPU allocation strategy. The load balancing strategy of the CPU-GPU heterogeneous data analysis system can thus distribute the pipeline load reasonably across the different processors and make full use of their computing resources, which not only improves system performance but also allows the system to achieve the best overall performance.
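
To make the allocation step concrete, the following sketch enumerates every way of splitting the pipelines between the CPU and the GPU, computes the system execution time of each split, and returns the one with the minimum time. It assumes, for illustration only, that each processor runs its assigned pipelines sequentially while the CPU and GPU work in parallel; the function and variable names (best_allocation, t_cpu, t_gpu) are hypothetical and not taken from the patent.

    # Minimal sketch (Python) of the exhaustive allocation search described in
    # the abstract. Assumes each processor executes its assigned pipelines one
    # after another while the CPU and GPU run concurrently; all names here are
    # illustrative, not from the patent.
    def best_allocation(total_pipelines: int, t_cpu: float, t_gpu: float):
        """Return ((n_cpu, n_gpu), system_time) for the split of pipelines
        between CPU and GPU that minimizes the system execution time."""
        best_time = float("inf")
        best_split = (total_pipelines, 0)
        for n_gpu in range(total_pipelines + 1):
            n_cpu = total_pipelines - n_gpu
            # The system finishes when the slower processor finishes its share.
            system_time = max(n_cpu * t_cpu, n_gpu * t_gpu)
            if system_time < best_time:
                best_time, best_split = system_time, (n_cpu, n_gpu)
        return best_split, best_time

    if __name__ == "__main__":
        # Hypothetical single-pipeline times: 2.0 s on the CPU, 0.5 s on the GPU.
        (n_cpu, n_gpu), t = best_allocation(20, t_cpu=2.0, t_gpu=0.5)
        print(f"CPU: {n_cpu} pipelines, GPU: {n_gpu} pipelines, time: {t:.1f} s")

With these hypothetical times and 20 pipelines, the search assigns 4 pipelines to the CPU and 16 to the GPU, so both sides finish in about 8 seconds.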

Description

Technical Field

[0001] The present application relates to the computer field, and in particular to a CPU-GPU-based load balancing method and device.

Background Art

[0002] The general-purpose graphics processing unit (GPU) is widely used in many fields such as matrix computation and machine learning. In recent years, the demand for data-intensive applications has grown rapidly, which has driven the development of GPU-based heterogeneous online analysis and processing platforms. Because the GPU has many computing units that can run a large number of threads simultaneously, analysis systems that use the GPU as the main processor for data processing outperform traditional CPU-based analysis systems in most cases, with execution times reduced by several orders of magnitude.

[0003] In a traditional relational query analysis system, when a client sends a query request, the system creates a new analysis job, p...

Claims

Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F9/50; G06T1/20
CPC: G06F9/505; G06T1/20
Inventors: 翁楚良, 孙婷婷, 黄皓, 王嘉伦
Owner: EAST CHINA NORMAL UNIV