
Big data engine prototype based on multicore heterogeneous CPU-GPU-FPGA (central processing unit, graphics processing unit and field-programmable gate array)

A big data technology based on CPU-GPU-FPGA, applied in electrical digital data processing, combinations of digital computers, and digital computer components. It addresses the problem that dedicated computing units, despite high performance-to-chip-area and performance-to-power ratios, have low operating frequencies and remain underutilized.

Inactive Publication Date: 2017-10-13
SHANGHAI DATACENT SCI CO LTD

AI Technical Summary

Problems solved by technology

[0006] Although dedicated computing units such as GPUs have low operating frequencies, they offer more cores and parallel computing capability, and their overall performance-to-chip-area and performance-to-power ratios are high; nevertheless, they are far from being fully utilized.




Embodiment Construction

[0018] This big data engine prototype based on multi-core heterogeneous CPU-GPU-FPGA is constructed as follows:

[0019] 1) The architecture has an independent parallel CPU and a parallel GPU, each with its own magnetic random-access memory (MRAM) subsystem; the two can access each other's MRAM, and both are located outside the FPGA structure;

[0020] 2) The GPU is connected to the chipset through the I/O bus, and is then connected to the CPU through an I/O bridge;

[0021] 3) The CPU consists of an ALU, a register file, a smart flash cache, and a bus interface;

[0022] 4) System porting is performed to support the multi-core heterogeneous CPU-GPU-FPGA big data engine architecture, forming the big data engine prototype.



Abstract

The invention discloses a big data engine prototype based on a multicore heterogeneous CPU-GPU-FPGA (central processing unit, graphics processing unit and field-programmable gate array) architecture. The architecture comprises an independent parallel CPU and a parallel GPU, each with its own magnetic random-access memory (MRAM) subsystem; the two can access each other's MRAM and are positioned outside the FPGA structure. The GPU is connected to a chipset through an I/O (input/output) bus and is connected to the CPU through an I/O bridge. The CPU is composed of an ALU (arithmetic logic unit), a register file, an intelligent flash cache, and a bus interface. System porting is performed to support the multicore heterogeneous CPU-GPU-FPGA big data engine architecture, forming the big data engine prototype.

Description

Technical field

[0001] The invention relates to a prototype of a big data engine based on multi-core heterogeneous CPU-GPU-FPGA.

Background technique

[0002] As cloud computing and virtualization exhibit the characteristics of "large scale", "high density", "high energy consumption" and "complexity", the construction and development of a new generation of data centers and the improvement of data center infrastructure management become increasingly important; integrated infrastructure management and intelligence will become a new trend in data center development.

[0003] The ultra-large data center provides the entire application service chain, from infrastructure through subsequent data analysis, screening, and application. Beyond data analysis, this includes cloud computing dedicated to smart manufacturing, which differs from the generalized services provided by public clouds, and supercomputing, which puts forward higher requirements for...

Claims


Application Information

Patent Timeline: no application data available
Patent Type & Authority: Application (China)
IPC(8): G06F15/167, G06F15/17
CPC: G06F15/167, G06F15/17
Inventor: 张军 (Zhang Jun), 徐苛 (Xu Ke), 陈晓峰 (Chen Xiaofeng)
Owner: SHANGHAI DATACENT SCI CO LTD