
An FPGA for large-capacity data and an OpenCL-based FPGA algorithm

A data-computing technology, applied in the field of data computing, which solves problems such as high time-resource consumption, limited algorithm performance, and complex DDR hardware, and achieves the effect of algorithm acceleration.

Active Publication Date: 2021-02-02
FASII INFORMATION TECH SHANGHAI

AI Technical Summary

Problems solved by technology

In this prior art, because DDR is used as the cache, storage resources are limited and the DDR hardware is relatively complex; increasing the number of DDRs places higher hardware requirements on the FPGA. In addition, each time the host sends or receives data, the PCIe interface link consumes considerable time, which reduces the utilization of the kernel algorithm module per unit time and limits the performance of the algorithm.
For algorithms that require large capacity, such as neural network algorithms and image processing algorithms, the above technology cannot perform algorithmic calculations quickly.
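The utilization problem described above can be illustrated with a toy model, sketched in Python. The function names and all latency values are hypothetical illustration choices, not figures from the patent:

```python
# Toy model of the prior-art OpenCL flow: every batch crosses the PCIe
# link twice (host -> DDR with the input, DDR -> host with the result),
# so link time grows with the number of batches and the kernel idles
# while transfers are in flight. All numbers are hypothetical.

def prior_art_time(num_batches, pcie_per_batch_s, compute_per_batch_s):
    """Total wall time when each batch is sent and fetched over PCIe."""
    # Two PCIe crossings per batch: write the input, read the result.
    return num_batches * (2 * pcie_per_batch_s + compute_per_batch_s)

def kernel_utilization(num_batches, pcie_per_batch_s, compute_per_batch_s):
    """Fraction of wall time the kernel algorithm module spends computing."""
    total = prior_art_time(num_batches, pcie_per_batch_s, compute_per_batch_s)
    return (num_batches * compute_per_batch_s) / total
```

With a 4 ms link cost and 1 ms of compute per batch, the kernel is busy only 1/9 of the time, regardless of how many batches are processed; this is the "reduced utilization rate per unit time" the passage refers to.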




Embodiment Construction

[0039] As shown in Figure 1 and Figure 2, an FPGA for large-capacity data includes an FPGA controller, a PCIe interface for command communication with the FPGA controller, a Flash controller, a DDR controller, and an algorithm module; it also includes a Flash memory controlled by the Flash controller and a DDR memory controlled by the DDR controller. The Flash controller is in command communication with the DDR controller, and the DDR controller is in command communication with the algorithm module. Data is transmitted between the PCIe interface and the Flash controller, between the Flash controller and the DDR controller, and between the DDR controller and the algorithm module.
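The data path in this paragraph (PCIe → Flash → DDR → algorithm module) can be sketched as a minimal simulation. The class and method names below are my own illustration, not identifiers from the patent, and the model ignores command channels and timing:

```python
# Minimal sketch of the embodiment's data path: the host pushes the
# whole workload over PCIe into Flash once; batches are then staged
# from Flash into DDR on the card and consumed by the algorithm module.

class FpgaCard:
    def __init__(self):
        self.flash = []   # bulk store behind the Flash controller
        self.ddr = None   # staging buffer behind the DDR controller

    def pcie_write(self, batches):
        """Host -> PCIe -> Flash controller: load the full workload once."""
        self.flash.extend(batches)

    def stage_next(self):
        """Flash controller -> DDR controller: stage one batch into DDR."""
        self.ddr = self.flash.pop(0) if self.flash else None
        return self.ddr is not None

    def run_kernel(self, op):
        """DDR controller -> algorithm module: compute on the staged batch."""
        return op(self.ddr)
```

A usage pass over two batches moves data across PCIe only once; the per-batch staging happens entirely on the card, which is the mechanism the embodiment relies on for acceleration.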

[0040] In this embodiment, the Flash controller 2 includes a Flash array group A controller and a Flash array group B controller. There are 96 Flash memories 5, of which 48 Flash memories 5 are connected to 12 of the Flash array group A controller's F...
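The paragraph is truncated after "12", so the exact wiring is not recoverable from the text. Assuming the natural reading that each group controller drives 12 channels serving its 48 chips, the arithmetic works out to 4 chips per channel; the round-robin assignment below is purely an illustrative assumption, not the patent's scheme:

```python
# Channel-mapping sketch for one Flash array group in [0040].
# Assumption (the source text is truncated): 48 chips per group,
# 12 channels per group controller, chips assigned round-robin.

def map_flash_to_channels(num_flash=48, num_channels=12):
    """Assign one group's Flash chips to channels round-robin."""
    channels = {ch: [] for ch in range(num_channels)}
    for chip in range(num_flash):
        channels[chip % num_channels].append(chip)
    return channels
```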



Abstract

The invention provides an FPGA for large-capacity data and an OpenCL-based FPGA algorithm, belonging to the technical field of data computing. The FPGA includes an FPGA controller, a PCIe interface for command communication with the FPGA controller, a Flash controller, a DDR controller, and an algorithm module; it also includes a Flash memory controlled by the Flash controller and a DDR memory controlled by the DDR controller. The Flash controller is in command communication with the DDR controller, and the DDR controller is in command communication with the algorithm module; data is transmitted between the PCIe interface and the Flash controller, between the Flash controller and the DDR controller, and between the DDR controller and the algorithm module. By adding the Flash controller and Flash memory to the original FPGA, the method lets the host move a large amount of data to be calculated to the Flash memory through the PCIe interface in one pass, and then move each piece of data to be calculated to the DDR memory, avoiding moving data through the PCIe interface multiple times and thereby accelerating the algorithm.
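The claimed acceleration reduces to a count of PCIe link crossings. A back-of-envelope sketch (my own illustration; it assumes results also return in a single bulk transfer, which the abstract does not state explicitly):

```python
# PCIe link crossings for N batches under the two architectures.

def pcie_crossings_prior(num_batches):
    # DDR-only design: each batch is written and read back over PCIe.
    return 2 * num_batches

def pcie_crossings_flash(num_batches):
    # Flash-backed design: one bulk write to Flash, one bulk read back;
    # per-batch movement (Flash -> DDR -> algorithm module) stays on card.
    return 2
```

For 100 batches the prior art crosses the link 200 times versus 2 here, which is the "avoiding moving data through the PCIe interface multiple times" effect the abstract claims.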

Description

Technical field
[0001] The invention belongs to the technical field of data computing, and in particular relates to an FPGA for large-capacity data and an OpenCL-based FPGA algorithm.
Background technique
[0002] In the current OpenCL-based algorithm architecture, the host transfers data to the DDR external to the FPGA through the PCIe interface. The kernel algorithm module fetches the data from the DDR, performs the algorithm logic operation, and sends the calculated data back to the DDR; the host then takes the calculated data out of the DDR through the PCIe interface. In this prior art, because DDR is used as the cache, storage resources are limited and the DDR hardware is relatively complex; increasing the number of DDRs places higher hardware requirements on the FPGA. In addition, each time the host sends or receives data, the PCIe interface link consumes considerable time, which reduces the utilization rate of the kernel algorithm modu...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F15/78
CPC: G06F15/7842
Inventor: 杨威锋, 云飞龙
Owner: FASII INFORMATION TECH SHANGHAI