Bus arrays for reduced storage overhead

A storage-overhead-reduction array technology, applicable to instruments, biological neural network models, computation, and the like, which can solve problems such as wasted area, cycles, and timing and wasted capacity, achieving the effects of reducing memory capacity, reducing storage power consumption, and removing data redundancy.

Active Publication Date: 2021-06-18
BEIHANG UNIV
Cites: 6 | Cited by: 0

AI Technical Summary

Problems solved by technology

However, when the number of computing units and memories is large, the bandwidth requirement is high, and the memory access pattern is relatively regular, a bus-based connection scheme wastes a certain amount of area, cycles, and timing.
In addition, for various reasons, different memories often hold multiple copies of the same data segment, resulting in wasted capacity.
At the same time, during chip operation the same data must be written into multiple memories and read repeatedly from them, which causes additional power consumption overhead.

Method used



Examples


Embodiment Construction

[0032] In order to make the purpose, technical solution, and advantages of the present invention clearer, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.

[0033] To address the problem of extra storage overhead caused by storing the same data in multiple independent memories, the present invention proposes a bus array for reducing storage overhead.

[0034] Figure 1 shows a schematic diagram of a CE according to one embodiment of the invention. Each CE receives data input from outside the array through its C port, exchanges data with adjacent CEs through its A and B ports, and outputs data to the outside of the array through its D port. Figure 2 shows a schematic diagram of a CE array according to one embodiment of the present invention.
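The port behavior described in [0034] can be made concrete with a minimal, hypothetical Python sketch. Only the roles of ports A, B, C, and D come from the description above; the class name, buffer structure, and forwarding policy below are illustrative assumptions, not the patented circuit.

```python
from collections import deque

class CE:
    """Illustrative model of a collective element (CE): port C takes data from
    outside the array, ports A/B exchange data with adjacent CEs, and port D
    drives data out of the array (policy details are assumed, not specified)."""

    def __init__(self):
        self.buffer = deque()    # internal cache for temporarily held data
        self.neighbor = None     # adjacent CE reached through port A or B

    def recv_c(self, data):
        # Port C: data arriving from outside the array
        self.buffer.append(data)

    def recv_ab(self, data):
        # Port A/B: data handed over by an adjacent CE
        self.buffer.append(data)

    def step(self):
        # One cycle (assumed policy): pass the oldest buffered word to the
        # neighbor via port A/B and drive the same word out on port D.
        if not self.buffer:
            return None
        word = self.buffer.popleft()
        if self.neighbor is not None:
            self.neighbor.recv_ab(word)
        return word  # value appearing on port D this cycle
```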

[0035] Figure 3 shows a schematic diagram of a typical data transmission requirement handled by the present invention, where x0, x1, x2, and x3 represent fo...



Abstract

The present invention provides a collective element (CE) array for reducing storage overhead, comprising a plurality of CEs. Each CE contains a local memory; each CE can receive data from outside the array or from adjacent CEs; and each CE can temporarily store the received data in its internal cache or output it out of the array. The invention broadcasts the data held in one memory to multiple output ports through data transfer inside the CE array, thereby effectively removing data redundancy between different memories. While reducing the memory capacity requirement, it also reduces the additional power consumption caused by reading the same data from different memories.
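As a minimal sketch of the broadcast idea in this abstract (illustrative only; the function below and its shift-register-style forwarding policy are assumptions, not the patented design), data held once in a single source memory can be shifted CE to CE so that every CE's output port observes the same stream without any per-memory duplicate copies:

```python
def broadcast_through_chain(source_memory, num_ces):
    """Shift each word from one source memory through a chain of CEs and
    record the sequence observed at every CE's output (D) port."""
    pipeline = [None] * num_ces                      # word currently held in each CE
    outputs = [[] for _ in range(num_ces)]           # words emitted by each D port
    stream = list(source_memory) + [None] * num_ces  # extra cycles to drain the chain

    for word in stream:
        # Hand each held word to the next CE (port A/B transfer), last CE first.
        for i in reversed(range(1, num_ces)):
            pipeline[i] = pipeline[i - 1]
        pipeline[0] = word                           # port C: new word enters the array
        for i, held in enumerate(pipeline):
            if held is not None:
                outputs[i].append(held)              # port D: word leaves the array
    return outputs

# Each CE observes the full stream from the single source memory,
# instead of reading its own duplicated copy from a separate memory.
outs = broadcast_through_chain(["x0", "x1", "x2", "x3"], num_ces=3)
assert all(o == ["x0", "x1", "x2", "x3"] for o in outs)
```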

Description

Technical Field

[0001] The invention relates to the technical field of computer architecture, and in particular to a collective element array for reducing storage overhead.

Background Technique

[0002] In recent years, deep learning has achieved increasingly remarkable results in fields such as image recognition and speech processing. However, as network depth continues to grow, the computing power and memory access bandwidth required for training and inference of deep neural networks are becoming difficult for traditional computing platforms to satisfy. Therefore, various domain-specific architectures for neural networks have been proposed by industry and academia to meet this requirement. Among them, the systolic array architecture has attracted great attention from industry and academia due to its high concurrency and low bandwidth requirements. Based on this, the inve...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F1/3234; G06N3/063
CPC: G06F1/3275; G06N3/063
Inventors: 杨建磊, 赵巍胜, 付文智
Owner: BEIHANG UNIV