
Apparatus and method for matrix multiplication using in-memory processing

A matrix-multiplication and memory-array technology in the field of in-memory processing, addressing problems such as heavy communication traffic between memory and compute units and the resulting loss of system energy efficiency.

Pending Publication Date: 2022-03-08
北京苹芯科技有限公司

AI Technical Summary

Problems solved by technology

However, data generated by deep convolutional neural networks (DCNNs) in conventional von Neumann architectures leads to heavy communication between memory and compute units and adversely affects the energy efficiency of these systems.




Detailed Description of Embodiments

[0041] While configurations and arrangements of the present invention have been discussed, it should be understood that this discussion is for illustration purposes only. It will be appreciated by those skilled in the art that other configurations and arrangements may be used without departing from the spirit and scope of the present disclosure. It will be apparent to those skilled in the art that the present invention can be used in a variety of other applications as well.

[0042] It should be noted that references to "one embodiment", "an embodiment", "exemplary embodiment", "some embodiments" and the like in the description of the present invention mean that the described embodiments may include specific features, structures or characteristics, but not every embodiment necessarily includes that particular feature, structure or characteristic. Moreover, such expressions do not necessarily refer to the same embodiment. Furthermore, when a particular feature, structure or c...



Abstract

Embodiments of an apparatus and method for matrix multiplication using in-memory processing (PIM) are disclosed herein. In one example, an apparatus for matrix multiplication includes an array of PIM blocks in the form of rows and columns, a controller, and an accumulator. Each PIM block is configured to be in a compute mode or a memory mode. The controller is configured to divide the PIM block array into: a first set of PIM blocks, each PIM block configured to be in a memory mode; and a second set of PIM blocks, each PIM block configured to be in a compute mode. The first set of PIM blocks is configured to store a first matrix, and the second set of PIM blocks is configured to store a second matrix and calculate a partial sum of a third matrix based on the first and second matrices. The accumulator is configured to output the third matrix based on the partial sum of the third matrix.
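The abstract's scheme, stripped of circuit detail, is a block-tiled matrix multiplication in which one group of blocks holds tiles of the first matrix, another group holds tiles of the second matrix and produces partial sums, and an accumulator combines the partial sums into the output. The following is a minimal numerical sketch of that tiling and accumulation pattern; all names and the block size are illustrative, not taken from the patent.

```python
import numpy as np

def pim_matmul(A, B, block=4):
    """Block-tiled C = A @ B, mirroring the abstract's division of
    labor: tiles of A play the role of memory-mode PIM blocks, tiles
    of B the compute-mode blocks that emit partial sums, and C the
    accumulator that sums them. Illustrative sketch only."""
    m, k = A.shape
    k2, n = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((m, n))  # accumulator output (the "third matrix")
    for i in range(0, m, block):
        for j in range(0, n, block):
            for p in range(0, k, block):
                # each tile product is one partial sum of C's (i, j) tile
                C[i:i + block, j:j + block] += (
                    A[i:i + block, p:p + block] @ B[p:p + block, j:j + block]
                )
    return C
```

Accumulating tile products in place is what lets the partial sums be produced independently, block by block, before the full result exists anywhere.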

Description

Technical field

[0001] Embodiments of the present disclosure relate to processing-in-memory (PIM).

Background technique

[0002] Ultra-low-power machine learning processors are crucial for performing cognitive tasks in embedded systems because the power budget is limited, for example in the case of batteries or energy-harvesting sources. However, data generated by deep convolutional neural networks (DCNNs) in conventional von Neumann architectures leads to heavy communication between memory and computing units and adversely affects the energy efficiency of these systems. As a promising solution to speed up DCNN execution, non-volatile PIM based on resistive random access memory (ReRAM) has emerged. The high cell density of ReRAM allows the realization of large on-chip ReRAM arrays to store the parameters of DCNNs, while suitable functions such as vector-matrix multiplication (VMM) can be performed directly in the ReRAM array and its peripheral circuits. Contents of the in...
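The VMM operation the background describes follows from two textbook circuit laws: each ReRAM cell passes a current equal to its row voltage times its programmed conductance (Ohm's law), and currents on a shared column wire sum (Kirchhoff's current law), so the column current vector equals the voltage vector times the conductance matrix. A short idealized model of this, not the patent's circuit:

```python
import numpy as np

def crossbar_vmm(voltages, conductances):
    """Idealized ReRAM crossbar vector-matrix multiply.
    Per-cell current i = v * g (Ohm's law); column currents add
    (Kirchhoff's current law), so the output is v @ G.
    Textbook idealization: no wire resistance or device noise."""
    return voltages @ conductances

v = np.array([0.1, 0.2, 0.3])   # input voltages on the rows
G = np.array([[1.0, 2.0],
              [0.5, 1.5],
              [2.0, 0.0]])      # programmed cell conductances (siemens)
i = crossbar_vmm(v, G)          # column output currents, ~[0.8, 0.5]
```

This is why storing DCNN weights as conductances makes the multiply "free": the physics of the array performs every multiply-accumulate in parallel, and only the column currents need digitizing.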

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G11C8/12
CPC: G06F17/16; G06N3/063; Y02D10/00; G06N3/045; G06F9/5061; G06F9/5027
Inventor 郑琪霖
Owner 北京苹芯科技有限公司