
An in-memory computing device and method based on ping-pong structure

A computing device and method based on a ping-pong structure, applied in the field of in-memory computing devices based on the ping-pong structure. It addresses the problem that existing ping-pong in-memory computing macros achieve only limited performance improvement because an adapted data scheduling method is not considered, and achieves improved optimization efficiency, improved hardware utilization efficiency, and highly energy-efficient computation.

Active Publication Date: 2022-08-02
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

However, although this solution provides in-memory computing macros with ping-pong functionality, it does not consider the adapted data scheduling method or the specific device design needed when such computing macros are applied in digital systems, so the performance improvement of ping-pong in-memory computing macros remains limited.



Examples


Embodiment 1

[0068] An in-memory computing method of the above-mentioned in-memory computing device based on a ping-pong structure, comprising the following steps:

[0069] Step 1. According to the requirements of the application scenario, set the specific parameters of the in-memory computing macro, specifically: the number of rows of the macro row = 32, the number of columns col = 256, and the number of macros num_macro = 4;

[0070] Step 2. Reconstruct the form of the data to be computed. The data to be computed is the convolution operation of the 10th layer of the VGG-16 neural network with batch number bn = 1; the convolution operation is reconstructed as follows:

[0071] Expand the neural network convolution operation into a matrix multiplication operation. For the 10th layer of the VGG-16 network, the input feature map size is (1, 28, 28, 512) and the convolution kernel size is (3, 3, 512, 512); the convolution operation of the input feature map with batch number bn of...
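
The reconstruction in Step 2 can be pictured with a short sketch. The following is a minimal illustration, assuming a standard im2col unfolding with stride 1 and padding 1 (which keeps the 28x28 feature map size) and a simple ceiling-division tiling of the resulting weight matrix onto the 32-row, 256-column, 4-macro geometry set in Step 1; the function and variable names are illustrative and do not come from the patent.

# Hedged sketch: expand the layer-10 convolution into a matrix multiplication
# (im2col) and count the weight tiles needed for 32x256 macros. Shapes follow
# the embodiment; the mapping itself is an assumption for illustration.
import numpy as np

def im2col(x, kh, kw, stride=1, pad=1):
    """Unfold (N, H, W, C) input patches into the rows of a matrix."""
    n, h, w, c = x.shape
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad), (0, 0)))
    oh = (h + 2 * pad - kh) // stride + 1
    ow = (w + 2 * pad - kw) // stride + 1
    cols = np.empty((n * oh * ow, kh * kw * c), dtype=x.dtype)
    idx = 0
    for b in range(n):
        for i in range(oh):
            for j in range(ow):
                patch = xp[b, i * stride:i * stride + kh, j * stride:j * stride + kw, :]
                cols[idx] = patch.reshape(-1)
                idx += 1
    return cols, oh, ow

x = np.random.randn(1, 28, 28, 512).astype(np.float32)   # input feature map, bn = 1
w = np.random.randn(3, 3, 512, 512).astype(np.float32)   # convolution kernel

a, oh, ow = im2col(x, 3, 3)           # (784, 4608) activation matrix
b = w.reshape(-1, 512)                # (4608, 512) weight matrix
y = (a @ b).reshape(1, oh, ow, 512)   # matrix-multiply form of the convolution

ROW, COL, NUM_MACRO = 32, 256, 4      # macro parameters from Step 1
row_tiles = -(-b.shape[0] // ROW)     # ceil(4608 / 32) = 144
col_tiles = -(-b.shape[1] // COL)     # ceil(512 / 256) = 2
print(row_tiles * col_tiles, "weight tiles for", NUM_MACRO, "macros")

Under these assumptions the 4608 x 512 weight matrix splits into 144 x 2 = 288 tiles of at most 32 x 256, which are then distributed across the four macros by whatever scheduling the device applies.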



Abstract

The purpose of the present invention is to provide an in-memory computing device and method based on a ping-pong structure, belonging to the technical field of in-memory computing. Built around an in-memory computing macro with a ping-pong structure, the device comprises appropriately designed building modules, including an input data cache module, an in-memory computing core module, a top-level control module, a memory control module, and a configuration module. A matching data scheduling scheme is used to complete the processing and updating of data by the device.
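
The data scheduling the abstract refers to follows a ping-pong (double-buffered) pattern. As a rough, non-authoritative sketch of the general technique, assume each macro holds two weight banks that alternate between a "compute" role and an "update" role, so that loading the next tile overlaps with computation on the current one; the class and function names below are hypothetical and not taken from the patent.

# Minimal sketch of ping-pong (double-buffered) scheduling: while the active
# bank of a macro computes with the current weight tile, the idle bank is
# loaded with the next tile, and the roles swap each iteration.
class PingPongMacro:
    def __init__(self):
        self.banks = [None, None]   # two weight banks inside one macro
        self.active = 0             # index of the bank currently computing

    def load(self, tile):
        """Write the next weight tile into the idle bank."""
        self.banks[1 - self.active] = tile

    def compute(self, x):
        """Multiply-accumulate the input vector against the active bank (per column)."""
        bank = self.banks[self.active]
        return [sum(x[i] * bank[i][j] for i in range(len(bank))) for j in range(len(bank[0]))]

    def swap(self):
        """Exchange compute/update roles so loading overlaps with computing."""
        self.active = 1 - self.active

def run(macro, tiles, x):
    macro.banks[macro.active] = tiles[0]      # preload the first tile
    results = []
    for nxt in tiles[1:] + [None]:
        if nxt is not None:
            macro.load(nxt)                   # update the idle bank (concurrent in hardware)
        results.append(macro.compute(x))      # compute with the active bank
        macro.swap()
    return results

In a real device the load and compute paths run concurrently in hardware; the sequential loop above only shows the ordering of the two roles.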

Description

Technical field

[0001] The invention belongs to the technical field of in-memory computing, and in particular relates to an in-memory computing device and method based on a ping-pong structure.

Background technique

[0002] With the development of neural network algorithms, increasingly complex network structures and huge parameter counts have posed great challenges to processors in terms of computing power and power consumption. As a potential solution, in-memory computing technology has attracted widespread attention in recent years. Computing in memory (CIM) is an emerging non-von Neumann architecture technology that reduces data movement by building storage units with computing capability and performing large-scale multiply-accumulate operations in one shot, enabling hardware solutions that combine low energy per operation with large computing power.

[0003] However, limited by the current hardware scale and fabrication process, the in-memory ...
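
To make the CIM principle in paragraph [0002] concrete, here is a minimal behavioral model of a single macro, assuming the 32 x 256 array size used later in Embodiment 1; it only illustrates the idea of storage cells that also compute a whole multiply-accumulate in one operation, and is not the patent's circuit.

# Behavioral sketch of one CIM macro: weights live in a row x col memory
# array, and one operation produces all column-wise multiply-accumulate
# results at once instead of streaming weights to a separate ALU.
import numpy as np

ROW, COL = 32, 256

class CIMMacro:
    def __init__(self):
        self.array = np.zeros((ROW, COL))    # storage cells that also compute

    def write(self, weights):
        """Load a 32x256 weight tile into the array."""
        self.array[:] = weights

    def mac(self, x):
        """One-shot multiply-accumulate: 32 inputs -> 256 partial sums."""
        return x @ self.array

macro = CIMMacro()
macro.write(np.random.randn(ROW, COL))
partial_sums = macro.mac(np.random.randn(ROW))   # 256 results in a single step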

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F15/76; G06F5/06; G06N3/04; G06N3/063; G06N3/08
CPC: G06F15/76; G06F5/065; G06N3/063; G06N3/08; G06F2015/765; G06F2015/766; G06N3/045
Inventor: 常亮 (Chang Liang), 李成龙 (Li Chenglong), 赵鑫 (Zhao Xin), 周军 (Zhou Jun)
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA