
A performance improvement method for stacked DRAM cache

A caching and performance technology, applied in the field of performance improvement for stacked DRAM caches. It addresses problems such as high latency, high energy consumption, and cache misses, and achieves the effects of reducing row misses, improving the data hit rate, and improving overall performance.

Active Publication Date: 2022-05-06
ZHEJIANG GONGSHANG UNIVERSITY

AI Technical Summary

Problems solved by technology

If the target memory row is not in the row buffer, a high-energy, high-latency row miss occurs. Moreover, even if the target memory row is in the row buffer, a cache miss may still occur, resulting in additional delay and energy consumption.

Method used



Examples


Embodiment Construction

[0030] The present invention is described in further detail below in conjunction with the accompanying drawings and examples. The following examples explain the present invention; the invention is not limited to them.

[0031] The method for improving the performance of the stacked DRAM cache in this embodiment includes the following steps:

[0032] S1. A row buffer manager is provided. It contains a row status table made up of a plurality of row status entries; each row status entry includes an activation bit, a memory block number, a memory row number, a tag value sequence, a tag value fill bit, a waiting-request count, and a last-access bit, which together describe the state of one memory row of data.

[0033] The activation bit indicates whether the memory row is loaded into the row buffer: if the activation bit is 1, the memory row is currently loaded into the row buffer; if the activation bit is 0, the memory row is not currently loaded...
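
The field list in step S1 maps naturally onto a small data structure. The following is a minimal sketch in C of one row status entry and the row status table; the field widths and the constants TAGS_PER_ROW and ROW_STATUS_ENTRIES are illustrative assumptions, not values specified by the patent.

```c
#include <stdint.h>
#include <stdbool.h>

#define TAGS_PER_ROW       32   /* assumed: cache blocks (tags) tracked per DRAM row */
#define ROW_STATUS_ENTRIES 64   /* assumed: number of memory rows the table tracks   */

typedef struct {
    bool     activation_bit;           /* 1: row currently loaded in the row buffer */
    uint32_t memory_block_number;      /* memory block the row belongs to           */
    uint32_t memory_row_number;        /* row index within that block               */
    uint16_t tag_values[TAGS_PER_ROW]; /* tag value sequence for blocks in the row  */
    bool     tag_fill_bit;             /* 1: tag value sequence has been filled     */
    uint8_t  waiting_requests;         /* number of requests waiting on this row    */
    bool     last_access_bit;          /* marks the most recently accessed entry    */
} row_status_entry;

typedef struct {
    row_status_entry entries[ROW_STATUS_ENTRIES];
} row_status_table;
```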



Abstract

The invention provides a method for improving the performance of a stacked DRAM cache and proposes a row buffer manager that helps shorten the access latency of the stacked DRAM cache and accelerates data access, thereby improving performance. The method includes the following steps: S1, a row buffer manager is provided, which contains a row status table; the row status table includes a plurality of row status entries, and each row status entry includes an activation bit, a tag value sequence, a tag value fill bit, and a waiting-request count, which describe the state of one memory row of data. S2, the row buffer manager is connected to the data access request queue and updates the row status table according to the data access requests arriving in the queue. S3, the row buffer manager is connected to the cache controller, receives commands from the cache controller, and updates the row status table accordingly. S4, according to the information in the row status table, the row buffer manager sends control commands to the stacked DRAM cache through the cache controller.
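
To make the data flow of steps S2 and S4 concrete, the following is a speculative sketch in C that reuses the row_status_entry and row_status_table structures sketched above (step S3, where controller commands update the table, would follow the same pattern). The function names, the "ACTIVATE" command string, and the simple policy shown are assumptions for illustration, not the patent's actual method.

```c
#include <stdio.h>

/* Assumed stand-in for the cache controller interface used in S3/S4. */
static void cache_controller_send(const char *command,
                                  uint32_t block, uint32_t row)
{
    printf("%s block=%u row=%u\n", command, (unsigned)block, (unsigned)row);
}

/* S2: when a data access request arrives in the request queue, record it
 * against the matching row status entry. */
static void on_request_arrival(row_status_table *t,
                               uint32_t block, uint32_t row)
{
    for (int i = 0; i < ROW_STATUS_ENTRIES; i++) {
        row_status_entry *e = &t->entries[i];
        if (e->memory_block_number == block && e->memory_row_number == row) {
            e->waiting_requests++;   /* one more request waiting on this row */
            return;
        }
    }
    /* Allocating or replacing an entry for an untracked row is policy-specific
     * and intentionally omitted here. */
}

/* S4: based on the row status table, ask the cache controller to send control
 * commands (e.g. row activation) to the stacked DRAM cache. */
static void issue_control_commands(const row_status_table *t)
{
    for (int i = 0; i < ROW_STATUS_ENTRIES; i++) {
        const row_status_entry *e = &t->entries[i];
        if (!e->activation_bit && e->waiting_requests > 0)
            cache_controller_send("ACTIVATE",
                                  e->memory_block_number, e->memory_row_number);
    }
}
```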

Description

Technical field

[0001] The invention relates to a method for improving the performance of a stacked DRAM cache.

Background technique

[0002] The memory wall problem in big data processing exacerbates the cost of transferring data between on-chip processors and off-chip memory. On-chip stacked DRAM (3D DRAM) is an effective way to meet this challenge, owing to its high bandwidth and low power consumption. On-chip stacked DRAM is therefore used as the last level of on-chip cache to temporarily hold data from off-chip memory, reducing the amount of data transferred on and off the chip, significantly reducing data transfer latency, and effectively improving system performance. On the other hand, the organizational structure and interface design of on-chip stacked DRAM are not friendly to cache access, which hinders further improvement of system performance and limits its effectiveness as the last-level cache.

[0003] The on-chip ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/06
CPC: G06F3/061, G06F3/0656, G06F3/0659
Inventor: 章铁飞, 柴春来
Owner: ZHEJIANG GONGSHANG UNIVERSITY