
Providing scalable dynamic random access memory (DRAM) cache management using tag directory caches

A dynamic random access memory (DRAM) management technology, applied in memory systems, instruments, electrical digital data processing, etc.

Inactive Publication Date: 2018-08-28
QUALCOMM INC

AI Technical Summary

Problems solved by technology

While the latter practice minimizes SRAM usage, any incorrect prediction results in data being read from the system memory DRAM. Reads to the system memory DRAM incur additional access latency, which may negate any performance improvement from using the DRAM cache. Still other practices may require very large data structures stored in the system memory DRAM in order to keep track of cached data.




Detailed Description of Embodiments

[0020] Referring now to the drawings, several exemplary aspects of the disclosure are described. The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any aspect described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other aspects.

[0021] Aspects disclosed in the detailed description include providing scalable dynamic random access memory (DRAM) cache management using a tag directory cache. As described herein, the DRAM cache management scheme is "scalable" in the sense that the size of the resources utilized by the scheme is relatively independent of the capacity of the DRAM cache being managed. In this regard, FIG. 1 is a block diagram of an exemplary processor-based system 100 that provides a DRAM cache management circuit 102 for managing a DRAM cache 104 and an associated tag directory 106 of the DRAM cache 104, both of which are part of the hig...
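To make the "scalable" claim concrete, the following is a minimal sizing sketch. All names and numbers here (cacheline size, tag-entry size, entry counts) are illustrative assumptions, not figures from the patent: a full tag directory grows with the DRAM cache capacity, while a fixed-size on-chip tag directory cache (TDC) does not.

```python
# Hypothetical sizing sketch; constants are illustrative assumptions.
CACHELINE_BYTES = 64          # assumed cacheline size
TAG_ENTRY_BYTES = 4           # assumed bytes of tag/state per cacheline

def full_tag_directory_bytes(dram_cache_bytes: int) -> int:
    """Tag storage needed to track every cacheline in the DRAM cache."""
    return (dram_cache_bytes // CACHELINE_BYTES) * TAG_ENTRY_BYTES

def tdc_bytes(num_tdc_entries: int) -> int:
    """SRAM for a tag directory cache: fixed, independent of DRAM cache size."""
    return num_tdc_entries * TAG_ENTRY_BYTES

# Doubling the DRAM cache doubles the full directory, but not the TDC.
print(full_tag_directory_bytes(1 << 30))   # 1 GiB DRAM cache -> 64 MiB of tags
print(full_tag_directory_bytes(2 << 30))   # 2 GiB DRAM cache -> 128 MiB of tags
print(tdc_bytes(4096))                     # TDC stays 16 KiB either way
```

This is why caching only the tags of frequently accessed cachelines, rather than storing the entire tag directory in SRAM, keeps on-chip resource usage roughly constant as the DRAM cache grows.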



Abstract

Providing scalable dynamic random access memory (DRAM) cache management using tag directory caches is provided. In one aspect, a DRAM cache management circuit is provided to manage access to a DRAM cache in a high-bandwidth memory. The DRAM cache management circuit comprises a tag directory cache and a tag directory cache directory. The tag directory cache stores tags of frequently accessed cachelines in the DRAM cache, while the tag directory cache directory stores tags for the tag directory cache. The DRAM cache management circuit uses the tag directory cache and the tag directory cache directory to determine whether data associated with a memory address is cached in the DRAM cache of the high-bandwidth memory. Based on the tag directory cache and the tag directory cache directory, the DRAM cache management circuit may determine whether a memory operation can be performed using the DRAM cache and/or a system memory DRAM.
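The two-level lookup in the abstract can be sketched as follows. This is a minimal illustration, not the patent's actual circuit: all names (`lookup`, `tdc_directory`, the dictionary-based structures) and the 64-byte cacheline assumption are mine.

```python
# Hypothetical sketch of the two-level lookup: TDC directory -> tag directory
# cache (TDC) -> decision between DRAM cache and system memory DRAM.

def lookup(addr, tdc_directory, tag_directory_cache):
    """Decide where a memory operation for `addr` can be serviced."""
    tag = addr // 64                     # assumed 64-byte cachelines
    if tag in tdc_directory:             # TDC directory hit: tag is held in the TDC
        if tag_directory_cache.get(tag): # TDC confirms the line is in the DRAM cache
            return "DRAM cache"
    # Without a cached tag, the circuit cannot cheaply confirm the line is in
    # the DRAM cache, so the operation falls back to system memory DRAM.
    return "system memory DRAM"

tdc_dir = {0x10}                          # tags tracked by the TDC directory
tdc = {0x10: True}                        # tag -> valid-in-DRAM-cache
print(lookup(0x400, tdc_dir, tdc))        # 0x400 // 64 = 0x10 -> "DRAM cache"
print(lookup(0x800, tdc_dir, tdc))        # untracked tag -> "system memory DRAM"
```

The point of the second level (the TDC directory) is that a miss can be detected against a small on-chip structure before any off-chip tag access is attempted.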

Description

[0001] Priority Claim [0002] This application claims priority to an application filed on January 21, 2016 and entitled "PROVIDING SCALABLE DYNAMIC RANDOM ACCESS MEMORY (DRAM) CACHE MANAGEMENT USING TAG DIRECTORY CACHES," the contents of which are incorporated herein by reference in their entirety. [0003] This application also claims priority to an application filed on June 24, 2016 and entitled "PROVIDING SCALABLE DYNAMIC RANDOM ACCESS MEMORY (DRAM) CACHE MANAGEMENT USING TAG DIRECTORY CACHES," the contents of which are incorporated herein by reference in their entirety. Technical Field [0004] The techniques of this disclosure relate generally to dynamic random access memory (DRAM) management, and in particular to management of DRAM cache memory. Background [0005] The advent of die-stacked integrated circuits (ICs), consisting of multiple stacked dies interconnected vertically, has enabled the creation of die-stacked dynamic random access memories (DRAMs). Die-stacked DRAM can be used to i...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/0895
CPC: G06F12/121; G06F2212/1024; G06F2212/1048; G06F2212/502; G06F12/0895; G06F2212/1016; G06F2212/305
Inventors: H. M. Le, T. Q. Truong, N. Vaidyanathan, M. C. A. A. Heddes, C. B. Verrilli
Owner QUALCOMM INC