
Multi-class data cache policies

A data cache technology, applied in electrical digital data processing, memory systems, and memory address/allocation/relocation, to achieve the effect of reducing the number of cache misses.

Active Publication Date: 2010-06-16
NVIDIA CORP
Cites: 0 · Cited by: 12

AI Technical Summary

Problems solved by technology

However, in certain systems where the usage pattern of the data changes, such an approach may not be sufficient to evict data quickly enough to make room for future read and write operations while still allowing data to remain in the cache long enough to be reused, that is, to strike an appropriate balance between reusing cached data and avoiding requests for data from external memory.



Examples


Detailed Description of the Embodiments

[0015] In the following description, numerous specific details are given in order to provide a more thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without one or more of these specific details. In other instances, well-known features have not been described in order to avoid obscuring the present invention.

[0016] System Overview

[0017] Figure 1 is a block diagram illustrating a computer system 100 configured to implement one or more aspects of the present invention. Computer system 100 includes a central processing unit (CPU) 102 and system memory 104 that communicate via a bus path through a memory bridge 105. Memory bridge 105 can be integrated into CPU 102 as shown in Figure 1. Alternatively, memory bridge 105 may be a conventional device, such as a Northbridge chip, connected to CPU 102 via a bus. Memory bridge 105 is connected to an I/O (input/output) bridg...
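
As a purely illustrative aid, the sketch below encodes the components named in paragraph [0017] as plain data. This is a minimal sketch, assuming simple struct names and a bus-path list chosen for readability; none of these identifiers come from the patent itself.

```cpp
#include <cstdint>
#include <string>
#include <utility>
#include <vector>

// A named block from the Figure 1 block diagram, keyed by its reference numeral.
struct Component {
    std::uint32_t referenceNumeral;
    std::string   name;
};

int main() {
    Component cpu{102, "central processing unit (CPU)"};
    Component systemMemory{104, "system memory"};
    Component memoryBridge{105, "memory bridge"};  // may be integrated into CPU 102

    // Bus paths described in the text: CPU 102 and system memory 104 communicate
    // via a bus path through memory bridge 105; the bridge also connects onward
    // to an I/O bridge (the passage is truncated at that point).
    std::vector<std::pair<const Component*, const Component*>> busPaths = {
        {&cpu, &memoryBridge},
        {&memoryBridge, &systemMemory},
    };
    (void)busPaths;
    return 0;
}
```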



Abstract

The present invention provides multi-class data cache policies. One embodiment of the invention sets forth a mechanism for evicting data from a data cache based on the class of the data. Data stored in the cache lines of the data cache is classified according to a data class that reflects its probability of reuse. The data class is stored in a tag store, where each tag in the tag store corresponds to a single cache line in the data cache. When data associated with a command is to be stored in the cache, a tag look-up unit examines the data classes in the tag store to determine which data to evict. Data with a lower probability of reuse is evicted with higher priority than data with a higher probability of reuse. Advantageously, evicting data belonging to data classes with a lower probability of reuse reduces the number of cache misses in the system.
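
To make the eviction mechanism concrete, the fragment below is a minimal C++ sketch of class-aware victim selection. The class names (EvictFirst, EvictNormal, EvictLast), the lastUse tie-breaker, and the CacheSet structure are illustrative assumptions, not details taken from the patent text.

```cpp
#include <array>
#include <cstddef>
#include <cstdint>
#include <optional>

// Data class attached to each cache line's tag; lower values mean the data is
// less likely to be reused and is evicted with higher priority.
enum class DataClass : std::uint8_t {
    EvictFirst  = 0,  // low probability of reuse
    EvictNormal = 1,  // default
    EvictLast   = 2,  // high probability of reuse
};

// One tag per cache line, as in the tag store described in the abstract.
struct Tag {
    bool          valid   = false;
    std::uint64_t address = 0;
    DataClass     cls     = DataClass::EvictNormal;
    std::uint64_t lastUse = 0;  // assumed LRU-style tie-breaker within a class
};

template <std::size_t Ways>
struct CacheSet {
    std::array<Tag, Ways> tags{};

    // Choose the line to evict: any invalid line first, otherwise the line in
    // the lowest (least reusable) class, breaking ties by least-recent use.
    std::size_t chooseVictim() const {
        std::optional<std::size_t> victim;
        for (std::size_t i = 0; i < Ways; ++i) {
            if (!tags[i].valid) return i;  // free line: no eviction needed
            if (!victim || tags[i].cls < tags[*victim].cls ||
                (tags[i].cls == tags[*victim].cls &&
                 tags[i].lastUse < tags[*victim].lastUse)) {
                victim = i;
            }
        }
        return *victim;
    }
};
```

With this ordering, a line holding EvictFirst data is always chosen over a line holding EvictLast data, which is the priority relationship the abstract describes.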

Description

Technical Field

[0001] The present invention relates generally to the field of memory management, and more particularly, to multi-class data cache policies.

Background Art

[0002] One element of the memory subsystem within a processing unit is the level 2 cache memory (referred to herein as the "L2 cache"). The L2 cache is a large on-chip memory that serves as an intermediate point between external memory (e.g., frame buffer memory) and the internal clients of the memory subsystem (referred to herein as "clients"). The L2 cache temporarily stores data being used by the various clients. This data can be retrieved from or written to external memory (referred to herein as "DRAM"). The clients can reuse the data stored in the L2 cache while performing certain operations.

[0003] During a read operation, a client may request data from the L2 cache that is not currently stored there, so the data must be obtained from DRAM. A read operation that must obtain data fro...
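
The read path sketched in paragraphs [0002]-[0003] can be summarized in code. The following C++ fragment is a simplified illustration only: L2Cache, fetchFromDram(), and the use of a hash map in place of real tag and data stores are assumptions made for clarity, not the structure of the actual hardware.

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

using Address   = std::uint64_t;
using CacheLine = std::vector<std::uint8_t>;

struct L2Cache {
    // Stand-in for the real tag and data stores.
    std::unordered_map<Address, CacheLine> lines;

    // Placeholder for the external-memory (DRAM) access that a miss requires;
    // returns dummy data here.
    CacheLine fetchFromDram(Address /*addr*/) { return CacheLine(128, 0); }

    // A client read: a hit is served from on-chip storage; a miss must first
    // obtain the line from DRAM and then cache it so later operations can reuse it.
    const CacheLine& read(Address addr) {
        auto it = lines.find(addr);
        if (it != lines.end()) {
            return it->second;  // hit: data reused from the L2 cache
        }
        auto result = lines.emplace(addr, fetchFromDram(addr));  // miss
        return result.first->second;
    }
};
```

The miss case, which must wait on DRAM, is exactly what the multi-class eviction policy in this application aims to make less frequent.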

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08
CPC: G06F12/126, G06F12/128, G06F12/08, G06F9/06
Inventors: David B. Glasco, Peter B. Holmqvist, George R. Lynch, Patrick R. Marchand, James Roberts
Owner: NVIDIA CORP