Method and system for determining cache set replacement order based on temporal set recording

A technique for cache replacement ordering, applied in the field of data processing

Active Publication Date: 2013-07-24
IBM CORP

AI Technical Summary

Problems solved by technology

Otherwise, if the data is not contained in the cache, it is a cache miss, and the data must be retrieved from other storage media that are not necessarily close to the requester, which is relatively slow.



Examples


Embodiment Construction

[0014] Microprocessors may contain a Level 1 (L1) data cache (D-cache). The L1 data cache holds data elements for a subset of system memory locations so that load and store instructions can be processed close to the processor core. On a cache miss, the data element corresponding to the requested storage location is installed into the cache. Each entry in the cache represents a cache line that corresponds to a portion of memory. One typical installation algorithm for such a cache is based on least recently used (LRU) order. As an example, the data cache may contain 1024 rows, called congruence classes, and may be a 4-way set associative cache, for a total of 4K entries. For each congruence class, the ordering from the most recently used set to the least recently used set can be tracked as a set of hierarchical positions for replacement. When installing a new entry into the data cache for a given congruence class in one of the 4 sets, the set chosen for replacement is the least recently used one.
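As a concrete illustration of the baseline behavior described in paragraph [0014], the sketch below models a 4-way set associative cache with 1024 congruence classes, where each congruence class is kept in most-recently-used to least-recently-used order and a miss replaces the least recently used entry. The 64-byte line size and the address decoding are illustrative assumptions, not details from the patent.

```python
# Minimal software sketch of the baseline LRU cache of paragraph [0014].
# Assumed: 64-byte lines and index/tag decoding; not specified in the patent.

LINE_SIZE = 64      # assumed cache line size in bytes
NUM_SETS = 1024     # congruence classes, as in paragraph [0014]
WAYS = 4            # 4-way set associative, 4K entries total


class SetAssociativeLRUCache:
    def __init__(self):
        # Each congruence class holds up to WAYS tags, ordered from
        # most recently used (front) to least recently used (back).
        self.sets = [[] for _ in range(NUM_SETS)]

    def _decode(self, address):
        line = address // LINE_SIZE
        return line % NUM_SETS, line // NUM_SETS   # (congruence class, tag)

    def access(self, address):
        index, tag = self._decode(address)
        ways = self.sets[index]
        if tag in ways:
            # Cache hit: promote the line to the most recently used position.
            ways.remove(tag)
            ways.insert(0, tag)
            return "hit"
        # Cache miss: evict the LRU entry if the congruence class is full,
        # then install the new line as most recently used.
        if len(ways) == WAYS:
            ways.pop()
        ways.insert(0, tag)
        return "miss"


if __name__ == "__main__":
    cache = SetAssociativeLRUCache()
    for addr in (0x0000, 0x0040, 0x0000, 0x100000, 0x0040):
        print(hex(addr), cache.access(addr))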



Abstract

The present invention relates to a method and system for determining cache set replacement order based on temporal set recording. A technique is provided for cache management. A processing circuit determines a miss count and a hit position field during a previous execution of an instruction that requests storage of a data element in a cache. The miss count and the hit position field are stored for the data element corresponding to that instruction. The processing circuit places the data element in a hierarchical order based on the miss count and/or the hit position field. The hit position field includes a hierarchical position related to the data element in the cache.
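The abstract does not spell out the exact placement rule, so the sketch below is only one plausible reading: a per-instruction record of a miss count and a hit position field, gathered on a previous execution, chooses where in a congruence class's hierarchical (MRU-to-LRU) order a newly installed line is placed. The MISS_THRESHOLD value, the TemporalRecord structure, and the placement_position policy are illustrative assumptions, not the patent's exact mechanism.

```python
# Hedged sketch: use a per-instruction miss count and hit position field
# to pick the hierarchical position at which a new line is installed.
from collections import defaultdict

WAYS = 4
MISS_THRESHOLD = 2   # assumed threshold, for illustration only


class TemporalRecord:
    """Per-instruction history: how often it missed and where it last hit."""
    __slots__ = ("miss_count", "hit_position")

    def __init__(self):
        self.miss_count = 0
        self.hit_position = 0   # 0 = MRU, WAYS - 1 = LRU


def placement_position(record):
    """Choose the hierarchical position for a newly installed line."""
    if record.miss_count >= MISS_THRESHOLD:
        return WAYS - 1          # frequently missing data: install near LRU
    return record.hit_position   # otherwise install where it tended to be reused


def install(congruence_class, tag, record):
    """Install `tag` into one congruence class kept in MRU..LRU order."""
    if len(congruence_class) == WAYS:
        congruence_class.pop()                       # evict the LRU entry
    congruence_class.insert(placement_position(record), tag)


records = defaultdict(TemporalRecord)


def on_access(instruction_addr, congruence_class, tag):
    rec = records[instruction_addr]
    if tag in congruence_class:
        rec.hit_position = congruence_class.index(tag)   # remember where it hit
        congruence_class.remove(tag)
        congruence_class.insert(0, tag)                   # promote to MRU on a hit
    else:
        rec.miss_count += 1
        install(congruence_class, tag, rec)


if __name__ == "__main__":
    cc = []   # one congruence class, MRU..LRU order
    for i_addr, tag in [(0x10, 1), (0x10, 2), (0x10, 3), (0x20, 1), (0x10, 4)]:
        on_access(i_addr, cc, tag)
        print(hex(i_addr), tag, cc)
```

With this assumed policy, an instruction that keeps missing installs its lines near the LRU end, so streaming data is evicted quickly without displacing lines that other instructions reuse.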

Description

Technical field

[0001] The present invention relates to data processing, and more particularly to the cache set replacement order of data elements in a set associative cache based on temporal record information.

Background

[0002] A cache is a component that transparently holds data elements (or simply data) so that future requests for any held data can be served more quickly. The data elements stored in the cache correspond to predefined storage locations in the computer system. Such data elements may be recently calculated values, or duplicate copies of data that is also stored elsewhere. If the requested data is contained in the cache, this is a cache hit, and the request can be serviced simply by reading from the cache, which is relatively fast since caches are usually built close to their requesters. Otherwise, if the data is not contained in the cache, this is a cache miss, and the data must be retrieved from other storage media that are not necessarily close to the requester, which is relatively slow.
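The hit/miss distinction of paragraph [0002] reduces to a few lines: a request served from the cache avoids the slower trip to backing storage, and after a miss the fetched data is installed for future requests. The dictionary-based cache, its capacity, and the FIFO eviction below are illustrative stand-ins, not structures from the patent.

```python
# Minimal sketch of the hit/miss service path described in the background.
CACHE_CAPACITY = 4

backing_store = {addr: addr * 10 for addr in range(100)}   # stand-in for slower memory
cache = {}                                                  # address -> data


def read(address):
    if address in cache:
        return cache[address], "hit"          # fast path: served from the cache
    data = backing_store[address]             # slow path: fetch from backing storage
    if len(cache) >= CACHE_CAPACITY:
        cache.pop(next(iter(cache)))          # make room (simple FIFO eviction)
    cache[address] = data                     # install for future requests
    return data, "miss"


if __name__ == "__main__":
    for addr in (1, 2, 1, 3, 4, 5, 2):
        print(addr, *read(addr))
```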


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F12/12
CPC: G06F12/0875; G06F12/08; G06F12/126; G06F12/12; G06F12/121
Inventor: F. Y. Busaba, S. R. Carlough, C. A. Krygowski, B. R. Prasky, C-L. K. Shum
Owner: IBM CORP