
Cache system with primary cache and overflow cache that use different indexing schemes

A cache memory system technology, applied to memory systems, instruments, and memory-architecture access/allocation, addressing problems such as the slow latency of physical system memory and the resulting high cost of table lookups.

Active Publication Date: 2016-07-27
VIA ALLIANCE SEMICON CO LTD

Problems solved by technology

[0008] The latency of physical system memory accesses is relatively high, making table lookups a relatively expensive operation, since each lookup may involve multiple accesses to physical memory.




Embodiment Construction

[0022] It is desirable to reduce the size of the L1 TLB cache array without substantially affecting performance. The inventors have recognized the inefficiencies associated with traditional L1 TLB structures. For example, most applications are not coded to maximize L1 TLB utilization, often leaving some sets overutilized and others underutilized.
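The set imbalance described above can be illustrated with a small sketch. This is a hypothetical example (not from the patent): a conventional set-associative TLB indexed by the low bits of the virtual page number concentrates a strided access pattern onto a single set while the remaining sets stay empty.

```python
# Hypothetical illustration: conventional low-bit set indexing leaves
# most sets of a 16-set TLB unused under a strided access pattern.
from collections import Counter

NUM_SETS = 16          # assumed 16-set, set-associative TLB
SET_INDEX_BITS = 4     # index = low 4 bits of the virtual page number

def set_index(vpn):
    """Conventional indexing: low bits of the virtual page number."""
    return vpn & (NUM_SETS - 1)

# Pages touched with a stride equal to NUM_SETS share the same low bits.
strided_pages = [base * NUM_SETS for base in range(32)]

usage = Counter(set_index(vpn) for vpn in strided_pages)
print(usage)  # every one of the 32 pages maps to set 0; sets 1..15 are unused
```

The parameters (16 sets, stride of 16 pages) are chosen only to make the imbalance obvious; any stride that is a multiple of the set count produces the same effect.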

[0023] The inventors therefore developed a cache system with a primary cache and an overflow cache that use different indexing schemes to improve cache utilization. The cache memory system includes an overflow cache (or L1.5 cache) that serves both as an extension of the primary cache array (or L1.0 cache) during cache searches and as an eviction array for the L1.0 cache. The combined cache structure achieves the same performance as a traditional L1 cache structure while being substantially smaller. The overflow cache array, or L1.5 cache, differs from a normal eviction array (such as an L2 cache) ...


Abstract

A cache memory system includes a primary cache and an overflow cache that are searched together using a search address. The overflow cache operates as an eviction array for the primary cache. The primary cache is addressed using bits of the search address, and the overflow cache is addressed by a hash index generated by a hash function applied to bits of the search address. The hash function operates to distribute victims evicted from the primary cache to different sets of the overflow cache to improve overall cache utilization. A hash generator may be included to perform the hash function. A hash table may be included to store hash indexes of valid entries in the primary cache. The cache memory system may be used to implement a translation lookaside buffer for a microprocessor.
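The lookup and eviction flow described in the abstract can be sketched as follows. This is a minimal, assumed model (1-way arrays, an XOR-fold hash), not the patented implementation: the primary cache is indexed by plain address bits, the overflow cache by a hash of those bits, both are searched with the same address, and primary-cache victims are distributed into the overflow cache.

```python
# Minimal sketch (assumed structure) of a primary cache indexed by address
# bits plus an overflow cache indexed by a hash of the search address.
PRIMARY_SETS = 16
OVERFLOW_SETS = 16

def primary_index(vpn):
    return vpn & (PRIMARY_SETS - 1)          # plain low-bit index

def overflow_index(vpn):
    # Assumed hash function: XOR-fold the upper bits into the index so
    # victims from one primary set scatter across many overflow sets.
    return (vpn ^ (vpn >> 4) ^ (vpn >> 8)) & (OVERFLOW_SETS - 1)

primary = {}    # set index -> entry (1-way per set, for brevity)
overflow = {}   # eviction array for the primary cache

def lookup(vpn):
    """Search both arrays together using the same search address."""
    return primary.get(primary_index(vpn)) == vpn or \
           overflow.get(overflow_index(vpn)) == vpn

def insert(vpn):
    idx = primary_index(vpn)
    victim = primary.get(idx)
    primary[idx] = vpn
    if victim is not None:                    # overflow acts as eviction array
        overflow[overflow_index(victim)] = victim

# Strided pages that thrash a single primary set spread out in the overflow,
# so all of them remain resident and searchable.
for vpn in [base * PRIMARY_SETS for base in range(4)]:
    insert(vpn)
```

After these four inserts, all four pages still hit: one in the primary cache and three scattered across distinct overflow sets, which is the utilization improvement the hash-indexed overflow is meant to provide.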

Description

[0001] CROSS-REFERENCE TO RELATED APPLICATIONS

[0002] This application claims priority to U.S. Provisional Application Serial No. 62/024,020, filed July 14, 2014, which is hereby incorporated by reference in its entirety for all purposes and uses.

TECHNICAL FIELD

[0003] The present invention relates generally to microprocessor cache systems, and more particularly to cache systems having primary caches and overflow caches that use different indexing schemes.

BACKGROUND

[0004] Modern microprocessors include a memory cache system to reduce memory access latency and improve overall performance. System memory is external to the microprocessor and is accessed via a system bus or the like, making system memory access relatively slow. In general, a cache is a smaller, faster local memory component that transparently stores data retrieved from system memory based on previous requests, so that future requests for the same data can be satisfied more quickly. The cache ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F12/0811; G06F12/0864; G06F12/1027; G06F12/1045
CPC: G06F12/0864; G06F12/1027; G06F12/1045; G06F12/0811; G06F12/0808; G06F12/123; G06F2212/1021; G06F2212/683
Inventor: Colin Eddy, Rodney E. Hooker
Owner VIA ALLIANCE SEMICON CO LTD