System-level cache

A system-level caching technology applicable to memory systems, energy-efficient computing, instrumentation, and similar domains. It addresses problems such as the inability of client devices to allocate cache lines, with the goals of reducing latency, improving power efficiency, and reducing power consumption.

Pending Publication Date: 2021-06-01
GOOGLE LLC

AI Technical Summary

Problems solved by technology

So, for example, a cache policy may specify that requests from one or more client devices are always stored…




Example Embodiment

[0123] Example 1 is a system, including:

[0124] a plurality of integrated client devices;

[0125] a memory controller configured to read data from a memory device; and

[0126] a system-level cache configured to cache data requests made through the memory controller for each integrated client device in the plurality of integrated client devices,

[0127] wherein the system-level cache includes a cache memory having a plurality of ways, each of the plurality of ways being either a primary way or a secondary way,

[0128] wherein each primary way is dedicated to a single corresponding partition corresponding to a memory buffer accessed by one or more client devices, and

[0129] wherein each secondary way corresponds to a plurality of partitions corresponding to multiple memory buffers accessible to a group of client devices, and

[0130] wherein the system-level cache is configured to maintain a mapping between partitions and priority levels, and is configured to allocate primary ways to respective enabled partitions in an order corresponding to the respective priority levels assigned to the enabled partitions.
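The primary-way allocation of Example 1 can be sketched in a few lines. This is an illustrative model only, not the patent's implementation: the names (`Partition`, `allocate_primary_ways`) and the greedy draw-from-a-pool policy are assumptions layered on the claim language, which specifies only that primary ways go to enabled partitions in priority order.

```python
# Illustrative sketch of Example 1's primary-way allocation: each enabled
# partition receives dedicated (primary) ways from a fixed pool, in order
# of its assigned priority level. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Partition:
    name: str
    priority: int        # lower value = higher priority
    ways_requested: int
    enabled: bool

def allocate_primary_ways(partitions, total_ways):
    """Assign dedicated primary ways to enabled partitions by priority."""
    allocation = {}
    free = total_ways
    for p in sorted((p for p in partitions if p.enabled),
                    key=lambda p: p.priority):
        granted = min(p.ways_requested, free)  # never exceed the pool
        allocation[p.name] = granted
        free -= granted
    return allocation, free

parts = [
    Partition("display", priority=0, ways_requested=4, enabled=True),
    Partition("camera",  priority=1, ways_requested=6, enabled=True),
    Partition("audio",   priority=2, ways_requested=4, enabled=True),
]
alloc, spare = allocate_primary_ways(parts, total_ways=12)
print(alloc, spare)  # audio, lowest priority, gets only the 2 ways left over
```

Because allocation walks partitions in priority order, a low-priority partition is the one squeezed when the way pool runs short, which matches the ordering constraint in paragraph [0130].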

Example Embodiment

[0131] Embodiment 2 is the system of embodiment 1, wherein the system-level cache is configured to allocate primary ways to a first partition accessed by one or more first client devices, and is configured to allocate secondary ways to the first partition and to one or more other partitions allocated to a group of client devices that also includes the first client devices.

Example Embodiment

[0132] Embodiment 3 is the system of embodiment 2, wherein the system-level cache is configured to maintain a mapping between groups of client devices and secondary priority levels, and is configured to allocate secondary ways to respective enabled partitions in an order corresponding to the secondary priority levels assigned to the groups of client devices.
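Embodiments 2 and 3 can be sketched the same way: secondary ways are shared by every partition whose client-device group maps to them, and groups are served in order of their secondary priority level. Again a hypothetical model, not the patent's design; the tuple layout and function name are assumptions.

```python
# Illustrative sketch of Embodiments 2-3: secondary ways are handed out per
# client-device group (in secondary-priority order) and shared by all of
# that group's partitions. Names and data layout are hypothetical.

def allocate_secondary_ways(groups, total_secondary_ways):
    """groups: list of (name, secondary_priority, partitions, ways_requested)."""
    shared = {}
    free = total_secondary_ways
    for _name, _prio, partitions, requested in sorted(groups,
                                                      key=lambda g: g[1]):
        granted = min(requested, free)
        # every partition in the group may hit in these shared ways
        for part in partitions:
            shared[part] = shared.get(part, 0) + granted
        free -= granted
    return shared, free

groups = [
    ("media", 0, ["display", "video"], 3),  # higher secondary priority
    ("ml",    1, ["tpu"],              2),
]
shared, left = allocate_secondary_ways(groups, total_secondary_ways=4)
print(shared, left)  # media's 3 ways are shared by display and video
```

The key contrast with the primary-way sketch is that a granted secondary way appears under every partition in the group, reflecting that secondary ways serve multiple memory buffers accessible to the group.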



Abstract

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for a system-level cache to allocate cache resources by a way-partitioning process. One of the methods includes maintaining a mapping between partitions and priority levels and allocating primary ways to respective enabled partitions in an order corresponding to the respective priority levels assigned to the enabled partitions.

Description

Background technique

[0001] This specification relates to systems having integrated circuit devices.

[0002] A system-level cache (SLC) is a device that caches data retrieved from memory, or data to be stored to memory, on behalf of a number of different hardware devices in the system. In other words, different cache lines of the SLC can store data belonging to different hardware devices.

[0003] Typically, the multiple different hardware devices are components integrated into a system on a chip (SOC). In this specification, a device that issues read requests and write requests through the SLC is referred to as a client device.

[0004] Caching can be used to reduce power consumption by reducing main-memory usage. In other words, as long as the client devices can access the data they need in the cache, the main memory and the path to the main memory can be placed in a low-power state.

[0005] Caches are usually organized as sets with multiple ways. The request…
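Paragraph [0005]'s set/way organization can be made concrete with a minimal address-decomposition sketch. The geometry below (64 sets, 8 ways, 64-byte lines) is an assumed example, not taken from the patent: any way within the selected set may hold the line, which is what makes per-way partitioning possible.

```python
# Minimal sketch of set-associative addressing: an address selects exactly
# one set; the line may reside in any way of that set. Geometry is
# illustrative, not from the patent.

NUM_SETS = 64
NUM_WAYS = 8          # candidate slots per set (not used in the math below)
LINE_BYTES = 64

def decompose(addr):
    """Split a physical address into (tag, set_index, byte_offset)."""
    offset = addr % LINE_BYTES
    set_index = (addr // LINE_BYTES) % NUM_SETS
    tag = addr // (LINE_BYTES * NUM_SETS)
    return tag, set_index, offset

print(decompose(0x12345))  # -> (18, 13, 5)
```

Way-partitioning, as the abstract describes, restricts which of the `NUM_WAYS` candidate slots in each set a given partition may fill, without changing how the set index itself is computed.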

Claims


Application Information

IPC(8): G06F12/084; G06F12/0846
CPC: G06F12/084; G06F12/0848; G06F13/1694; Y02D10/00; G06F12/0811; G06F12/0877; G06F12/0815
Inventor: Vinod Chamarty, Xiaoyu Ma, Hongil Yoon, Keith Robert Pflederer, Weiping Liao, Benjamin Dodge, Albert Meixner, Allan Douglas Knies, Manu Gulati, Rahul Jagdish Thakur, Jason Rupert Redgrave
Owner GOOGLE LLC