
Reducing power consumption of cache

A cache and cache-line technology, applied in the field of memory systems, which can solve the problem of caches consuming considerable power and achieve the effect of reducing power consumption

Inactive Publication Date: 2007-02-07
FUJITSU LTD
0 Cites · 3 Cited by

AI Technical Summary

Problems solved by technology

[0002] Cache memory on a processor typically consumes a considerable amount of power.



Examples


Detailed Description of Embodiments

[0010] Figure 1 illustrates an example non-uniform cache architecture for reducing power consumption at cache 10. In particular embodiments, cache 10 is a component of a processor that temporarily stores code for execution at the processor. Reference to "code" encompasses one or more executable instructions, other code, or both, where appropriate. Cache 10 includes multiple sets 12, multiple ways 14, and multiple tags 16. A set 12 logically intersects the multiple ways 14 and the multiple tags 16. A logical intersection between a set 12 and a way 14 includes multiple memory locations adjacent to each other in cache 10 for storing code. A logical intersection between a set 12 and a tag 16 includes one or more memory locations adjacent to each other in cache 10 for storing data that identifies code stored in cache 10, locates code stored in cache 10, or both. By way of example and not limitation, the first l...
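The set/way organization described above can be made concrete with a short sketch. The following is a minimal, hypothetical Python model (class name, parameters, and the LRU policy are illustrative assumptions, not taken from the patent) of a non-uniform set-associative cache in which each set has its own number of ways, so a lookup in a low-associativity set compares fewer tags and consumes less power:

```python
class NonUniformCache:
    """Toy model of a cache whose sets may have different associativities."""

    def __init__(self, ways_per_set, line_size=32):
        # ways_per_set[i] = number of ways (associativity) of set i.
        self.line_size = line_size
        self.num_sets = len(ways_per_set)
        self.capacity = list(ways_per_set)
        # Each set maps tag -> last-use tick (for LRU replacement).
        self.sets = [dict() for _ in ways_per_set]
        self.tick = 0

    def access(self, address):
        """Return True on hit. Tags are compared only within one set,
        so a set with fewer ways performs fewer comparisons per lookup."""
        self.tick += 1
        line = address // self.line_size
        index = line % self.num_sets
        tag = line // self.num_sets
        ways = self.sets[index]
        if tag in ways:
            ways[tag] = self.tick
            return True
        if len(ways) >= self.capacity[index]:
            # Evict the least-recently-used way in this set.
            lru = min(ways, key=ways.get)
            del ways[lru]
        ways[tag] = self.tick
        return False
```

For example, `NonUniformCache([1, 2, 4, 2])` gives set 0 a single way and set 2 four ways, matching the abstract's idea that associativity values may differ from set to set.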



Abstract

In one embodiment, a method for reducing power consumption at a cache includes determining a code placement according to which code is writable to a memory separate from a cache. The code placement reduces occurrences of inter cache-line sequential flows when the code is loaded from the memory to the cache. The method also includes compiling the code according to the code placement and writing the code to the memory for subsequent loading from the memory to the cache according to the code placement to reduce power consumption at the cache. In another embodiment, the method also includes determining a nonuniform architecture for the cache providing an optimum number of cache ways for each cache set in the cache. The nonuniform architecture allows cache sets in the cache to have associativity values that differ from each other. The method also includes implementing the nonuniform architecture in the cache to further reduce power consumption at the cache.
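The abstract's central metric, inter cache-line sequential flows, can be sketched in a few lines. In this hypothetical illustration (the function and parameter names are not from the patent), a sequential flow is a fall-through from one instruction to the next consecutive address; it is inter cache-line when the two addresses fall in different lines, which forces a fresh tag lookup:

```python
def count_inter_line_sequential_flows(trace, line_size=32, inst_size=4):
    """Count transitions in an instruction-address trace that are
    sequential (fall-through to the next instruction) AND cross a
    cache-line boundary. These are the events a power-aware code
    placement tries to minimize: a sequential fetch that stays inside
    the current line can reuse the previous tag comparison."""
    crossings = 0
    for a, b in zip(trace, trace[1:]):
        sequential = (b == a + inst_size)
        if sequential and (b // line_size != a // line_size):
            crossings += 1
    return crossings
```

A straight-line run of 4-byte instructions from address 0 to 36 crosses the 32-byte line boundary once; a taken branch across lines is not counted, since it is not a sequential flow.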

Description

Technical Field
[0001] The present invention relates generally to memory systems and, more particularly, to reducing power consumption at a cache.
Background
[0002] Cache memory on a processor typically consumes a considerable amount of power. As an example, the instruction cache on an ARM920T processor accounts for about 25% of the processor's power consumption. As another example, the instruction cache on a StrongARM SA-110 processor, which targets low-power applications, accounts for about 27% of the processor's power consumption.
Summary of the Invention
[0003] Embodiments of the present invention reduce or eliminate problems and disadvantages associated with existing memory systems.
[0004] In one embodiment, a method for reducing power consumption at a cache includes determining a code placement according to which code is writable to a memory separate from the cache. The code placement reduces the occurrence of inter-cache-line sequen...
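The placement step in paragraph [0004] can be illustrated with a toy search. This sketch is a deliberate simplification (block sizes, the fall-through map, and the exhaustive search are all illustrative; a real compiler would use heuristics and account for the jumps it inserts when reordering): it lays basic blocks out contiguously and counts the sequential flows that cross a cache-line boundary, then picks the cheapest order.

```python
from itertools import permutations

def line_crossings(order, sizes, fallthrough, line_size=32):
    """Count inter cache-line sequential flows for basic blocks placed
    contiguously in `order`. fallthrough[b] is the block that b falls
    through to, or None if b ends in an unconditional jump."""
    start, off = {}, 0
    for b in order:
        start[b] = off
        off += sizes[b]
    crossings = 0
    for i, b in enumerate(order):
        end = start[b] + sizes[b] - 1
        # Sequential flow inside the block crossing line boundaries.
        crossings += end // line_size - start[b] // line_size
        # A fall-through into the next placed block is sequential too;
        # it crosses a line iff the next block starts on a new line.
        nxt = order[i + 1] if i + 1 < len(order) else None
        if nxt is not None and fallthrough.get(b) == nxt:
            if (end + 1) // line_size != end // line_size:
                crossings += 1
    return crossings

def best_placement(sizes, fallthrough, line_size=32):
    """Exhaustively pick the block order with the fewest crossings."""
    return min(permutations(range(len(sizes))),
               key=lambda o: line_crossings(o, sizes, fallthrough, line_size))
```

With blocks of 16, 16, and 32 bytes where block 0 falls through to block 1, placing them as (0, 1, 2) keeps every sequential flow inside a 32-byte line, while (0, 2, 1) makes the 32-byte block straddle a boundary.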

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F1/32; G06F12/08
CPC: Y02B60/1225; G06F1/3203; G06F1/3275; G06F2212/271; Y02D10/00
Inventors: Tohru Ishihara, Farzan Fallah
Owner: FUJITSU LTD