
Variable latency stack cache and method for providing data

A memory and data technology, applied to static memories, memory systems, and digital memory information, addressing problems such as ever-shorter core clock cycle times, limited die space, and the resulting need to reduce cache memory size.

Active Publication Date: 2005-06-29
IP FIRST

Problems solved by technology

Although reducing the capacity of a cache memory brings its own problems, this reduction is forced not only because the space a microprocessor can accommodate is limited, but also because the microprocessor's core clock cycle time keeps getting shorter, which compels the cache memory to be made smaller.



Embodiment Construction

[0066] Generally speaking, a program usually divides system memory into two parts, a stack area and a non-stack area, and the present invention is proposed based on this fact. The non-stack area is also commonly referred to as the heap area. The main difference between a stack and a heap is that the heap is accessed in a random-access manner, while the stack is generally operated in a last-in-first-out (LIFO) manner. Another difference between the two is the manner in which a read or write instruction indicates its read or write address. In general, instructions that read or write the heap specify their memory addresses explicitly. Conversely, an instruction that reads or writes the stack usually indicates its memory address indirectly through a special register in the microprocessor, generally referred to as the stack pointer register. A push instruction updates the stack pointer register according...
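The two access styles described above can be sketched in a minimal behavioral model (names and memory layout are illustrative assumptions, not taken from the patent): a heap access names its address explicitly, while push and pop address memory implicitly through a stack pointer register that they also update.

```python
# Minimal model (illustrative only) of the two access styles described above:
# heap accesses name an address explicitly, while push/pop address memory
# implicitly through a stack pointer register, giving LIFO behavior.
MEM_SIZE = 64

class Machine:
    def __init__(self):
        self.mem = [0] * MEM_SIZE
        self.sp = MEM_SIZE          # stack grows downward from the top of memory

    # Heap-style access: the instruction supplies the address explicitly.
    def store(self, addr, value):
        self.mem[addr] = value

    def load(self, addr):
        return self.mem[addr]

    # Stack-style access: the address comes implicitly from the stack pointer,
    # which the instruction also updates.
    def push(self, value):
        self.sp -= 1
        self.mem[self.sp] = value

    def pop(self):
        value = self.mem[self.sp]
        self.sp += 1
        return value

m = Machine()
m.push(1); m.push(2); m.push(3)
assert [m.pop(), m.pop(), m.pop()] == [3, 2, 1]   # last in, first out
```

The model makes the key distinction concrete: `store`/`load` take an address operand, whereas `push`/`pop` take none, so a stack cache can recognize them and exploit the LIFO access pattern.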



Abstract

The present invention discloses a variable-latency stack cache memory and a method for providing data. The cache includes a plurality of storage elements and stores stack memory data in a last-in-first-out manner. It distinguishes the requests of pop and load instructions, and operates by speculating that pop data will be found in the topmost cache line of the cache memory; likewise, for stack data requested by a load instruction, it speculates that the data will be found in the topmost cache line or in the cache lines near the top. Consequently, when the source virtual address of a load instruction hits the topmost cache line, the cache supplies the load data faster than when the data resides in a lower cache line, faster than when a physical-address comparison is required, and faster than when the data must be supplied from the microprocessor's non-stack cache memory.
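The variable-latency lookup in the abstract can be sketched as a rough behavioral model (the line depth and latency values are assumptions for illustration, not figures from the patent): a load is first speculatively checked against the topmost cache line by virtual address, then against the lower stack-cache lines by physical address, and finally falls back to the non-stack cache.

```python
# Behavioral sketch of a variable-latency stack cache lookup (illustrative;
# latency values and line depth are assumptions, not from the patent).
TOP_HIT_LATENCY = 1      # speculative hit in the topmost line (virtual address)
LOWER_HIT_LATENCY = 3    # hit in a lower line after physical-address comparison
NON_STACK_LATENCY = 10   # data provided from the non-stack cache

class StackCache:
    def __init__(self, depth=8):
        self.lines = []      # lines[0] is the topmost (most recently pushed) line
        self.depth = depth

    def push_line(self, vaddr, paddr, data):
        self.lines.insert(0, (vaddr, paddr, data))
        del self.lines[self.depth:]            # LIFO: oldest lines fall off

    def load(self, vaddr, paddr, non_stack_cache):
        # Fast path: guess the data is in the topmost line; check virtual address.
        if self.lines and self.lines[0][0] == vaddr:
            return self.lines[0][2], TOP_HIT_LATENCY
        # Slower path: compare the physical address against the lower lines.
        for v, p, data in self.lines[1:]:
            if p == paddr:
                return data, LOWER_HIT_LATENCY
        # Slowest path: the data must come from the non-stack cache.
        return non_stack_cache[paddr], NON_STACK_LATENCY

sc = StackCache()
sc.push_line(vaddr=0x1000, paddr=0x9000, data="ret_addr")
data, latency = sc.load(0x1000, 0x9000, {})
assert (data, latency) == ("ret_addr", TOP_HIT_LATENCY)
```

The point of the sketch is the three-tier latency: the speculative virtual-address check on the topmost line avoids the slower physical-address comparison in the common case where stack loads read recently pushed data.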

Description

[0001] Statement of priority [0002] This application claims priority to the United States application entitled Microprocessor with Variable Latency Stack Cache, application number 10 / 759483, filed January 16, 2004. [0003] Related applications [0004]
  • 10 / 759559 — Microprocessor and apparatus for performing fast speculative pop operation from a stack memory
  • 10 / 759564 — Microprocessor and apparatus for performing speculative load operation from a stack memory
  • 10 / 759489 — Microprocessor and apparatus for performing fast pop operation from random access cache memory
Technical field [0005] The present invention relates to a cache memory in a microprocessor, and in particular to a cache memory capable of distinguishing stack and non-stack memory accesses. Background technique [0...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/30; G06F9/312; G06F9/38; G06F12/08; G06F12/10
CPC: G06F9/3004; G06F9/30043; G06F9/30134; G06F9/383; G06F9/3842; G06F12/0875; G06F12/1045
Inventor: Rodney Hooker
Owner: IP FIRST