
Apparatus for memory communication during runahead execution

A memory-communication technology for runahead execution, applied in the field of processor architecture, addressing problems such as the unavailability of required resources.

Inactive Publication Date: 2007-04-11
INTEL CORP


Problems solved by technology

The main disadvantage of these techniques is that they require idle thread contexts and shared resources (e.g., access and execution bandwidth), which are usually not available when the processor is well utilized.
However, it still requires a large instruction window (and a large physical register file), with their associated cost.



Detailed Description of Embodiments

[0030] According to one embodiment of the present invention, runahead execution may be used to tolerate operations with very long latency, instead of building a larger instruction window. Instead of making the long-latency operation "non-blocking" (which would require buffering it, and the instructions following it, in the instruction window), runahead execution on an out-of-order processor can simply drop it out of the instruction window.

[0031] According to one embodiment of the present invention, the state of the architectural register file can be checkpointed when the instruction window is blocked by a long-latency operation. The processor may then enter "runahead mode": it may issue a bogus (i.e., invalid) result for the blocking operation and may discard the blocking operation from the instruction window. The instructions following the blocking operation can then be fetched, executed, and pseudo-retired from the instruction window. "Pseudo-retirement" ...
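The checkpoint / runahead-mode / pseudo-retirement flow described above can be pictured with a toy model. This is a minimal illustrative sketch with invented names (`ToyRunaheadCore`, `miss_blocks_window`), not the patented hardware design: a long-latency miss checkpoints the register file, receives an invalid (INV) result, and later instructions pseudo-retire, with INV poisoning their dependents.

```python
ARCH_REGS = 4  # toy architectural register file size
INV = None     # marker for a bogus/invalid ("INV") result

class ToyRunaheadCore:
    """Hypothetical sketch of runahead mode, not the claimed implementation."""

    def __init__(self):
        self.regs = [0] * ARCH_REGS
        self.checkpoint = None
        self.runahead = False

    def miss_blocks_window(self, dest_reg):
        """A long-latency load blocks the window: checkpoint architectural
        state, enter runahead mode, and give the load a bogus result so it
        can be dropped from the instruction window."""
        self.checkpoint = list(self.regs)   # checkpoint the register file
        self.runahead = True
        self.regs[dest_reg] = INV           # bogus (invalid) result

    def execute(self, op, dst, a, b):
        """Pseudo-retire an instruction; INV inputs poison the result."""
        if self.regs[a] is INV or self.regs[b] is INV:
            self.regs[dst] = INV            # dependents of the miss stay invalid
        elif op == "add":
            self.regs[dst] = self.regs[a] + self.regs[b]

    def miss_returns(self):
        """When the miss data arrives, restore the checkpoint and resume
        normal execution from the blocking operation."""
        self.regs = self.checkpoint
        self.runahead = False
```

Independent instructions executed in runahead mode still generate real memory accesses, which is what produces the prefetching benefit.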



Abstract

Processor architectures and, in particular, processor architectures with a cache-like structure that enables memory communication during runahead execution. In accordance with an embodiment of the present invention, a system includes a memory and an out-of-order processor coupled to the memory. The out-of-order processor includes at least one execution unit; at least one cache coupled to the at least one execution unit; at least one address source coupled to the at least one cache; and a runahead cache coupled to the at least one address source.
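One way to picture the runahead cache of this abstract: runahead stores may not update architectural memory, so their data is buffered in a small cache-like structure and forwarded to later runahead loads. The following sketch uses hypothetical names (`RunaheadCache`, `flush`) and is an illustration under those assumptions, not the claimed implementation:

```python
class RunaheadCache:
    """Toy model: speculative store data buffered during runahead mode."""

    def __init__(self):
        self.lines = {}          # address -> (data, valid)

    def store(self, addr, data, invalid=False):
        # A runahead store writes only the runahead cache; storing an
        # invalid value marks the address so dependent loads become INV too.
        self.lines[addr] = (data, not invalid)

    def load(self, addr, memory):
        # Forward from the runahead cache first; fall back to memory.
        if addr in self.lines:
            data, valid = self.lines[addr]
            return data if valid else None   # None models an INV result
        return memory.get(addr, 0)

    def flush(self):
        # Exiting runahead mode discards all speculatively buffered data.
        self.lines.clear()
```

This keeps store-to-load communication working during runahead without ever modifying architectural memory state.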

Description

Technical field

[0001] The present invention relates to processor architectures and, in particular, to processor architectures having a cache-like structure such that memory communication can occur during runahead execution.

Background technique

[0002] Today's high-performance processors tolerate operations with high latency by implementing out-of-order instruction execution. An out-of-order execution machine tolerates long latencies by allowing operations that are later in the instruction stream, and independent of a long-latency operation, to execute without being blocked by it. To achieve this, the processor buffers operations in an instruction window whose size determines the amount of latency the out-of-order machine can tolerate.

[0003] Unfortunately, today's processors face increasingly high latencies due to the ever-growing gap between processor and memory speeds. For example, an operation that causes a cache miss requiring access to main memory may take hundreds of processor cycles to complete...
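A back-of-envelope calculation illustrates why tolerating main-memory latency with a larger window is costly. The numbers below are illustrative assumptions, not figures from the patent: to keep issuing at full width across a miss, the window must roughly hold all instructions fetched during the miss latency.

```python
def window_entries_needed(miss_latency_cycles, issue_width):
    """Rough estimate: in-flight instructions needed to fully hide a miss.
    (Illustrative model only; assumes sustained issue at full width.)"""
    return miss_latency_cycles * issue_width

# An assumed 300-cycle main-memory miss on a 3-wide machine needs on the
# order of 900 window entries, far beyond typical windows of ~100-200.
print(window_entries_needed(300, 3))  # -> 900
```

This scaling pressure on the instruction window (and the physical register file backing it) is the cost that runahead execution avoids.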

Claims


Application Information

IPC(8): G06F 12/00; G06F 12/08
CPC: G06F 12/0897; G06F 12/0875
Inventors: J. W. Stark, C. B. Wilkerson, O. Mutlu
Owner: INTEL CORP