
Methods and apparatus for low-complexity instruction prefetch system

An instruction prefetching technology, applied to next-instruction address formation, memory systems, and concurrent instruction execution. It addresses problems such as wasted memory access bandwidth, increased power usage, and reduced processor performance.

Active Publication Date: 2009-09-30
QUALCOMM INC

AI Technical Summary

Problems solved by technology

The conventional technique of always prefetching the next cache line fetches instructions that may never be used, causing unnecessary loss of memory access bandwidth, increased power usage, and lower processor performance.
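The waste described above can be illustrated with a toy model. This is a hypothetical sketch, not taken from the patent: the 64-byte line size and all names (`simulate_naive_prefetch`, `miss_addrs`) are illustrative assumptions.

```python
# Toy model (not from the patent) of the problem: a policy that always
# prefetches the next cache line issues a prefetch on every miss, even
# when execution immediately branches away and never uses that line.

LINE = 64  # assumed cache-line size in bytes

def simulate_naive_prefetch(miss_addrs):
    """Return how many next-line prefetches were never actually demanded.

    `miss_addrs` is the sequence of fetch addresses that missed in the
    instruction cache; under the naive policy, each miss also triggers
    a prefetch of the immediately following cache line.
    """
    demanded = {addr // LINE for addr in miss_addrs}       # lines the program needed
    prefetched = {addr // LINE + 1 for addr in miss_addrs}  # lines the policy fetched
    return len(prefetched - demanded)                       # fetched but never needed

# Three misses at branch targets far apart: all three next-line
# prefetches are wasted bandwidth and power.
wasted = simulate_naive_prefetch([0x0000, 0x1000, 0x2000])
```

With purely sequential misses (e.g. `[0x0, 0x40]`) only the final prefetch goes unused, but with scattered branch targets every prefetch is wasted, which is the behavior the patent aims to avoid.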




Detailed Description of Embodiments

[0013] The detailed description, which is set forth below in conjunction with the accompanying drawings, is intended as a description of various exemplary embodiments of the invention and is not intended to represent the only embodiments in which the invention may be practiced. The detailed description includes specific details for the purpose of providing a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these specific details. In some instances, well-known structures and components are shown in block diagram form in order to avoid obscuring the concepts of the invention.

[0014] FIG. 1 illustrates an exemplary wireless communication system 100 in which embodiments of the present invention may be advantageously employed. For illustrative purposes, FIG. 1 shows three remote units 120, 130, and 150 and two base stations 140. It will be appreciated that a typical wir...



Abstract

When misses occur in an instruction cache, prefetching techniques are used that minimize miss rates, memory access bandwidth, and power use. One of the prefetching techniques operates when a miss occurs. A notification that a fetch address missed in an instruction cache is received. The fetch address that caused the miss is analyzed to determine an attribute of the fetch address and based on the attribute a line of instructions is prefetched. The attribute may indicate that the fetch address is a target address of a non-sequential operation. Another attribute may indicate that the fetch address is a target address of a non-sequential operation and the target address is more than X % into a cache line. A further attribute may indicate that the fetch address is an even address in the instruction cache. Such attributes may be combined to determine whether to prefetch.
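The decision rule in the abstract can be sketched in code. This is a minimal illustrative sketch, not Qualcomm's implementation: the 64-byte line size, the 50% default threshold (standing in for the abstract's unspecified X%), and all function and parameter names are assumptions.

```python
# Hypothetical sketch of the attribute-based prefetch decision described
# in the abstract: prefetch the next line only when the miss was caused
# by a non-sequential target that lands deep into its cache line.

LINE_SIZE = 64  # assumed cache-line size in bytes

def should_prefetch_next_line(fetch_addr, is_nonsequential_target,
                              threshold_pct=50):
    """Combine two of the attributes named in the abstract.

    Prefetch when the missing fetch address is the target of a
    non-sequential operation (e.g. a taken branch) AND the target lies
    more than `threshold_pct` percent into its cache line, so few
    sequential instructions remain in the line that missed.
    """
    offset_pct = (fetch_addr % LINE_SIZE) * 100 // LINE_SIZE
    return is_nonsequential_target and offset_pct > threshold_pct

def next_line_address(fetch_addr):
    # Address of the sequentially next cache line, the prefetch candidate.
    return (fetch_addr // LINE_SIZE + 1) * LINE_SIZE
```

For example, a branch target at offset 48 of a 64-byte line sits 75% into the line, so the next line is prefetched; a target at offset 8 (12% in) leaves most of the current line still useful, so no prefetch is issued.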

Description

Technical Field

[0001] The present invention relates generally to the field of instruction caches and, more specifically, to instruction prefetching when there is a miss in an instruction cache.

Background

[0002] Many portable products (e.g., cell phones, laptops, personal digital assistants (PDAs), etc.) utilize processors that execute programs (e.g., communication and multimedia programs). Processing systems for such products include processor and memory complexes for storing instructions and data. Bulk main memory typically has slow access times compared to processor cycle times. Accordingly, memory complexes are conventionally organized in a hierarchy based on the capacity and performance of the caches, with the highest-performance, lowest-capacity caches located closest to the processor. For example, a level 1 instruction cache and a level 1 data cache would typically be attached directly to the processor, while a level 2 unified cache is connected to the level 1 ...

Claims


Application Information

IPC(8): G06F 9/38; G06F 12/08
CPC: G06F 12/0862; G06F 9/3802; G06F 9/30; G06F 9/34; G06F 9/32; G06F 12/08
Inventor: Michael William Morrow, James Norris Dieffenderfer
Owner QUALCOMM INC