Novel context instruction cache architecture for a digital signal processor

A digital signal processor and context instruction cache technology, applied in the field of memory address allocation/relocation, that can solve the problems of high interrupt rates and high CPU clock rates, and achieves the effect of reducing cache thrashing.

Status: Inactive
Publication Date: 2008-07-17
Assignee: ANALOG DEVICES INC

AI Technical Summary

Benefits of technology

[0007]According to an aspect of the subject matter, there is provided a method for reducing cache thrashing in a DSP, comprising the steps of dynamically enabling caching of instructions upon encountering current frequently executed instructions in a program, and dynamically disabling the caching of the instructions upon encountering an exit point associated with the frequently executed instructions.
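The paragraph above states the method only at a high level. The following C sketch is a purely illustrative software model of the idea, assuming a small direct-mapped instruction cache whose fill path is gated by an enable flag that is toggled at the entry and exit of frequently executed code; the names icache_t, context_enter, context_exit, and fetch are invented for this example and do not come from the patent.

```c
/* Conceptual sketch (not the patented hardware): a cache model whose fill
 * path is enabled only while a frequently executed "context" is active. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define CACHE_LINES 16

typedef struct {
    bool     valid[CACHE_LINES];
    uint32_t tag[CACHE_LINES];
    bool     enabled;              /* caching is gated by this flag */
} icache_t;

/* Enable caching when frequently executed code (e.g., a loop body) is entered. */
static void context_enter(icache_t *c) { c->enabled = true; }

/* Disable caching at the exit point so other code cannot evict the loop body. */
static void context_exit(icache_t *c)  { c->enabled = false; }

/* Returns true on a cache hit; allocates a line only while caching is enabled. */
static bool fetch(icache_t *c, uint32_t addr)
{
    uint32_t line = addr % CACHE_LINES;
    uint32_t tag  = addr / CACHE_LINES;

    if (c->valid[line] && c->tag[line] == tag)
        return true;               /* hit: no program-memory bus conflict */

    if (c->enabled) {              /* miss: fill only inside an active context */
        c->valid[line] = true;
        c->tag[line]   = tag;
    }
    return false;
}

int main(void)
{
    icache_t ic = {0};

    context_enter(&ic);            /* frequently executed code encountered */
    for (int pass = 0; pass < 2; pass++)
        for (uint32_t pc = 0x100; pc < 0x108; pc++)
            printf("pass %d, pc 0x%x: %s\n", pass, (unsigned)pc,
                   fetch(&ic, pc) ? "hit" : "miss");
    context_exit(&ic);             /* exit point: caching disabled again */
    return 0;
}
```

In this model the second pass over the loop body hits on every fetch, while any code executed after context_exit neither pollutes nor evicts the cached instructions.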

Problems solved by technology

DSP applications are characterized by real-time operation, high interrupt rates, and intensive numeric computations.
In addition, DSP applications tend to be intensive in memory access operations and to require the input and output of large quantities of data.
Thus, designs of DSPs may be quite different from those of general purpose processors.
When instructions and data are stored in the program memory, conflicts may arise in the fetching of instructions.
Further, in the case of Harvard architecture, the instruction fetch and the data access can take place in the same clock cycle, which can lead to a conflict on the program memory bus.
In this scenario, an instruction that would normally be fetched in a single clock cycle can stall for a cycle because of the conflict.
Most cache architectures suffer from performance degradation due to cache thrashing, i.e., loading an instruction into the cache and then evicting it, while it is still needed, before the processor can reuse it (a minimal illustration appears at the end of this section).
Cache thrashing is undesirable, as it erodes the performance gains the cache is meant to provide.
Techniques that mitigate thrashing, however, come with overheads such as extra hardware, increased cache hit access time, and/or higher software overhead.
In particular, locking code into the cache requires additional overhead in the form of profiling of the code by the user and extra instructions in the code to lock the cache.
Further, this can make the code very cumbersome.
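As a concrete illustration of the thrashing problem described above, the sketch below assumes an 8-line direct-mapped instruction cache and two arbitrarily chosen code regions that alias to the same lines; the geometry and addresses are invented for the example and are not taken from the patent.

```c
/* Illustrative sketch of cache thrashing: two code regions that alias to the
 * same direct-mapped cache lines evict each other on every iteration, so a
 * repeatedly executed loop never hits despite heavy reuse. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define LINES 8

static uint32_t tags[LINES];
static bool     valid[LINES];
static unsigned hits, misses;

static void access_instr(uint32_t addr)
{
    uint32_t line = addr % LINES, tag = addr / LINES;
    if (valid[line] && tags[line] == tag) { hits++; return; }
    misses++;                /* evict whatever was there, even if still needed */
    valid[line] = true;
    tags[line]  = tag;
}

int main(void)
{
    /* A loop body at 0x00..0x07 and a called helper at 0x40..0x47 map to the
     * same 8 lines, so each pass of the loop re-evicts the other block. */
    for (int i = 0; i < 100; i++) {
        for (uint32_t pc = 0x00; pc < 0x08; pc++) access_instr(pc);
        for (uint32_t pc = 0x40; pc < 0x48; pc++) access_instr(pc);
    }
    printf("hits=%u misses=%u\n", hits, misses);  /* every access misses */
    return 0;
}
```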

Method used


Examples


Embodiment Construction

[0014]In the following detailed description of the various embodiments of the invention, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.

[0015]The terms “cache”, “cache memory”, “instruction cache memory”, “conflict cache memory” are used interchangeably throughout the document. Also, the terms “thrashing” and “cache thrashing” are used interchangeably throughout the document. In addition, the terms “code”, “instructions”, and ...



Abstract

Improved thrashing-aware and self-configuring cache architectures for a DSP that reduce cache thrashing without increasing cache size or degrading cache hit access time. In one example embodiment, this is accomplished by selectively caching only the instructions that have a higher probability of recurrence, which considerably reduces cache thrashing.
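The abstract does not specify how instructions "having a higher probability of recurrence" are identified. One common heuristic, shown here only as an assumption and not as the patented mechanism, is to treat the target of a taken backward branch as the entry of a loop whose body is about to recur; the helper name is hypothetical.

```c
/* Hedged sketch: a hypothetical heuristic for spotting instructions with a
 * high probability of recurrence. It is NOT taken from the patent text. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* A taken branch whose target lies at or below the branch address closes a
 * loop, so the instructions at the target are likely to be re-executed and
 * are therefore worth caching. */
static bool is_likely_recurrent(uint32_t branch_addr, uint32_t target_addr,
                                bool taken)
{
    return taken && target_addr <= branch_addr;
}

int main(void)
{
    /* Backward branch from 0x120 to 0x100: a loop, so caching pays off. */
    printf("%d\n", is_likely_recurrent(0x120, 0x100, true));  /* prints 1 */
    /* Forward branch (e.g., an if/else skip): no recurrence expected. */
    printf("%d\n", is_likely_recurrent(0x120, 0x140, true));  /* prints 0 */
    return 0;
}
```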

Description

TECHNICAL FIELD OF THE INVENTION

[0001] The present invention relates to digital signal processors, and more particularly to real-time memory management for digital signal processors.

BACKGROUND OF THE INVENTION

[0002] A digital signal computer or digital signal processor (DSP) is a special purpose computer that is designed to optimize performance for digital signal processing applications, such as, for example, fast Fourier transforms, digital filters, image processing and speech recognition. DSP applications are characterized by real-time operation, high interrupt rates, and intensive numeric computations. In addition, DSP applications tend to be intensive in memory access operations and to require the input and output of large quantities of data. Thus, designs of DSPs may be quite different from those of general purpose processors.

[0003] One approach that has been used in the architecture of DSPs is the Harvard architecture, which utilizes separate, independent program and data memorie...

Claims


Application Information

Patent Type & Authority: Application (United States)
IPC(8): G06F12/08
CPC: G06F9/381, Y02B60/1225, G06F12/0888, Y02D10/00
Inventors: RINGE, TUSHAR PRAKASH; GIRI, ABHIJIT
Owner: ANALOG DEVICES INC