
Cache memory system, and control method therefor

A memory system and cache technology, applied in the field of memory systems, instruments, and memory address allocation/relocation, which improves caching efficiency while avoiding excessive hardware scale.

Inactive Publication Date: 2012-02-22
PANASONIC CORP

AI Technical Summary

Problems solved by technology

[0005] However, according to the conventional technology, hardware monitors the status of accesses to the memory and performs cache operations autonomously. Because the hardware cannot predict access patterns accurately, unnecessary transfers occur instead.




Embodiment Construction

[0084] Hereinafter, embodiments of the present invention will be described with reference to the drawings.


[0086] Figure 1 is a block diagram showing the overall configuration of a computer system including a processor 1, a memory 2, a cache memory 3, and a TAC 4 according to an embodiment of the present invention. The cache memory 3 and the TAC 4 of this embodiment correspond to the cache memory system of the present invention.

[0087] When the processor 1 executes a predetermined instruction, the TAC 4 receives a command indicating a transfer or attribute operation on cache data, together with an address designating the object of the operation, and the TAC 4 requests the cache memory 3 to perform the indicated operation.

[0088] The cache memory 3 caches data in accordance with the processor 1's accesses to the memory, like an ordinary cache memory. In addition, while the processor 1 is not performing memory access processing, it execu...
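The command flow described in paragraphs [0086]–[0088] can be sketched as a toy simulation. This is a minimal illustration, not the patent's implementation: the class names, the `"PREFETCH"` and `"WEAK"` operation codes, and the dictionary-based cache model are all illustrative assumptions.

```python
class CacheMemory:
    """Toy model of the cache memory (3): holds lines keyed by address."""
    def __init__(self, memory):
        self.memory = memory      # backing memory (2), modeled as a dict
        self.lines = {}           # address -> cached data
        self.attrs = {}           # address -> attribute flags

    def handle(self, op, address):
        if op == "PREFETCH":      # transfer operation: fill a line early
            self.lines[address] = self.memory[address]
        elif op == "WEAK":        # attribute operation (illustrative name)
            self.attrs[address] = "weak"

class TAC:
    """Toy model of the Transfer and Attribute Controller (4)."""
    def __init__(self, cache):
        self.cache = cache

    def command(self, op, address):
        # Invoked when the processor (1) executes the predetermined
        # instruction; the TAC forwards the requested operation and the
        # target address to the cache memory.
        self.cache.handle(op, address)

memory = {0x100: "data"}
cache = CacheMemory(memory)
tac = TAC(cache)
tac.command("PREFETCH", 0x100)    # software-directed prefetch via the TAC
```

After the `command` call, the cache holds the line for `0x100` even though the processor never accessed it directly, which is the software-controlled behavior the abstract describes.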



Abstract

Provided is a cache memory system that actively accepts and processes control from software. The cache memory system comprises a cache memory (3) interposed between a processor (1) and a memory (2), and a TAC (Transfer and Attribute Controller) (4) for controlling the cache memory. When the processor (1) executes a predetermined instruction, the TAC (4) is given a command indicating a transfer or attribute operation on cache data, together with an address designating the object of the operation, and requests the cache memory to perform the indicated operation on that address.

Description

Technical field

[0001] The present invention relates to a cache memory system and a control method therefor, and in particular to a technique for improving the controllability of a cache memory system by software.

Background technique

[0002] Recent microprocessors are equipped with a small-capacity, high-speed cache memory, built from Static Random Access Memory (SRAM), inside or near the microprocessor. By storing part of the data in the cache memory, the microprocessor can access memory at high speed.

[0003] Conventional techniques for improving cache efficiency (increasing the hit rate and reducing cache-miss latency) are well known. One such technique is preloading (or prefetching): filling the cache with data that will be used in the near future, before a cache miss occurs (for example, Patent Document 1). With this technique, cache misses are reduced by loading onto the cache the line that contains the address...
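The preloading idea from paragraph [0003] can be illustrated with a small sketch: fetching the line that contains an address before the processor touches it turns what would have been a cache miss into a hit. The direct-mapped layout, the 16-byte line size, and all names here are illustrative assumptions, not details from the patent.

```python
LINE_SIZE = 16  # bytes per cache line (assumed for illustration)

class DirectMappedCache:
    """Toy direct-mapped cache that only tracks tags and miss counts."""
    def __init__(self, num_lines):
        self.num_lines = num_lines
        self.tags = [None] * num_lines   # tag stored per cache index
        self.misses = 0

    def _index_tag(self, address):
        line = address // LINE_SIZE
        return line % self.num_lines, line

    def prefetch(self, address):
        # Preloading: fill the line before the processor needs it.
        idx, tag = self._index_tag(address)
        self.tags[idx] = tag

    def access(self, address):
        idx, tag = self._index_tag(address)
        if self.tags[idx] != tag:
            self.misses += 1             # demand miss: fill on access
            self.tags[idx] = tag

cold = DirectMappedCache(4)
cold.access(0x40)                        # first touch: cache miss

warm = DirectMappedCache(4)
warm.prefetch(0x40)                      # preload before first use
warm.access(0x40)                        # hit: no miss is recorded
```

The cold cache records one miss on `0x40`, while the prefetched cache records none, which is the hit-rate improvement the background section attributes to preloading.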

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F12/12, G06F12/08
CPC: G06F12/12, G06F12/0802, G06F12/0893, G06F12/00
Inventors: 冈林叶月, 田中哲也, 中西龙太, 中岛雅逸, 金子圭介
Owner: PANASONIC CORP