
Cache memory, memory system and control method therefor

A memory system and cache memory technology in the field of memory systems. It addresses the problems of a complicated interface between the DMAC 105 and the cache memory 102 and of increased area, and achieves the effects of suppressing interface complexity and reducing area.

Status: Inactive · Publication Date: 2011-08-24
PANASONIC CORP
Cites: 4 · Cited by: 11

AI Technical Summary

Problems solved by technology

[0022] However, the memory system 110 shown in Figure 15 requires the bus 106, so compared with the memory system 100 shown in Figure 14 it has the problem of increased area. This problem is particularly prominent when the memory system 110 contains multiple master devices such as the DMAC 105.

[0023] In addition, when a plurality of bus protocols are used for data transfer between the DMAC 105 and the memory 104, there is the problem that the interface between the DMAC 105 and the cache memory 102 becomes complicated.

Method used



Examples


Embodiment 1

[0085] In the cache memory according to the embodiment of the present invention, when a read access from a master such as a DMAC hits, the hit data is written back to the main memory and a read command is then output to the main memory. When a write access from the master hits, the hit data is invalidated and a write command is output to the main memory.

[0086] Accordingly, coherency between the cache memory and the main memory can be maintained even if a processor such as a CPU, or the master, does not perform flush processing. The cache memory according to the embodiment of the present invention can therefore suppress the drop in processor performance that would otherwise occur in order to maintain coherency between the cache memory and the main memory.

[0087] In addition, read data and write data are transferred directly between the master device and the main memory without passing through the cache memory. Thereby, there is no need to form a bus for transfe...
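The control flow described in paragraphs [0085] to [0087] can be sketched roughly as below. This is a minimal illustration only: the class and method names (L2Cache, dmacCommand, forwardToMemory, MainMemory) are assumptions and not taken from the patent, and real hardware would realize this as control logic around the tag and data arrays rather than as software.

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

enum class Op { Read, Write };

struct CacheLine {
    std::vector<uint8_t> data;
    bool valid = false;
    bool dirty = false;
};

// Stand-in for main memory behind the memory controller.
struct MainMemory {
    void write(uint32_t /*addr*/, const std::vector<uint8_t>& /*line*/) { /* DRAM write */ }
};

class L2Cache {
public:
    explicit L2Cache(MainMemory& mem) : mem_(mem) {}

    // A command arriving from a master such as the DMAC. The read/write data
    // itself moves directly between the master and main memory; the cache only
    // acts on its own copy to keep the two sides coherent.
    void dmacCommand(Op op, uint32_t lineAddr) {
        auto it = lines_.find(lineAddr);
        const bool hit = (it != lines_.end()) && it->second.valid;
        if (hit && op == Op::Read) {
            // Read hit: write the hit line back first, then let the read
            // command go through to main memory ([0085]).
            if (it->second.dirty) {
                mem_.write(lineAddr, it->second.data);
                it->second.dirty = false;
            }
        } else if (hit && op == Op::Write) {
            // Write hit: invalidate the cached copy so the processor cannot
            // later read stale data, then forward the write command ([0085]).
            it->second.valid = false;
        }
        forwardToMemory(op, lineAddr);   // command output toward main memory
    }

private:
    void forwardToMemory(Op, uint32_t) { /* issue the command on the memory side */ }

    MainMemory& mem_;
    std::unordered_map<uint32_t, CacheLine> lines_;
};
```

Because only commands pass through the cache while the data path runs directly between the master and main memory, no separate data bus toward the cache is needed, which matches the area-reduction effect claimed in [0087].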



Abstract

Disclosed is an L2 cache (202) provided with: a first port (211) for receiving commands from a CPU (201); a third port (213) for receiving commands from a DMAC (205); a bit decision unit (71) that, when a command is input to the third port (213), determines whether the data of the address specified in the command is stored in the L2 cache (202); and a DMAC access controller (63) that, when a command is input to the third port (213) and the bit decision unit (71) determines that the data is stored, performs processing to maintain coherency between the data stored in a memory (204) and the stored data, and outputs the input command to the memory (204) as the command output from the DMAC (205).
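Read structurally, the abstract describes a cache with separate CPU-side and DMAC-side ports, a decision unit, and a controller. The sketch below maps those components onto hypothetical C++ types purely for illustration; only the numeric suffixes follow the reference signs in the abstract, while the method names and trivial bodies are assumptions.

```cpp
#include <cstdint>

struct Command {
    bool isWrite;
    uint32_t addr;
};

// Decides whether the data at the address specified in a command is
// currently stored in the L2 cache (bit decision unit 71).
class BitDecisionUnit71 {
public:
    bool isStored(uint32_t /*addr*/) const { return false; /* tag lookup omitted */ }
};

// Handles commands arriving on the DMAC-side port: performs coherency
// processing when the data is stored, then outputs the command to memory 204
// as the command output from the DMAC 205 (DMAC access controller 63).
class DmacAccessController63 {
public:
    void handle(const Command& cmd, bool storedInCache) {
        if (storedInCache) {
            // write-back (for a read) or invalidation (for a write),
            // as in Embodiment 1 above
        }
        // ... forward cmd toward memory 204 ...
        (void)cmd;
    }
};

class L2Cache202 {
public:
    // First port 211: ordinary CPU-side cache access.
    void port211(const Command& /*cmdFromCpu201*/) { /* normal lookup path */ }

    // Third port 213: commands from the DMAC 205.
    void port213(const Command& cmdFromDmac205) {
        const bool stored = decide_.isStored(cmdFromDmac205.addr);
        ctrl_.handle(cmdFromDmac205, stored);
    }

private:
    BitDecisionUnit71 decide_;
    DmacAccessController63 ctrl_;
};
```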

Description

Technical field

[0001] The present invention relates to a cache memory, a memory system, and a control method therefor, and in particular to a cache memory that stores part of the data held in a main memory in response to accesses from a processor, and to a memory system including the cache memory.

Background technique

[0002] In recent memory systems, a small-capacity, high-speed cache memory composed of, for example, SRAM (Static Random Access Memory) is disposed inside or near the microprocessor. In such a memory system, memory access by the microprocessor is made faster by storing, in the cache memory (cache), part of the data that the microprocessor reads from the main memory and part of the data that it writes to the main memory.

[0003] Figure 14 is a diagram showing the configuration of a conventional memory system 100. The memory system 100 shown in Figure 14 has a CPU 101, a cache memory 102, a memory controller 103, a memo...
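As a generic illustration of the conventional behavior described in paragraph [0002] (this is textbook caching, not the patent's specific mechanism, and all names here are made up):

```cpp
#include <cstdint>
#include <optional>
#include <unordered_map>

// Small, fast storage that keeps copies of recently used main-memory data.
class SimpleCache {
public:
    std::optional<uint32_t> lookup(uint32_t addr) const {
        auto it = store_.find(addr);
        if (it == store_.end()) return std::nullopt;  // miss
        return it->second;                            // hit: fast SRAM access
    }
    void fill(uint32_t addr, uint32_t value) { store_[addr] = value; }
private:
    std::unordered_map<uint32_t, uint32_t> store_;
};

// A processor read is served from the cache when possible and from the
// slower main memory otherwise; fetched data is kept for later accesses.
uint32_t cpuRead(SimpleCache& cache, uint32_t addr,
                 uint32_t (*mainMemoryRead)(uint32_t)) {
    if (auto v = cache.lookup(addr)) return *v;
    uint32_t value = mainMemoryRead(addr);
    cache.fill(addr, value);
    return value;
}
```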

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08; G06F12/0815; G06F12/0844
CPC: G06F12/0815; G06F12/0844
Inventor: 礒野贵亘
Owner: PANASONIC CORP