Data reading/writing method and device and computer system on basis of multi-level Cache

A data reading and caching technology, applied to memory systems, data-processing input/output, and computing generally. It addresses the problem of low cache access efficiency: it reduces the cache miss rate, improves cache access efficiency, and raises the cache hit rate.

Active Publication Date: 2015-02-11
HUAWEI TECH CO LTD +1

AI Technical Summary

Problems solved by technology

[0005] In the above process, multiple lookups are usually needed before a hit occurs. For example, if the data targeted by the read operation is stored in

Method used



Examples


Embodiment 1

[0108] The following provides a specific implementation of the multi-level-cache-based data reading method in the computer system described above.

[0109] In the multi-level-cache-based data reading method provided by this embodiment, the attribute information of a memory page further includes cache location attribute information of the physical memory page, and this attribute is set according to how the physical memory page has been accessed.

[0110] As shown in Figure 3, the multi-level-cache-based data reading method includes the following steps:

[0111] Step 1. Set the cache location attribute of the memory page.

[0112] In this specific embodiment, based on how a physical memory page has been accessed, bits are added to its page-table entry to record Cache Access Level (CAL) information; the CAL identifies the highest ac...

Embodiment 2

[0124] For the L2 Cache miss in step ④ of Embodiment 1, another embodiment is provided. In this embodiment, the first four steps are the same as in Embodiment 1 and are not repeated here.

[0125] If the L2 Cache misses in step ④ of Embodiment 1, then, as shown in Figure 4, this embodiment further includes:

[0126] Step ⑤: The multi-level-cache-based data read/write device in the CPU core uses the high-order bits PA_3 of the physical address to query whether the next-level L3 Cache is hit.

[0127] The multi-level-cache-based data read/write device in the CPU core compares the high-order bits PA_3 of the physical address with the Tags in the L3 Cache, obtaining a result of whether the L3 Cache hits.

[0128] If the L3 Cache hits, proceed to step ⑥; if the L3 Cache misses, proceed to step ⑦.

[0129] Step ⑥: If the L3 Cache hits, the data to be read is transmitted to the multi-level-cache-based data read/write device in the CPU core through...

Embodiment 3

[0133] The following provides a specific implementation of the multi-level-cache-based data writing method in the computer system described above.

[0134] This embodiment uses the same memory-page cache location attribute setting, that is, the same page table, as Embodiment 1. Therefore, when the multi-level-cache-based data read/write device in the CPU core initiates a data write request, the first four steps are the same as in Embodiment 1, the only difference being that the read-data request is replaced by a write-data request; they are not repeated here. If the L2 Cache is hit in step ④, then, as shown in Figure 5, this embodiment further includes:

[0135] Step ⑤: Write the data into the L2 Cache.

[0136] Further, if the data written in step ⑤ is shared data, the following steps may also be performed to maintain the consistency of the shared data.

[0137] Step ⑥: The multi-level cache-based data read...



Abstract

A multi-level-cache-based data reading/writing method and device, and a computer system, relating to the field of data reading/writing in computer systems and used to improve Cache access efficiency during data reading/writing. The method comprises: acquiring a first query address of a first physical memory data block for reading/writing data; acquiring a first cache position attribute of the first physical memory data block; using the first query address, querying the caches for a hit in sequence, in order from high to low among the cache levels that the first cache position attribute indicates the first physical memory data block can access, until one cache is hit or all caches have missed; if a cache is hit, reading/writing the data at the first query address of the first physical memory data block in the hit cache; otherwise, if no cache is hit, reading/writing the data at the first query address of the first physical memory data block in memory.

Description

technical field

[0001] The invention relates to the field of data reading and writing of computer systems, in particular to a multi-level-cache-based data reading/writing method, device, and computer system.

Background technique

[0002] A Cache (cache memory) sits between the central processing unit (CPU, Central Processing Unit) and memory, mainly to bridge the mismatch between CPU computing speed and memory read/write speed. At present, computers adopt at most a three-level Cache structure, in which L1 (Level 1) is a high-speed, small-capacity Cache, L2 (Level 2) is a medium-speed, larger-capacity Cache, and L3 (Level 3) is a lower-speed, large-capacity Cache.

[0003] Based on the above multi-level Cache, the specific process by which the CPU performs a read operation in the prior art is as follows:

[0004] When the CPU core sends a read-data request, it searches for the data level by level...

Claims


Application Information

IPC(8): G06F12/08, G06F12/0811, G06F12/0897, G06F12/1027
CPC: G06F12/08, G06F12/0897, G06F12/1027, G06F2212/1016, G06F3/0604, G06F3/064, G06F3/0683, G06F12/0811, G06F2212/50, G06F2212/60
Inventors: 李晔, 张立新, 侯锐, 张科
Owner HUAWEI TECH CO LTD