Cache management method and device

A cache management method and device, applied in the computer field, that can solve problems such as cache pollution

Active Publication Date: 2014-10-01
HUAWEI TECH CO LTD +1

Problems solved by technology

[0007] The embodiments of the present invention provide a cache management method and device, which can at least effectively solve problems such as cache pollution.



Examples


Embodiment 1

[0110] An embodiment of the present invention provides a cache management method in which the cache is divided into two parts: a solid cache (Solid Cache) and a virtual cache (Phantom Cache), as shown in the cache schematic of Figure 3. The solid cache is maintained by the linked list L1 and the virtual cache by the linked list L2. Both the metadata and the data of a page are stored in the solid cache, while only the metadata is stored in the virtual cache. Note that because the virtual cache stores only metadata, and the metadata records only a page's access information, a requested page that hits the linked list L2 is not a real cache hit.

[0111] In this embodiment of the present invention, the solid-cache linked list L1 can be divided into one or more segments. Preferably, the linked list L1 is divided into 4 segments, and the number of pages stored in each segment may differ (for the co...
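The two-part structure above can be sketched as follows. This is a minimal illustration only, assuming simple FIFO ordering inside each list; the class and field names are hypothetical, not taken from the patent:

```python
from collections import OrderedDict

class TwoPartCache:
    """Solid cache (L1): metadata + data. Virtual cache (L2): metadata only."""

    def __init__(self, solid_capacity, virtual_capacity):
        self.l1 = OrderedDict()           # page_id -> (metadata, data)
        self.l2 = OrderedDict()           # page_id -> metadata only
        self.solid_capacity = solid_capacity
        self.virtual_capacity = virtual_capacity

    def lookup(self, page_id):
        """Return (hit, data). A hit in L2 is NOT a real cache hit."""
        if page_id in self.l1:
            meta, data = self.l1[page_id]
            meta["accesses"] += 1
            return True, data
        if page_id in self.l2:
            # Metadata only: record the access, but the data is not cached.
            self.l2[page_id]["accesses"] += 1
        return False, None

    def insert(self, page_id, data):
        if len(self.l1) >= self.solid_capacity:
            victim, (vmeta, _) = self.l1.popitem(last=False)  # oldest page
            if len(self.l2) >= self.virtual_capacity:
                self.l2.popitem(last=False)
            self.l2[victim] = vmeta       # demote: keep only the metadata
        self.l1[page_id] = ({"accesses": 0}, data)
```

The key property the sketch preserves is that a page demoted into L2 still records accesses in its metadata, but `lookup` reports it as a miss.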

Embodiment 2

[0125] This embodiment of the present invention provides a cache management method similar to that of Embodiment 1, and it uses the same strategy for deleting pages from the solid cache, i.e., the scheme of Embodiment 1, so that the data of the page that finally meets the condition is deleted. On top of that strategy, this embodiment adds a scheme in which a newly requested page uses the space freed by the deleted page. For details, see Figure 5: the solid cache is managed and maintained through the linked list L1, which is divided into at least one segment; the division of each segment is fixed, meaning each segment has a definite storage space, but when a replacement-candidate page needs to be added, the capacity requirements of the segments must be met; the approa...
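One way to read the fixed segmentation is as a chain of fixed-capacity queues in which overflow is demoted toward the eviction end. The following sketch works under that assumption; segment sizes and names are illustrative, not from the patent:

```python
from collections import deque

class SegmentedL1:
    """Linked list L1 split into fixed-size segments; segment 0 is the
    eviction end. Adding to a full segment demotes its oldest page."""

    def __init__(self, sizes):
        self.sizes = sizes
        self.segments = [deque() for _ in sizes]

    def add(self, page, seg):
        """Add a page to segment `seg`; return the evicted page, if any."""
        self.segments[seg].append(page)
        for idx in range(seg, -1, -1):
            if len(self.segments[idx]) <= self.sizes[idx]:
                break                      # no overflow below this point
            demoted = self.segments[idx].popleft()
            if idx == 0:
                return demoted             # fell off the eviction end
            self.segments[idx - 1].append(demoted)
        return None
```

The page that falls off segment 0 frees exactly one slot, which the newly requested page then occupies, matching the reuse scheme described above.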

Embodiment 3

[0147] This embodiment of the present invention provides a cache management method based on the same inventive concept as Embodiments 1 and 2. The difference is that Embodiments 1 and 2 maintain the solid cache through a single linked list L1, whereas this third embodiment maintains the solid cache through multiple linked lists, whose number can equal the number of segments of the linked list described in the above embodiments; for example, a linked list L1 divided into 4 segments corresponds to 4 linked lists. This number of linked lists is only an example for ease of understanding and does not limit the embodiments of the present invention.

[0148] As shown in the cache diagram of Figure 6, four linked lists L1 to L4 are used for the solid cache and the linked list L0 for the virtual cache. The cache within the dotted line can be understood as the virtual cache. The virtual cache can be used as a preferr...
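Under the multi-list variant, the segments become independent linked lists. A structural sketch follows, using the list names from Figure 6; the promotion and demotion operations are assumptions for illustration:

```python
from collections import deque

# Four linked lists for the solid cache and one (L0) for the virtual cache.
solid = {name: deque() for name in ("L1", "L2", "L3", "L4")}
virtual_l0 = deque()                  # metadata only: page ids live here

def promote(page_id, src, dst):
    """Move a page between solid-cache lists, e.g. on repeated access."""
    solid[src].remove(page_id)
    solid[dst].append(page_id)

def demote_to_virtual(page_id, src):
    """Evict a page's data; keep only its metadata in L0."""
    solid[src].remove(page_id)
    virtual_l0.append(page_id)
```

Compared with one segmented list, separate lists make cross-list moves explicit operations rather than pointer adjustments within a single chain.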



Abstract

The embodiments of the invention disclose a cache management method and a cache management device. Under this scheme, when data is about to be replaced, if it was accessed during its cache period (i.e., its access count is greater than 0), the data is added back into the cache. Data with different access counts is added at different positions, and data with higher access counts is added at positions from which it is harder to replace, so access frequency carries more weight and more frequently accessed data is harder to replace. In addition, for the same total number of accesses, data with long-term, uniform access stays in the cache longer than data with short-term, concentrated access, so the data better suited to the cache's access pattern is retained. Further, when the data of a re-added page returns to the cache, its access count is reset, so the accesses within one period yield their gain only once; this prevents data from being pinned in the cache by an access count accumulated through a burst of accesses in a short time.
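The replacement rule described in the abstract can be sketched as follows. The mapping from access count to re-insert position (`position_for`) is an assumption supplied by the caller; index 0 is the eviction end:

```python
from collections import deque

def evict_one(pages, counts, position_for):
    """Delete and return one page. `pages` is a deque with index 0 at the
    eviction end; `counts` maps page -> accesses in the current period."""
    while True:
        page = pages.popleft()
        if counts[page] == 0:
            del counts[page]
            return page                    # not accessed: really evict
        # Accessed during its cache period: re-add at a position that
        # grows with the access count, then reset the count so one
        # period's accesses yield their gain only once.
        pos = min(position_for(counts[page]), len(pages))
        pages.insert(pos, page)
        counts[page] = 0
```

Because every re-added page has its count reset to zero, repeated bursts of access cannot accumulate across periods, which is exactly the anti-pinning property the abstract claims.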

Description

technical field

[0001] The present invention relates to the field of computers, and in particular to a cache management method and device.

Background technique

[0002] In the computer field, caching is a basic research topic. A cache stores data across different levels of storage media: for example, a cache medium with a high access speed but small storage capacity is placed in front of a storage medium with a lower speed but large capacity, and the small, fast cache medium stores frequently used data so that the device can respond quickly to user requests.

[0003] A cache algorithm manages the contents of the cache so that more requests can be served from the better-performing cache, avoiding retrieval from the slower lower-level storage and thereby improving the performance of the whole system. Caches are widely used in databases, virtual memory management, storage systems, and so on.

[0004] In the prior art, th...

Claims


Application Information

IPC(8): G06F12/12, G06F12/121
Inventor 姜继熊劲蒋德钧
Owner HUAWEI TECH CO LTD