Method and equipment for managing hybrid cache

A caching method and device in the field of information technology, intended to solve the problem of a low cache hit rate.

Active Publication Date: 2014-10-08
XFUSION DIGITAL TECH CO LTD


Problems solved by technology

[0006] Embodiments of the present invention provide a method and device for managing a hybrid cache ...



Examples


Embodiment 1

[0089] Figure 1 is a schematic diagram of the architecture applicable to Embodiment 1 of the present invention.

[0090] As shown in Figure 1, the storage system in this architecture includes RAM 110, SSD 120, and a disk system. To ensure performance and reliability, the disk system is generally organized as a RAID (of course, if these advantages are not required, a RAID need not be constructed); it is represented by RAID 130 in the figure as an example. RAM 110 and SSD 120 together form a hybrid cache for RAID 130.

[0091] In Figure 1, the cache management module 140 manages the hybrid cache and RAID 130. The cache management module is a logically divided module, and it can be realized in various forms. For example, the cache management module may be a software module running on the host that manages storage devices attached to the host directly (Direct Attached) or through a network (such as a Storage Area Network), including the RAM and SSDs ...
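To make the architecture of Figure 1 concrete, the following is a minimal Python sketch of the three tiers and the cache management module, assuming a simplified read path; the class and method names (RamCache, SsdCache, RaidArray, CacheManager) are illustrative and are not taken from the patent.

```python
# Minimal sketch of the Figure 1 architecture (assumed simplification):
# RAM 110 and SSD 120 as two cache tiers in front of RAID 130, with a
# cache management module deciding where a read is served from.

class RamCache:
    """First cache tier (RAM 110): small, fast, volatile."""
    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.blocks = {}           # block_id -> data

class SsdCache:
    """Second cache tier (SSD 120): larger and slower than RAM, persistent."""
    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.blocks = {}

class RaidArray:
    """Backing disk system (RAID 130); a dict stands in for the disks here."""
    def __init__(self):
        self.blocks = {}
    def read(self, block_id):
        return self.blocks.get(block_id, b"\x00" * 4096)

class CacheManager:
    """Logical module (cache management module 140) in front of the RAID."""
    def __init__(self, ram, ssd, raid):
        self.ram, self.ssd, self.raid = ram, ssd, raid

    def read(self, block_id):
        # Serve from RAM first, then SSD, then fall back to the RAID.
        if block_id in self.ram.blocks:
            return self.ram.blocks[block_id]
        if block_id in self.ssd.blocks:
            return self.ssd.blocks[block_id]
        data = self.raid.read(block_id)
        self.ram.blocks[block_id] = data   # naive fill; queue logic comes later
        return data

# Usage: a read that misses both cache tiers falls through to the RAID.
mgr = CacheManager(RamCache(1024), SsdCache(65536), RaidArray())
data = mgr.read("lba-0007")
```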

Embodiment 2

[0133] Building on Embodiment 1, this embodiment of the present invention describes the above solution through a specific execution process, which includes the following steps:

[0134] Figure 3 is a schematic flowchart of a method for managing a hybrid cache according to an embodiment of the present invention. The method in Figure 3 is executed by the device that manages the hybrid cache, for example the cache management module 140 shown in Figure 1.

[0135] The hybrid cache includes RAM and SSD; the RAM and SSD work together as the cache of the RAID.

[0136] 210. Receive the current I/O request from the application layer.

[0137] 220. Determine the hit result of the current I/O request, where the hit result is used to indicate whether the I/O request hits one of the first queue, the second queue, the third queue, the fourth queue, or the fifth queue, and where the first queue is used to record information about a first part of the data blocks in the RAM, the ...
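As a rough illustration of steps 210 and 220, the sketch below classifies an incoming request against the five queues. Only the second, third, and fifth queues are fully described in this excerpt, so the first and fourth are treated here simply as additional metadata queues; HitResult, determine_hit, and the lookup order are assumptions, not the patent's specification.

```python
# Hedged sketch of step 220: deciding which of the five queues the current
# I/O request hits. Queue roles follow this excerpt where stated; the rest
# is assumed for illustration.

from collections import OrderedDict
from enum import Enum

class HitResult(Enum):
    MISS = 0
    Q1 = 1   # first queue  (records info about a first part of RAM blocks)
    Q2 = 2   # second queue (hot clean blocks in RAM)
    Q3 = 3   # third queue  (dirty blocks in RAM)
    Q4 = 4   # fourth queue (additional metadata queue, assumed)
    Q5 = 5   # fifth queue  (blocks cached in the SSD)

def determine_hit(block_id, queues):
    """queues maps HitResult.Q1..Q5 to OrderedDicts keyed by block id."""
    for result in (HitResult.Q2, HitResult.Q3, HitResult.Q5,
                   HitResult.Q1, HitResult.Q4):
        if block_id in queues[result]:
            return result
    return HitResult.MISS

# Usage: build empty queues, then classify an incoming request (steps 210/220).
queues = {q: OrderedDict() for q in HitResult if q is not HitResult.MISS}
queues[HitResult.Q2]["blk-42"] = {"dirty": False}
print(determine_hit("blk-42", queues))   # HitResult.Q2
print(determine_hit("blk-99", queues))   # HitResult.MISS
```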

Embodiment 3

[0347] Referring to Figure 16, and based on the above-mentioned embodiments, an embodiment of the present invention provides a device 300 for managing a hybrid cache, wherein the hybrid cache includes a random access memory (RAM) and a solid-state memory (SSD), and the RAM and SSD together serve as the cache of a disk system composed of one or more disks;

[0348] The device includes:

[0349] The generating unit 301 is configured to generate a second queue, a third queue, and a fifth queue, wherein the second queue is used to manage hot clean data blocks in the RAM, and the hot clean data blocks are found through the second queue; the third queue is used to manage dirty data blocks in the RAM, and the dirty data blocks are found through the third queue; the sum of the lengths of the second queue and the third queue remains unchanged; and the fifth queue is used to manage data blocks in the SSD;

[0350] The elimination unit 302 is configured to, when there is a new data block that needs to be managed by the second queue or the ...
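A hedged sketch of how device 300 might be decomposed into the two units named above: a generating unit that builds the second, third, and fifth queues (with the combined length of the second and third queues held at the RAM capacity) and an elimination unit that frees a slot when a new block must enter the RAM. Class and attribute names are illustrative; the eviction policy itself is left as a callback.

```python
# Illustrative decomposition of device 300 into generating and elimination
# units, based only on what this excerpt states; not the patent's own code.

from collections import OrderedDict

class GeneratingUnit:
    """Unit 301: generates the second, third, and fifth queues."""
    def __init__(self, ram_capacity_blocks, ssd_capacity_blocks):
        self.q2 = OrderedDict()   # hot clean blocks in RAM
        self.q3 = OrderedDict()   # dirty blocks in RAM
        self.q5 = OrderedDict()   # blocks cached in the SSD
        # The sum of the lengths of q2 and q3 stays fixed at the RAM capacity.
        self.ram_capacity = ram_capacity_blocks
        self.ssd_capacity = ssd_capacity_blocks

class EliminationUnit:
    """Unit 302: makes room when a new block needs to enter q2 or q3."""
    def __init__(self, gen_unit):
        self.g = gen_unit

    def ensure_room_in_ram(self, evict_fn):
        # If q2 + q3 is already at RAM capacity, evict one block first,
        # using whatever policy evict_fn implements.
        if len(self.g.q2) + len(self.g.q3) >= self.g.ram_capacity:
            evict_fn(self.g)
```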



Abstract

The embodiment of the invention provides a method for managing a hybrid cache. The hybrid cache comprises a RAM and a solid-state memory (SSD). The method comprises the following steps. A second queue and a third queue are generated, where the second queue is used for managing hot clean data blocks in the RAM and the third queue is used for managing dirty data blocks in the RAM. The sum of the length of the second queue and the length of the third queue is unchanged; the elimination probability of the second queue is higher than that of the third queue, and the length of each of the two queues changes dynamically with the elimination operations executed on it. Whether the access frequency of a data block eliminated from the second queue or the third queue exceeds an access frequency threshold is determined; if so, the data block is judged to be a long-term hot data block, is managed by a fifth queue, and is written into the SSD, where the fifth queue is used for managing the data blocks in the SSD.
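The eviction and promotion flow described in this abstract can be sketched as follows, assuming LRU-style ordering within each queue; the probability value P_EVICT_Q2, the threshold FREQ_THRESHOLD, and the helper names are illustrative assumptions rather than values given by the patent.

```python
# Hedged sketch of the abstract's flow: evict from the second queue (hot
# clean RAM blocks) with higher probability than from the third queue
# (dirty RAM blocks); if the evicted block's access frequency exceeds a
# threshold, treat it as long-term hot, manage it with the fifth queue,
# and write it to the SSD.

import random
from collections import OrderedDict

P_EVICT_Q2 = 0.7        # assumed: q2 eviction probability > q3's
FREQ_THRESHOLD = 4      # assumed access-frequency threshold

def evict_one(q2, q3, q5, access_freq, write_block_to_ssd):
    """Evict one block from RAM (q2 or q3) and promote it to SSD if hot."""
    # Choose the victim queue: q2 with higher probability than q3,
    # falling back to whichever queue is non-empty.
    if q2 and (not q3 or random.random() < P_EVICT_Q2):
        victim_id, _ = q2.popitem(last=False)   # oldest entry of q2
        dirty = False
    else:
        victim_id, _ = q3.popitem(last=False)   # oldest entry of q3
        dirty = True

    # (A dirty block would also be written back to the disk system before
    # leaving RAM; that detail is not spelled out in this excerpt.)

    # Promotion check: a frequently accessed evicted block is long-term hot.
    if access_freq.get(victim_id, 0) > FREQ_THRESHOLD:
        q5[victim_id] = {"dirty": dirty}        # now managed by the fifth queue
        write_block_to_ssd(victim_id)
    return victim_id

# Usage with toy queues (OrderedDict preserves insertion order for LRU-style eviction):
q2 = OrderedDict({"a": {}, "b": {}})
q3 = OrderedDict({"c": {}})
q5 = OrderedDict()
freq = {"a": 10, "b": 1, "c": 2}
evicted = evict_one(q2, q3, q5, freq, write_block_to_ssd=lambda bid: None)
print(evicted, dict(q5))
```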

Description

Technical field

[0001] The present invention relates to the field of information technology, and in particular, to a method and device for managing a hybrid cache.

Background technique

[0002] At present, in order to improve the input/output (I/O) performance of a disk system, for example by shortening request response times and increasing throughput, most storage systems adopt caching technology above the disk system.

[0003] For example, in terms of storage structure, a typical structure is "random access memory (Random Access Memory, RAM) + redundant array of inexpensive disks (Redundant Arrays of Inexpensive Disks, RAID)", in which only the RAM is used as the cache of the RAID. Such a cache, built from a single kind of medium, is referred to as a single-medium cache.

[0004] Recently, due to the widespread application of solid-state memory (Solid State Disk, SSD), a three-level storage structure, "RAM+SSD+RAID", has been proposed, in which the RAM and SSD work together as a RAID c...


Application Information

IPC(8): G06F12/08, G06F3/06
Inventor: 万继光, 马晓慧, 程龙
Owner: XFUSION DIGITAL TECH CO LTD