SSD-based (Solid State Disk) cache management method and system

A cache management and caching technology, applied to memory systems, electrical digital data processing, and memory address/allocation/relocation, that addresses the problems of excessive small-grained random writes to the SSD and cache pollution.

Active Publication Date: 2012-10-31
Assignee: INST OF COMPUTING TECH CHINESE ACAD OF SCI +1

AI Technical Summary

Problems solved by technology

[0016] In order to solve the above-mentioned problems, and in view of the performance characteristics of SSD, the purpose of the present invention is to eliminate the excessive small-grained random writes and the serious cache pollution that the prior art imposes on the SSD. To this end, a new cache architecture composed of DRAM and SSD is proposed: a system in which DRAM serves as the first-level cache and SSD serves as the second-level cache.
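To make the proposed architecture concrete, the sketch below is a minimal illustration, not the patented implementation: the class, its dict-backed tiers, and the promotion rule are assumptions chosen only to show the read path of a DRAM first-level / SSD second-level cache in front of an HDD.

```python
# Minimal illustration (not the patented implementation) of the two-level
# cache: DRAM as first-level cache, SSD as second-level cache, HDD as the
# persistent store. Plain dicts stand in for the real devices.

class TieredCache:
    def __init__(self, hdd_data):
        self.dram = {}        # first-level cache
        self.ssd = {}         # second-level cache
        self.hdd = hdd_data   # persistent store: page_id -> data

    def read(self, page_id):
        if page_id in self.dram:                    # DRAM (first-level) hit
            return self.dram[page_id]
        if page_id in self.ssd:                     # SSD (second-level) hit
            self.dram[page_id] = self.ssd[page_id]  # promote to DRAM
            return self.dram[page_id]
        data = self.hdd[page_id]                    # miss in both caches: read from HDD
        self.dram[page_id] = data                   # stage in DRAM; the SSD is only
        return data                                 # written later, in large granularity


# Usage example:
cache = TieredCache({1: "page-1", 2: "page-2"})
print(cache.read(1))   # first access: served from HDD, then cached in DRAM
print(cache.read(1))   # second access: served from the DRAM cache
```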




Embodiment Construction

[0081] Specific embodiments of the present invention are given below, and the present invention is described in detail in conjunction with the accompanying drawings.

[0082] New cache system

[0083] The new cache system is composed of DRAM, SSD, and HDD, see figure 2. The SSD is located between the DRAM and the HDD and acts as a cache for the HDD. Data is persistently stored on the HDD; the DRAM serves as the first-level cache and the SSD as the second-level cache, so that DRAM and SSD together constitute a two-level cache for the HDD. The content stored in the DRAM cache must be recorded. In order to quickly determine whether a page exists in the DRAM cache, the cached pages are managed in a hash table. The content of the SSD must also be recorded: the relevant information about the data in the SSD cache needs to be kept in DRAM, which requires the hash table to record that content as well. Therefore, the following information needs to be...
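The following sketch illustrates one way such an in-DRAM hash table could record both DRAM-resident and SSD-resident pages. Because the list of recorded fields is truncated in the source, the fields shown (residency flags, payload, SSD offset) and the PageTable/PageEntry names are assumptions for illustration only.

```python
# Sketch of the in-DRAM hash table that records which pages are cached and
# where. The per-page fields below are assumptions; the original list of
# recorded information is truncated in the source text.

from dataclasses import dataclass
from typing import Optional


@dataclass
class PageEntry:
    page_id: int
    in_dram: bool = False               # page currently held in the DRAM cache
    in_ssd: bool = False                # page currently held in the SSD cache
    dram_data: Optional[bytes] = None   # payload while cached in DRAM
    ssd_offset: Optional[int] = None    # location inside the SSD cache, if any


class PageTable:
    """Hash table (dict) answering "is this page in DRAM and/or SSD?" in O(1)."""

    def __init__(self):
        self.table = {}                  # page_id -> PageEntry

    def lookup(self, page_id):
        return self.table.get(page_id)

    def insert_dram(self, page_id, data):
        entry = self.table.setdefault(page_id, PageEntry(page_id))
        entry.in_dram, entry.dram_data = True, data

    def mark_ssd(self, page_id, offset):
        entry = self.table.setdefault(page_id, PageEntry(page_id))
        entry.in_ssd, entry.ssd_offset = True, offset
```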



Abstract

The invention discloses an SSD-based (Solid State Disk) cache management method and system. The SSD-based cache management method comprises the following steps. Step 1: a read-write request is issued and the DRAM (Dynamic Random Access Memory) cache is checked for a hit by searching a hash table to judge whether the data exists; if it exists, the data is read from the DRAM cache and the request is returned; if it does not exist in the DRAM cache, the data is read from the HDD (Hard Disk Drive) into the DRAM cache and step 2 is carried out. Step 2: data screening is carried out by using a two-level LRU (least recently used) linked list and a Ghost buffer, and the data heat is identified. Step 3: an adaptive calculation adjusts the lengths of the two-level LRU linked lists; when the second-level LRU linked list of the DRAM cache is full, the page-cluster granularity is obtained, the C pages at the tail of the second-level LRU list are taken as a whole and evicted from the DRAM cache, and these C pages are written to the SSD at a large granularity, wherein the page cluster size is C pages and C is an integral multiple of the number of pages per Block in the SSD.
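The sketch below illustrates the screening and eviction flow described in the abstract under simplifying assumptions: a first-level LRU list for newly admitted pages, a second-level LRU list for pages identified as hot, a ghost buffer holding only the identifiers of pages evicted from the first-level list, and eviction of the C pages at the tail of the second-level list as one cluster written to the SSD. The class names, the promotion rule, and the ssd_writer callback are illustrative rather than the patented method, and the adaptive adjustment of the list lengths is not modelled.

```python
# Sketch of data screening with a two-level LRU list plus ghost buffer, and
# cluster-granularity eviction to the SSD. Policies, sizes, and the
# ssd_writer callback are illustrative assumptions; the method's adaptive
# list-length calculation is omitted here.

from collections import OrderedDict


class TwoLevelLRU:
    def __init__(self, l1_cap, l2_cap, cluster_pages, ssd_writer):
        self.l1 = OrderedDict()              # first-level LRU list (newly admitted pages)
        self.l2 = OrderedDict()              # second-level LRU list (pages identified as hot)
        self.ghost = OrderedDict()           # ghost buffer: ids of pages evicted from l1
        self.l1_cap, self.l2_cap = l1_cap, l2_cap
        self.cluster_pages = cluster_pages   # C: an integral multiple of the SSD block's page count
        self.ssd_writer = ssd_writer         # receives one whole page cluster per call

    def access(self, page_id, data):
        if page_id in self.l2:               # already hot: just refresh recency
            self.l2.move_to_end(page_id)
        elif page_id in self.l1 or page_id in self.ghost:
            self.l1.pop(page_id, None)       # re-referenced: identified as hot, promote to l2
            self.ghost.pop(page_id, None)
            self.l2[page_id] = data
            self._evict_cluster_if_full()
        else:
            self.l1[page_id] = data          # first touch: enter the first-level list
            if len(self.l1) > self.l1_cap:
                cold_id, _ = self.l1.popitem(last=False)
                self.ghost[cold_id] = None   # remember only the id, not the data

    def _evict_cluster_if_full(self):
        if len(self.l2) <= self.l2_cap:
            return
        count = min(self.cluster_pages, len(self.l2))
        cluster = [self.l2.popitem(last=False) for _ in range(count)]
        self.ssd_writer(cluster)             # one large-granularity write to the SSD


# Usage example: evict clusters of 4 pages once the hot list exceeds 8 entries.
lru = TwoLevelLRU(l1_cap=8, l2_cap=8, cluster_pages=4,
                  ssd_writer=lambda c: print("SSD write:", [pid for pid, _ in c]))
for pid in list(range(12)) + list(range(12)):   # the second pass promotes pages to l2
    lru.access(pid, f"page-{pid}")
```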

Description

Technical field

[0001] The present invention relates to a cache storage structure and strategy, and in particular to an SSD-based cache management method and system.

Background technique

[0002] With the progress of contemporary society, more and more data information needs to be processed, and the amount of data is growing explosively. This brings many problems to traditional storage systems. A traditional storage system generally consists of memory (DRAM) and hard disks (HDD), with DRAM acting as a cache for the HDD. Such a system faces the following challenges:

[0003] First, the total amount of data is increasing rapidly. A joint report by IDC and EMC pointed out that the data in today's society is growing explosively. From their report it can be seen that the amount of data around 2005 was only tens of exabytes (1 EB = 10^18 bytes), it had already grown substantially by 2010, and it is estimated that by 2015 it will reach nearly 8000 EB, that is, a data volume of 8 ZB (1 ZB = 10^21 bytes). Faced with such...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08; G06F12/0888
Inventor: 车玉坤, 熊劲, 马久跃
Owner: INST OF COMPUTING TECH CHINESE ACAD OF SCI