
Implementation method of least recently used (LRU) policy in solid state drive (SSD)-based high-capacity cache

An implementation method for large-capacity caches, applied in data transformation, instrumentation, electrical digital data processing, and related fields. It addresses the inapplicability of conventional LRU implementations to large SSD-based caches, and achieves a simple design with high read and write speed.

Active Publication Date: 2013-06-12
NAT UNIV OF DEFENSE TECH

Problems solved by technology

For large SSD-based caches, the conventional in-memory LRU approach no longer works: the list metadata grows with the cache size and consumes too much memory.



Embodiment Construction

[0039] As shown in Figure 1, the implementation steps of the LRU policy in the SSD-based large-capacity cache in this embodiment are as follows:

[0040] 1) Allocate a continuous address space on the SSD and initialize the FIFO queue in it. In memory, establish the data structures of the first counting Bloom filter CBF1, which records disk logical addresses that have been accessed exactly once, and the second counting Bloom filter CBF2, which records disk logical addresses that have been accessed twice or more. Additionally, request two memory regions to serve as the buffer of disk logical addresses to be written and the buffer of disk logical addresses to be replaced, then jump to the next step.
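The two counting Bloom filters of step 1 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name, array size, and hash count are assumptions, and SHA-256 stands in for whatever hash family the authors used.

```python
import hashlib

class CountingBloomFilter:
    """Minimal counting Bloom filter: each slot holds a counter rather
    than a bit, so elements can be removed as well as inserted (which a
    plain Bloom filter cannot do). Membership tests may yield false
    positives but never false negatives."""

    def __init__(self, size=1024, num_hashes=4):
        self.size = size
        self.num_hashes = num_hashes
        self.counters = [0] * size

    def _slots(self, item):
        # Derive num_hashes slot indices from one digest of the item.
        digest = hashlib.sha256(str(item).encode()).digest()
        for i in range(self.num_hashes):
            chunk = digest[4 * i: 4 * i + 4]
            yield int.from_bytes(chunk, "big") % self.size

    def add(self, item):
        for s in self._slots(item):
            self.counters[s] += 1

    def remove(self, item):
        if item in self:  # avoid driving counters negative
            for s in self._slots(item):
                self.counters[s] -= 1

    def __contains__(self, item):
        return all(self.counters[s] > 0 for s in self._slots(item))

# Step 1 state: CBF1 for addresses seen once, CBF2 for twice or more.
cbf1 = CountingBloomFilter()
cbf2 = CountingBloomFilter()
cbf1.add(0x1A2B)
print(0x1A2B in cbf1, 0x1A2B in cbf2)  # True False
```

The counters are what make removal possible: when an address is promoted from CBF1 to CBF2, it can be cleanly deleted from CBF1, which a bit-vector Bloom filter would not allow.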

[0041] In this embodiment, the FIFO queue is stored in a piece of continuous address space on the SSD, and this address space is reused cyclically. The size of the continuous address space depends on the length of the logical add...
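The cyclic reuse of a fixed address region described in [0041] behaves like a ring buffer of logical addresses. The sketch below is a simplified in-memory stand-in for the on-SSD queue; the class name and capacity are illustrative assumptions.

```python
class RingFifo:
    """Fixed-capacity FIFO over a preallocated region that is reused
    cyclically, mirroring how the queue occupies one continuous SSD
    extent whose slots are recycled as entries come and go."""

    def __init__(self, capacity):
        self.buf = [None] * capacity   # stands in for the continuous SSD space
        self.head = 0                  # next slot to dequeue
        self.tail = 0                  # next slot to enqueue
        self.count = 0

    def enqueue(self, addr):
        if self.count == len(self.buf):
            raise OverflowError("queue full: trigger replacement first")
        self.buf[self.tail] = addr
        self.tail = (self.tail + 1) % len(self.buf)  # wrap around
        self.count += 1

    def dequeue(self):
        if self.count == 0:
            raise IndexError("queue empty")
        addr = self.buf[self.head]
        self.head = (self.head + 1) % len(self.buf)  # wrap around
        self.count -= 1
        return addr

fifo = RingFifo(4)
for a in (10, 11, 12):
    fifo.enqueue(a)
print(fifo.dequeue())  # 10
```

Because head and tail only advance modulo the capacity, the queue never needs more space than was allocated up front, which is why the SSD extent can be sized once at initialization.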



Abstract

The invention discloses an implementation method of a least recently used (LRU) policy in a solid state drive (SSD)-based high-capacity cache. In this method, the logical addresses of upper-layer application read/write requests are combined into a first-in first-out (FIFO) queue, and two counting Bloom filters record, respectively, the logical addresses in the FIFO queue that have been accessed once and those that have been accessed two or more times. Together, the FIFO queue and the two counting Bloom filters can accurately simulate the behavior of an LRU queue. The FIFO queue is stored on the SSD and occupies no memory space; the two counting Bloom filters are stored in memory and occupy very little of it. The functions of an LRU queue are thus realized with extremely low memory overhead. The method has the advantages of simple implementation, fast operation, small storage footprint, and low memory overhead.
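One plausible reading of how the FIFO queue and the two filters cooperate can be sketched as follows. This is a sketch, not the patented algorithm: plain sets stand in for the counting Bloom filters (exact membership instead of probabilistic), a deque stands in for the on-SSD queue, and the second-chance replacement rule is an assumption about how the combination approximates LRU.

```python
from collections import deque

# Stand-ins: sets play the two counting Bloom filters, a deque plays
# the on-SSD FIFO queue. The victim-selection rule below (an address
# found in cbf2 gets recycled to the tail) is assumed, not quoted.
fifo = deque()
cbf1 = set()   # addresses accessed exactly once
cbf2 = set()   # addresses accessed two or more times

def access(addr):
    if addr in cbf2:
        return                      # already known to be hot
    if addr in cbf1:
        cbf1.discard(addr)
        cbf2.add(addr)              # promoted: seen at least twice
    else:
        cbf1.add(addr)
        fifo.append(addr)           # first sighting: join the queue

def pick_victim():
    while fifo:
        addr = fifo.popleft()       # oldest entry, LRU-style
        if addr in cbf2:
            cbf2.discard(addr)
            cbf1.add(addr)
            fifo.append(addr)       # second chance: it was re-accessed
        else:
            cbf1.discard(addr)
            return addr             # cold address: evict it
    return None

for a in (1, 2, 1, 3):
    access(a)
print(pick_victim())  # 2: address 1 was re-accessed, so 2 goes first
```

The key property this illustrates is the one the abstract claims: the queue itself is append-only and never reordered (so it can live on the SSD), while only the small membership filters, which fit in memory, change on every access.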

Description

Technical Field

[0001] The invention relates to the field of computer storage, and in particular to a method for implementing a low-overhead LRU policy in an SSD-based large-capacity cache.

Background Technique

[0002] A cache is a mechanism based on the principle of locality: it uses a small-capacity, high-speed storage device to hold recently and frequently used data, thereby improving the performance of the whole storage system. Caches are widely used in computer systems because they are simple, effective, cost-efficient, and transparent to upper-layer applications. A cache holds only recently and frequently accessed data; data that is no longer frequently accessed is replaced out of the cache. The mechanism that identifies infrequently accessed data is called the cache replacement policy. LRU (Least Recently Used) is a basic cache replacement policy. It is widely adopted because it accurately reflects the principle of locality and forms the basis of most complex cache replac...
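For contrast with the structure the patent seeks to avoid, a textbook LRU cache keeps an in-memory recency list that is updated on every single access, e.g. (illustrative sketch):

```python
from collections import OrderedDict

class LRUCache:
    """Textbook LRU: every hit moves the key to the most-recently-used
    end, so the eviction victim is always at the front. This per-access
    list update is exactly the memory-resident bookkeeping that becomes
    too expensive when the cache is a large SSD."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)          # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict the least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")           # "a" becomes most recently used
cache.put("c", 3)        # evicts "b", the LRU entry
print(list(cache.data))  # ['a', 'c']
```

The per-entry list node plus per-access reordering is what scales poorly: for an SSD-sized cache the recency metadata alone can dominate memory, which motivates the FIFO-plus-Bloom-filter substitute described above.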


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F5/16
Inventors: 肖侬, 卢宇彤, 陈志广, 周恩强, 刘芳, 所光, 谢旻, 董勇, 张伟
Owner: NAT UNIV OF DEFENSE TECH