
Multi-level cache method based on historical upgrading and downgrading frequency

A cache and frequency technology, applied in the field of data reading, writing, and storage in computer systems. It addresses problems such as insufficient description of data-block status and the neglect of valuable historical hint information, with the effects of reducing average response time, keeping space consumption small, and reducing bandwidth usage.

Active Publication Date: 2015-08-12
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

[0005] However, existing multi-level caching algorithms still have room for improvement. On the one hand, most multi-level caching algorithms use the hint information of data blocks implicitly, while valuable historical hint information is ignored; on the other hand, some algorithms save only the hint information of the last few operations, which does not adequately describe the state of a data block.

Method used



Embodiment Construction

[0032] In order to make the above objects, features and advantages of the present invention more comprehensible, the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0033] The present invention provides a multi-level caching method based on historical upgrading and downgrading frequency, including:

[0034] Step S1: compute the implicit frequency of each data block in the caches at all levels, where the implicit frequency is the sum of the number of times the data block is upgraded and downgraded per unit time;
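Step S1 can be illustrated with a minimal sketch. The class and field names below are illustrative assumptions, not taken from the patent text; the only property carried over from the description is that the implicit frequency is the count of upgrade and downgrade events divided by the length of the observation window.

```python
from dataclasses import dataclass

@dataclass
class BlockStats:
    """Per-block counters used to derive the implicit frequency
    (hypothetical names; a sketch, not the patented implementation)."""
    upgrades: int = 0    # times the block was promoted to a higher cache level
    downgrades: int = 0  # times the block was demoted to a lower cache level

    def implicit_frequency(self, window_seconds: float) -> float:
        # Implicit frequency = (upgrades + downgrades) per unit time.
        return (self.upgrades + self.downgrades) / window_seconds

stats = BlockStats(upgrades=3, downgrades=2)
print(stats.implicit_frequency(10.0))  # 0.5 events per second
```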

[0035] Step S2: establish a high-implicit-frequency queue and a low-implicit-frequency queue in the cache at each level, where the high-implicit-frequency queue stores data blocks with high implicit frequency and the low-implicit-frequency queue stores data blocks with low implicit frequency; here, in each level of cache of the multi-level cache system, tw...
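The queue split in Step S2 can be sketched as follows. The patent text does not specify how the high/low boundary is chosen, so using the median implicit frequency as the threshold is an assumption made here for illustration only.

```python
# Sketch: partition the blocks of one cache level into a high- and a
# low-implicit-frequency queue. The median threshold is an assumption;
# the patent only states that the two queues separate hot and cold blocks.
from statistics import median

def partition_level(freqs: dict[str, float]) -> tuple[list[str], list[str]]:
    threshold = median(freqs.values())
    high = [b for b, f in freqs.items() if f >= threshold]
    low = [b for b, f in freqs.items() if f < threshold]
    return high, low

freqs = {"A": 0.9, "B": 0.1, "C": 0.5, "D": 0.05}
high_q, low_q = partition_level(freqs)
print(sorted(high_q), sorted(low_q))  # ['A', 'C'] ['B', 'D']
```

Keeping the two queues separate gives the hint information locality: hot blocks compete only with other hot blocks for residence in the higher-level cache, which is what reduces needless upgrade/downgrade traffic between levels.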


Abstract

The invention provides a multi-level cache method based on historical upgrading and downgrading frequency. The method is built on the historical hint information of data blocks, which is one of the key points of a multi-level cache system. Through the hint frequency, hot data blocks can be identified effectively and kept in the higher-level cache for a longer time, increasing the data-block hit rate of the system and reducing the average response time. The method divides a traditional LRU stack into two special queues, so that the hint information has locality, mixing of hot and cold data blocks is avoided, and bandwidth use between the caches at all levels is reduced. Because hot data blocks are identified effectively and kept in the higher-level cache for a long time by the algorithm, the downgrading and upgrading operations of the caches at all levels are reduced, further reducing bandwidth consumption between the caches. The space consumption is small, providing a basis for better read-write performance of the system under different loads.

Description

Technical field

[0001] The invention relates to the field of data reading, writing, and storage in computer systems, in particular to a multi-level cache method based on historical upgrading and downgrading frequencies.

Background technique

[0002] In a large data center, heterogeneous storage devices work together to accelerate data read and write operations. Characteristically, a high-level storage device acts as a cache for a low-level storage device, forming a distributed multi-level cache system. In recent years, multi-level cache systems have received increasing attention due to their high I/O performance, low monetary cost, and high flexibility.

[0003] In the past two decades, many typical multi-level caching solutions have been proposed to improve the I/O performance of storage systems. One of the most effective methods is to build another cache between different levels, and use hints to identify hot data blocks in this cache. These hints p...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC(8): G06F12/12
Inventor 李颉 (Jie Li), 吴晨涛 (Chentao Wu), 过敏意 (Minyi Guo), 何绪斌 (Xubin He), 冯博 (Bo Feng), 黄洵松 (Xunsong Huang)
Owner SHANGHAI JIAO TONG UNIV