
Replacement method and system for cached data in storage system, and storage system

A storage system and data caching technology, applied in the storage field, solving problems such as access-performance jitter and reduced system access efficiency, so as to avoid the jitter problem and improve access efficiency.

Inactive Publication Date: 2018-02-16
ZHENGZHOU YUNHAI INFORMATION TECH CO LTD


Problems solved by technology

However, the LRU replacement algorithm only considers the time factor, so hot data (cached data with high access heat, that is, more hits) is frequently replaced, causing IO (Input/Output) access-performance jitter and reducing the access efficiency of the system.




Embodiment Construction

[0030] The core of the present invention is to provide a replacement method, system, and storage system for cached data in a storage system that considers not only the time factor but also the number of hits of the cached data, moving cached data with a large number of hits to a higher-level queue. This reduces the frequency with which hot data is replaced, thereby avoiding IO access-performance jitter and improving the access efficiency of the system.

[0031] In order to make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described below clearly and completely in conjunction with the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiment...


PUM

No PUM

Abstract

The invention discloses a replacement method and system for cached data in a storage system, and the storage system. The method comprises the steps of: dividing a cache space of the storage system into N cache modules in advance, and, when a hit cache module exists among the N cache modules, accumulating a hit count of the hit cache module; inserting the hit cache module into an M-level queue with preset queue lengths for all levels, wherein a cache module hit for the first time is inserted into the first-level queue, and the queue of each level performs enqueuing and dequeuing operations according to an LRU replacement algorithm; and, when a re-hit cache module exists in the i-th-level queue, judging whether the cache module meets a condition for entering the (i+1)-th-level queue according to a preset hit grading rule, and if so, moving the cache module to the (i+1)-th-level queue, or otherwise performing queue adjustment on the i-th-level queue according to the LRU replacement algorithm. The jitter problem of IO access performance is thereby avoided, and the system access efficiency is improved.
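The multi-level queue scheme described in the abstract can be sketched as follows. Note the hedges: the abstract leaves the "hit grading rule" abstract, so this sketch models it as a per-level hit-count threshold, and it assumes a module evicted from any queue is simply dropped from the cache; the class and parameter names are illustrative, not from the patent.

```python
from collections import OrderedDict

class MultiLevelLRU:
    """Minimal sketch of the M-level LRU queue scheme from the abstract.

    Assumptions (not fixed by the abstract): promotion from level i to
    level i+1 happens when a module's accumulated hit count reaches
    thresholds[i], and a module evicted from any queue is dropped.
    """

    def __init__(self, levels, queue_len, thresholds):
        # One LRU queue per level; last item in each OrderedDict = most recent.
        self.queues = [OrderedDict() for _ in range(levels)]
        self.queue_len = queue_len   # preset queue length, same for all levels here
        self.thresholds = thresholds # hits needed to enter level i+1 from level i
        self.hits = {}               # accumulated hit count per cache module

    def _enqueue(self, level, module):
        q = self.queues[level]
        q[module] = True
        if len(q) > self.queue_len:
            evicted, _ = q.popitem(last=False)  # LRU dequeue at this level
            self.hits.pop(evicted, None)        # assumption: evicted data is dropped

    def access(self, module):
        for i, q in enumerate(self.queues):
            if module in q:
                self.hits[module] += 1
                promote = (i + 1 < len(self.queues)
                           and self.hits[module] >= self.thresholds[i])
                if promote:
                    del q[module]
                    self._enqueue(i + 1, module)  # move to the (i+1)-th-level queue
                else:
                    q.move_to_end(module)         # plain LRU adjustment within level i
                return True                       # cache hit
        # First hit: insert into the first-level queue.
        self.hits[module] = 1
        self._enqueue(0, module)
        return False                              # cache miss
```

Because a module must accumulate hits before reaching a higher level, hot modules sit in upper queues where a burst of new insertions into the first-level queue cannot evict them, which is the claimed anti-jitter effect.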

Description

technical field [0001] The present invention relates to the field of storage technologies, and in particular to a replacement method, system, and storage system for cached data in a storage system. Background technique [0002] In a storage system, new data is continuously inserted into the cache space, and cached data is replaced accordingly. In the prior art, an LRU (Least Recently Used) replacement algorithm is generally used to replace the cached data in the cache space. Please refer to figure 1, a schematic diagram of an LRU replacement algorithm in the prior art: a linked list saves the cached data, and newly inserted data is inserted at the head of the linked list. Whenever the cache hits (that is, the cached data is accessed), the hit cached data is moved to the head of the linked list; when the linked list is full, the cached data at the tail of the linked list is evicted. However, the L...
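The prior-art LRU scheme described above (hit moves the entry to the head, eviction happens at the tail) can be sketched as follows; the class name, capacity parameter, and dict-based interface are illustrative assumptions, not taken from the patent text.

```python
from collections import OrderedDict

class LRUCache:
    """Prior-art LRU replacement as described in the background section.

    An OrderedDict stands in for the linked list: the last item is the
    "head" (most recently used), the first item is the "tail".
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        # Cache hit: move the entry to the head (most recent end).
        self.data.move_to_end(key)
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            # List is full: evict from the tail (least recently used).
            self.data.popitem(last=False)
```

This sketch also makes the drawback noted above easy to see: a burst of one-time insertions via `put` can evict a frequently hit entry, since recency is the only criterion.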

Claims


Application Information

IPC (IPC(8)): G06F12/123, G06F12/126
CPC: G06F12/124, G06F12/126, G06F2212/1021
Inventor: 王永刚
Owner: ZHENGZHOU YUNHAI INFORMATION TECH CO LTD