A method of LRU flash memory cache management based on dynamic page weight

A dynamic page and cache management technology, applied in the field of storage systems, addressing problems such as low hit rate, high latency, and the lack of cache management methods designed for flash memory

Active Publication Date: 2020-09-11
HANGZHOU DIANZI UNIV

Problems solved by technology

However, most current cache management methods are optimized for hard-disk storage devices, and cache management methods designed for flash memory are lacking.
When traditional cache management methods are applied to flash memory, the hit rate is low, and the above-mentioned problems of high latency and high write consumption remain unsolved.


Embodiment Construction

[0063] A dynamic page weight-based flash memory cache management method provided by the present invention will be further described below with reference to the accompanying drawings.

[0064] Referring to figure 1, which shows the flow chart of the present invention, the specific implementation of the present invention is as follows:

[0065] Step S1: Read the page requests in the request queue, and identify and classify the type of each page request and the area where it is located. This specifically includes the following steps:

[0066] Step S11: Preprocess the page request queue. Denote a page request as R, which contains the request number R_pid and the page request mode R_am ∈ {read, write}. The request queue S can be expressed by the following formula:

[0067] S = {R_1, R_2, ..., R_j, ..., R_n}, 1 ≤ j ≤ n

[0068] The request queue S is a collection of multiple page requests R_pid, where n represents the total number of requests in the request queue S, and j represents the page request number. Set ...
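
To make the notation of step S11 concrete, below is a minimal sketch in Python of a page request and request queue. The field names, the AccessMode enum, and the classify helper are illustrative assumptions; this excerpt only defines the request number R_pid and the access mode R_am.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List


class AccessMode(Enum):
    """R_am: the page request mode, either read or write."""
    READ = "read"
    WRITE = "write"


@dataclass
class PageRequest:
    """A page request R carrying the request number R_pid and mode R_am."""
    pid: int          # R_pid: identifier of the requested page
    mode: AccessMode  # R_am in {read, write}


# The request queue S = {R_1, R_2, ..., R_j, ..., R_n}, 1 <= j <= n
RequestQueue = List[PageRequest]


def classify(queue: RequestQueue):
    """Step S11 (sketch): walk the queue and split requests by access mode."""
    reads, writes = [], []
    for r in queue:  # n = len(queue); j is the 1-based position of r in S
        (reads if r.mode is AccessMode.READ else writes).append(r)
    return reads, writes
```

For example, classify([PageRequest(3, AccessMode.WRITE), PageRequest(7, AccessMode.READ)]) returns one read request and one write request; the area classification (working area versus exchange area) described in step S1 would be layered on top of this.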



Abstract

The invention discloses an LRU flash memory cache management method based on dynamic page weight. The method comprises: S1, reading the page requests in a request queue, and identifying and classifying the page request types and the areas where the page requests are located; and S2, judging the state of the buffer area to be inserted into, judging the obsolete page by using an LRU method based on dynamic page weight, adjusting the state of the buffer area, and executing the page request. According to the technical scheme, the buffer area is divided into a working area and an exchange area; cold pages, hot pages, dirty pages and clean pages are distinguished; the page request type and the buffer area are determined; and the obsolete page is then judged, in combination with the dynamic-page-weight-based LRU method, to complete the page request. The technical scheme can effectively reduce read-write consumption and latency in the flash memory cache read-write process, while greatly increasing the hit rate of flash memory cache reads and writes.
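
To illustrate the eviction idea described in the abstract, here is a minimal sketch of a weight-aware LRU buffer in Python. The weight formula, the dirty_penalty factor, the candidate-window size, and all class and field names are assumptions for illustration; the patent's actual dynamic page weight computation and its working/exchange area split are not reproduced in this excerpt.

```python
from collections import OrderedDict


class WeightedLRUCache:
    """Illustrative LRU buffer whose eviction also considers a per-page weight.

    Assumption: a dirty page (one that must be written back to flash) is
    costlier to evict than a clean one, so it gets a higher weight; the
    patent defines its own dynamic weight per page, which is not shown here.
    """

    def __init__(self, capacity: int, dirty_penalty: float = 2.0):
        self.capacity = capacity
        self.dirty_penalty = dirty_penalty
        self.pages = OrderedDict()  # pid -> {"dirty": bool, "hits": int}, LRU first

    def access(self, pid: int, write: bool) -> None:
        if pid in self.pages:
            entry = self.pages.pop(pid)       # hit: refresh recency
            entry["hits"] += 1
            entry["dirty"] = entry["dirty"] or write
        else:
            if len(self.pages) >= self.capacity:
                self._evict()
            entry = {"dirty": write, "hits": 1}
        self.pages[pid] = entry               # re-insert at most-recently-used end

    def _evict(self) -> None:
        # Look at a window of the least recently used pages and evict the one
        # with the lowest weight, i.e. prefer cold, clean pages as victims.
        def weight(item):
            _, e = item
            w = e["hits"]
            if e["dirty"]:
                w *= self.dirty_penalty       # assumed penalty for write-back cost
            return w

        window = list(self.pages.items())[: max(1, self.capacity // 4)]
        victim_pid, _ = min(window, key=weight)
        del self.pages[victim_pid]
```

In this sketch, hot versus cold is approximated by a hit counter and dirty versus clean by a write-back flag; the point is only that the victim is not chosen by recency alone, but by recency combined with a per-page weight.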

Description

Technical Field

[0001] The invention relates to the field of storage systems, in particular to a dynamic page weight-based LRU flash cache management method.

Background Technique

[0002] NAND flash storage technology has been widely used in enterprise applications due to its high performance, small size, and low energy consumption. However, with the continuous development of big data technology in recent years, the processing and analysis of massive data have placed higher demands on the data throughput and I/O latency of the storage system, and defects such as asymmetric I/O latency and block erasure make it impossible for flash memory to completely replace hard disk storage.

[0003] Combining caching technology with storage devices can effectively reduce I/O latency and the asymmetry between different storage layers. However, most current cache management methods are optimized for hard-disk storage devices, and there is a lack of cache management methods designed for flash memory. ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F12/0871, G06F12/123
Inventors: 袁友伟, 陶文鹏, 张锦涛, 贾刚勇, 鄢腊梅
Owner: HANGZHOU DIANZI UNIV