
Management method and system for a flash cache area

A management method and system for a flash memory cache area, applied in the field of flash memory cache management. The technology addresses problems of existing schemes, such as excessive write operations, premature eviction of data pages, and reduced cache hit rate and operating performance, achieving an improved hit rate and improved operating performance of the cache area.

Active Publication Date: 2020-04-14
INST OF MICROELECTRONICS CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0004] However, the LRU algorithm has difficulty evicting the truly least recently used data pages. Moreover, flash memory has asymmetric read and write characteristics: an erase operation must be performed before a write, and a write operation takes longer than a read operation. Therefore, in cache management it is desirable to minimize write operations to the flash memory in order to improve the overall performance of flash operations.
[0005] Based on these characteristics of flash memory, a series of improvements have been made to the LRU algorithm. One existing improved LRU algorithm establishes three linked lists in the cache area: a cold-clean linked list, a cold-dirty linked list, and a hot linked list. A data page that has been read only once is stored in the cold-clean linked list and is called a cold clean page; a data page that has been written only once is stored in the cold-dirty linked list and is called a cold dirty page; data pages that have been read or written more than once, including hot clean pages and hot dirty pages, are stored in the hot linked list. When pages are demoted from the hot linked list to the cold linked lists, the search starts from the head of the hot linked list, and clean pages are demoted first. This reduces write operations to the flash memory, but it does not truly take the access frequency and recency of the data into account, which affects the hit rate and the performance of the cache area.
In addition, when replacing a data page, this algorithm first replaces cold clean pages, and then chooses between cold dirty pages and hot clean pages by probability. Because the replacement uses a fixed probability, too many data pages from one linked list may be evicted, or newly inserted data pages may be evicted too early, which degrades the efficiency and performance of cache operation.
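The prior-art scheme criticized above can be sketched as a minimal model. The class name, the promotion rule, and the fixed probability `p_cold_dirty` below are illustrative assumptions, not the patent's implementation; the sketch only shows the three-list structure and the fixed-probability eviction that the text says can mis-evict pages:

```python
from collections import OrderedDict
import random

class ThreeListCache:
    """Illustrative sketch of the improved LRU variant described above
    (names and probabilities are assumptions, not from the patent)."""

    def __init__(self, capacity, p_cold_dirty=0.5):
        self.capacity = capacity
        self.p_cold_dirty = p_cold_dirty   # fixed eviction probability (criticized)
        self.cold_clean = OrderedDict()    # pages read exactly once
        self.cold_dirty = OrderedDict()    # pages written exactly once
        self.hot = OrderedDict()           # pages accessed more than once

    def _size(self):
        return len(self.cold_clean) + len(self.cold_dirty) + len(self.hot)

    def _evict(self):
        # Prefer cold clean pages: evicting them needs no flash write-back.
        if self.cold_clean:
            self.cold_clean.popitem(last=False)
        # Otherwise choose between cold-dirty and hot by a fixed probability --
        # the behaviour the text says may evict the wrong pages.
        elif self.cold_dirty and (not self.hot or random.random() < self.p_cold_dirty):
            self.cold_dirty.popitem(last=False)
        elif self.hot:
            self.hot.popitem(last=False)

    def access(self, page_id, is_write):
        for lst in (self.cold_clean, self.cold_dirty):
            if page_id in lst:             # second access: promote to hot list
                del lst[page_id]
                self.hot[page_id] = True
                return
        if page_id in self.hot:
            self.hot.move_to_end(page_id)  # refresh recency inside hot list
            return
        if self._size() >= self.capacity:
            self._evict()
        # First access lands in the matching cold list.
        (self.cold_dirty if is_write else self.cold_clean)[page_id] = True
```

With capacity 3, reading page `a` twice promotes it to the hot list, and a fourth page evicts the oldest cold clean page first, as described above.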

Method used




Embodiment Construction

[0049] In order to make the above objects, features and advantages of the present invention more comprehensible, specific implementations of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0050] In the following description, many specific details are set forth in order to provide a thorough understanding of the present invention. However, the present invention can also be implemented in ways other than those described here, and those skilled in the art can make similar generalizations without departing from the spirit of the present invention. The present invention is therefore not limited to the specific embodiments disclosed below.

[0051] The present invention proposes a management method of a flash memory cache area, which is an improvement to the flash memory-oriented LRU algorithm, including:

[0052] According to the operating characteristics of the flash memory, a cold-clean linked list, a cold-dirty linked list, and a hot linked list...



Abstract

The invention provides a method and a system for managing a flash memory cache region. The method comprises: establishing three linked lists in the cache region, namely a cold-clean linked list, a cold-dirty linked list and a hot linked list, which respectively manage cold clean data pages, cold dirty data pages and hot data pages; and, when hot data are evicted, judging from the list head to the list tail whether a data page should be evicted by means of a health point, where the health point is a numerical value that combines access count, novelty and read-write cost. The method and system fully consider the access frequency of data pages, their probability of being re-accessed, and the read-write latency of flash memory, thereby improving the hit rate of data access and enhancing the operating performance of the cache region.
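As an illustration of the scoring idea in the abstract, the sketch below computes a "health point" from an access count, novelty, and read-write cost. The weights, the novelty decay, and the cost values are assumptions for illustration only; the text shown here does not disclose the patent's actual formula:

```python
# Hypothetical weights -- the patent text here does not disclose coefficients.
W_FREQ, W_NOVELTY, W_COST = 1.0, 1.0, 1.0
READ_COST, WRITE_COST = 1, 4   # assumption: flash writes cost several reads

def health_point(access_count, age, is_dirty):
    """Score a cached page: higher means healthier, i.e. keep it longer.

    access_count -- how often the page has been accessed (frequency term)
    age          -- accesses since this page was last touched (novelty term)
    is_dirty     -- dirty pages need a flash write-back, so evicting them
                    is more expensive (read-write cost term)
    """
    novelty = 1.0 / (1 + age)                        # newer pages score higher
    rw_cost = WRITE_COST if is_dirty else READ_COST  # dirty pages cost more to evict
    return W_FREQ * access_count + W_NOVELTY * novelty + W_COST * rw_cost
```

Under these assumed weights, a frequently accessed, recently touched dirty page scores higher than a stale, rarely read clean page, so scanning from list head to tail and evicting the first low-scoring page matches the behaviour the abstract describes.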

Description

technical field
[0001] The present invention relates to the field of storage systems, and in particular to a method and system for managing a flash memory cache area.
Background technique
[0002] With the continuous development of big data applications, the performance requirements for storage media are becoming ever higher. Flash memory is a representative new non-volatile storage medium. It has the advantages of high read and write speed, low power consumption, and shock resistance, and is widely used in consumer electronics and enterprise storage systems.
[0003] LRU (Least Recently Used) is the most basic caching algorithm: it preferentially replaces the least recently used cached data page. In this algorithm, data pages are linked into a list according to their recency: the head of the list holds the pages accessed earliest, and the tail holds the most recently accessed data page. When a cached data page is eliminated, it is eliminated f...
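The basic LRU policy described in [0003] can be sketched in a few lines; this is a generic illustration of the textbook algorithm, not the patent's code:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal illustration of the basic LRU policy described in [0003]."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()   # insertion order: head = oldest, tail = newest

    def access(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)     # refresh recency on a hit
        else:
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)  # evict from the head (least recent)
            self.pages[page_id] = True          # insert at the tail (most recent)
```

For example, with capacity 2, accessing `a`, `b`, `a`, `c` evicts `b`, because `a` was refreshed before `c` arrived.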


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/0871; G06F12/121
CPC: G06F12/0871; G06F12/121
Inventor: 王力玉, 陈岚
Owner INST OF MICROELECTRONICS CHINESE ACAD OF SCI