Memory system cache mechanism based on flash memory

A flash memory and cache technology, applied in the field of memory system cache mechanisms. It addresses the problems that conventional cache replacement cannot identify hot cache lines within a page and that excess flash writes reduce overall system performance and service life; the proposed mechanism reduces the number of write requests while keeping its own time and space overhead small.

Active Publication Date: 2017-03-22
NAT UNIV OF DEFENSE TECH

Problems solved by technology

However, there are two key issues with such a system: performance and longevity.
First, if a simple caching mechanism is used to cache flash pages into DRAM, traditional cache replacement algorithms such as LRU (Least Recently Used) can only identify hot pages and keep them in the cache; they cannot identify the hot cache lines within each page.
Second, existing general-purpose cache replacement mechanisms focus only on achieving a higher request hit rate and ignore the asymmetric cost of writes to flash memory.



Embodiment Construction

[0024] Figure 1 is a schematic diagram of the unbalanced access to the cache lines within a page. In this example, each page is 512 B and contains 8 cache lines.

[0025] Figure 2 is an architecture diagram and workflow schematic of the flash-based memory system cache mechanism adopted in the present invention; the DRAM cache comprises a page buffer area and a cache line buffer area.
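The two-tier DRAM cache described above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the class and method names are invented, and simple LRU ordering via `OrderedDict` stands in for whatever replacement policy the embodiment uses. The 512 B page / 8 cache-line geometry matches the example in paragraph [0024].

```python
from collections import OrderedDict

PAGE_SIZE = 512                      # bytes, as in the embodiment
LINES_PER_PAGE = 8                   # 8 cache lines per 512 B page
LINE_SIZE = PAGE_SIZE // LINES_PER_PAGE

class DramCache:
    """Two-tier DRAM cache: whole hot pages live in a page buffer,
    while individual hot cache lines live in a cache-line buffer."""
    def __init__(self, page_capacity, line_capacity):
        self.page_buffer = OrderedDict()   # page_no -> list of 8 line buffers (LRU order)
        self.line_buffer = OrderedDict()   # (page_no, line_no) -> line data
        self.page_capacity = page_capacity
        self.line_capacity = line_capacity

    def lookup(self, page_no, line_no):
        """Return the requested cache line, or None on a DRAM miss."""
        # A hit in the page buffer serves any line of that page.
        if page_no in self.page_buffer:
            self.page_buffer.move_to_end(page_no)      # refresh LRU position
            return self.page_buffer[page_no][line_no]
        # Otherwise try the finer-grained cache-line buffer.
        key = (page_no, line_no)
        if key in self.line_buffer:
            self.line_buffer.move_to_end(key)
            return self.line_buffer[key]
        return None    # miss: the caller must fetch the page from flash
```

A lookup thus checks the coarse page buffer first and falls back to the cache-line buffer, so a hot line can stay cached even after its (mostly cold) page has been evicted.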

[0026] Figure 3 is a schematic diagram of the history-aware hotspot identification mechanism. The access records of the cache lines can be stored in the out-of-band (OOB) area of the flash memory when the page is written back to flash.
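The history-aware mechanism above can be sketched like this, under stated assumptions: per-line access counters are kept while a page is cached, persisted into the page's OOB area on write-back, and restored on the next fetch, so line hotness survives eviction. The class name, the fixed hot threshold, and the dict standing in for the physical OOB area are all illustrative, not taken from the patent.

```python
class LineHistory:
    """History-aware hot-line identification (sketch)."""
    def __init__(self, lines_per_page=8, hot_threshold=3):
        self.counts = {}                       # page_no -> per-line access counts
        self.lines_per_page = lines_per_page
        self.hot_threshold = hot_threshold     # assumed, not from the patent

    def record_access(self, page_no, line_no):
        c = self.counts.setdefault(page_no, [0] * self.lines_per_page)
        c[line_no] += 1

    def hot_lines(self, page_no):
        """Lines worth promoting to the cache-line buffer when the page is evicted."""
        c = self.counts.get(page_no, [0] * self.lines_per_page)
        return [i for i, n in enumerate(c) if n >= self.hot_threshold]

    def save_to_oob(self, page_no, oob):
        # Simulates writing the counters into the page's OOB area on write-back,
        # so they piggyback on a write that happens anyway.
        oob[page_no] = list(self.counts.get(page_no, [0] * self.lines_per_page))

    def load_from_oob(self, page_no, oob):
        # Simulates restoring the history when the page is fetched again.
        self.counts[page_no] = list(oob.get(page_no, [0] * self.lines_per_page))
```

Because the counters ride along with the page's own write-back, the history costs no extra flash writes, which is consistent with the low-overhead goal stated in the abstract.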

[0027] Figure 4 is a schematic diagram of the principle of the delayed refresh mechanism; a dirty flag is maintained for each data page and each cache line. The specific execution process is:

[0028] In the first step, the DRAM cache space is divided into a page cache area and a cache line cache area...
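The clean-first victim selection at the heart of the delayed refresh mechanism can be sketched as below. This is a minimal illustration assuming an LRU-ordered candidate list; the function name and entry layout are invented for the sketch.

```python
def pick_victim(entries):
    """Delayed-refresh victim selection (sketch).

    `entries` is ordered from least to most recently used; each entry
    is a (key, dirty_flag) pair. Prefer the least recently used *clean*
    entry, which can be dropped without any flash write; fall back to
    the LRU dirty entry only when every candidate is dirty.
    """
    for key, dirty in entries:
        if not dirty:
            return key          # clean victim: evicted for free
    return entries[0][0]        # all dirty: must write one back to flash
```

Evicting clean blocks first defers (and often merges) dirty write-backs, which is how the mechanism reduces the number of write requests reaching the flash memory.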



Abstract

The invention relates to a memory system cache mechanism based on flash memory. Two cache areas are constructed in the DRAM of the flash-based memory system, namely a page cache area and a cache line cache area. On the one hand, the hot cache lines in each data page are identified and retained through a history-aware hotspot identification mechanism, which raises the hit rate of the DRAM cache. On the other hand, a delayed refresh mechanism is adopted: when a cache area is full, a clean data block is preferentially evicted so as to reduce writes to the flash memory. Meanwhile, a low-complexity replacement algorithm is used, augmented with history access records and dirty flag bits, so the time and space overhead of the cache mechanism remains low. By exploiting the characteristics of flash memory, the mechanism effectively reduces the write latency of the memory system and prolongs its service life.

Description

technical field [0001] The present invention belongs to the technical field of flash-based memory systems, and provides a memory system cache mechanism based on flash memory. Through the design of the cache, the load bottleneck when data accesses the flash memory is alleviated, and the read-write latency and service life of the flash-based memory system are improved. Background technique [0002] With the rapid development of the information technology revolution, big data and cloud computing have become the mainstream of today's era. The explosive growth of data and the continuous improvement of computer performance place ever higher requirements on storage systems, which face both capacity and performance challenges. [0003] The integration density of traditional DRAM memory keeps rising and its capacity keeps growing. Although its access latency has not been significa...


Application Information

IPC(8): G06F12/0882; G06F12/123
CPC: G06F12/0882; G06F12/123
Inventors: 肖侬, 陈正国, 陈志广, 刘芳, 陈微, 欧洋, 张航, 邢玉轩
Owner NAT UNIV OF DEFENSE TECH