Data caching method, cache and computer system

A data caching method, cache and computer system, applied in the field of storage, which address the problem of long NVM read and write latency and achieve the effects of reducing delay and improving access efficiency.

Active Publication Date: 2015-11-25
HUAWEI TECH CO LTD +1

AI Technical Summary

Problems solved by technology

However, compared with DRAM, NVM has longer read and write latency.



Examples


Example 1

[0085] Example 1: The Cache searches locally and does not find the corresponding data, which indicates a miss. The Cache then examines the first 5 Cachelines of the LRU linked list and, according to their location bits, determines that none of them has DRAM as its memory type. The Cache therefore checks the Modify bits of the first 3 Cachelines in the LRU linked list and finds that the Modify bit of the second Cacheline is 0, meaning the line is clean, so this Cacheline is selected as the Cacheline to be replaced. The Cache may then read the Cacheline containing the data at access address 0x08000000 into the Cache to replace the selected Cacheline, and return the read data to the CPU. In addition, the Cache can judge from the access address (0x08000000) that the data is stored in DRAM, so the newly read Cacheline, which now occupies the position of the second Cacheline, is added to the end of the LRU linked list and its location bit is set to 0 (indicating that this Cacheline comes from DRAM), a...
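The victim-selection walk described in this example can be sketched in C as follows. This is a minimal illustration assuming a simple array-backed LRU list; only the location/Modify bit semantics and the scan depths of 5 and 3 come from the example, while the struct fields, helper names and scenario values are illustrative assumptions rather than the patent's actual implementation.

/*
 * Minimal sketch of the victim selection described in Example 1.
 * The scan depths (5 and 3) and the bit meanings follow the example text;
 * everything else (types, names, values) is illustrative.
 */
#include <stdio.h>

#define SCAN_DRAM  5   /* LRU-front entries checked for a DRAM-backed line */
#define SCAN_CLEAN 3   /* fallback: entries checked for a clean line */

struct cacheline {
    unsigned long tag;  /* identifies the cached memory block */
    int location;       /* 0 = data comes from DRAM, 1 = data comes from NVM */
    int modify;         /* 0 = clean, 1 = dirty (would need write-back first) */
};

/* lru[0] is the least recently used entry; returns the victim index, or -1. */
static int pick_victim(const struct cacheline *lru, int n)
{
    /* Prefer a DRAM-backed line among the first SCAN_DRAM entries. */
    for (int i = 0; i < n && i < SCAN_DRAM; i++)
        if (lru[i].location == 0)
            return i;

    /* None found: fall back to the first clean line among the first SCAN_CLEAN. */
    for (int i = 0; i < n && i < SCAN_CLEAN; i++)
        if (lru[i].modify == 0)
            return i;

    return -1;  /* no suitable victim in the scanned window */
}

int main(void)
{
    /* Scenario from Example 1: all scanned lines are NVM-backed,
     * and the second line (index 1) is clean. */
    struct cacheline lru[5] = {
        { 0x100, 1, 1 },
        { 0x200, 1, 0 },  /* clean -> selected as the Cacheline to be replaced */
        { 0x300, 1, 1 },
        { 0x400, 1, 1 },
        { 0x500, 1, 1 },
    };

    printf("victim index: %d\n", pick_victim(lru, 5));  /* prints 1 */
    return 0;
}

As in the example, the selected line is then overwritten by the newly read Cacheline, which is appended to the tail of the LRU list with its location bit set according to the memory type of the access address.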



Abstract

A data caching method, a cache and a computer system. In the method, when an access request misses and a cache line to be replaced must be determined, the cache takes into account not only the historical access frequency of each cache line but also the memory type corresponding to each cache line, so that cache lines whose memory type is DRAM are replaced preferentially. This reduces the amount of DRAM-backed data held in the cache and allows the cache to hold more data stored in the NVM, so that access requests for data stored in the NVM can, as far as possible, be served from the cache. Fewer reads therefore go to the NVM, the delay of reading data from the NVM is reduced, and access efficiency is effectively improved.
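As an illustration of how a cache could judge the memory type corresponding to an access address, the following sketch assumes a single address boundary between a DRAM region and an NVM region. The boundary value DRAM_LIMIT and the function name are assumptions for illustration only; the patent text only states that the access address determines whether the data resides in DRAM or NVM.

/*
 * Sketch of deriving a cache line's location bit from the access address.
 * DRAM_LIMIT is an assumed example boundary, not a value from the patent.
 */
#include <stdio.h>

#define DRAM_LIMIT 0x10000000UL  /* assumed: addresses below this map to DRAM */

/* Returns the location bit: 0 for DRAM-backed data, 1 for NVM-backed data. */
static int location_bit(unsigned long addr)
{
    return addr < DRAM_LIMIT ? 0 : 1;
}

int main(void)
{
    /* The access address used in Example 1 falls in the assumed DRAM range. */
    printf("0x08000000 -> location %d (DRAM)\n", location_bit(0x08000000UL));
    printf("0x40000000 -> location %d (NVM)\n",  location_bit(0x40000000UL));
    return 0;
}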

Description

Technical Field

[0001] Embodiments of the present invention relate to storage technology, and in particular, to a data caching method, a cache and a computer system.

Background

[0002] At present, more and more applications are data-centric, for example Internet applications and big data applications. These applications require powerful storage support.

[0003] In the prior art, dynamic random access memory (Dynamic Random-Access Memory, hereinafter referred to as DRAM) is generally used as the memory of a computer system. However, limited by its manufacturing process, DRAM has a small capacity and high energy consumption, and it is difficult to meet application requirements for large capacity and low energy consumption. In recent years, non-volatile memory (Non-Volatile Memory, hereinafter referred to as NVM) has come into wide use. It has the advantages of large storage capacity and low energy consumption. Using NVM to replace DRAM as the memory of a computer system can meet the needs of applications ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/06, G06F12/08
CPC: G06F3/06, G06F12/08, G06F3/061, G06F3/0631, G06F3/068, G06F3/0685, G06F12/0802, G06F3/0619, G06F3/065, G06F12/0868, G06F12/122, G06F12/128, G06F2212/604
Inventors 魏巍, 张立新, 熊劲, 蒋德钧
Owner HUAWEI TECH CO LTD