
Data caching method, cache and computer system

A data caching technology in the storage field. It addresses the problem of NVM's long read and write latency, reducing delay and improving access efficiency.

Active Publication Date: 2018-04-10
HUAWEI TECH CO LTD +1

AI Technical Summary

Problems solved by technology

However, compared with DRAM, NVM has a longer read and write latency.



Examples


Example 1

[0085] Example 1: The Cache searches locally and does not find the corresponding data, indicating a miss. The Cache then examines the first 5 Cache lines of the LRU linked list and determines, from their location bits, that none of them corresponds to the DRAM memory type. Next, the Cache checks the Modify bits of the first three Cache lines in the LRU linked list and finds that the Modify bit of the second Cache line is 0, meaning clean; this Cache line becomes the Cache line to be replaced. The Cache may read the Cache line containing the data at access address 0x08000000 into the Cache to replace that Cache line, and return the read data to the CPU. Furthermore, from the access address (0x08000000) the Cache can determine that the data is stored in DRAM, so the newly read Cache line is added to the tail of the LRU linked list and its location bit is set to 0 (to prove that the first Cach...
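The victim-selection walk in Example 1 can be sketched as follows. This is a minimal illustrative model, not the patented implementation: the names `CacheLine`, `location_bit`, and `modify_bit`, the bit encodings, and the scan depths of 5 and 3 are assumptions taken from the example's wording.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CacheLine:
    tag: int
    location_bit: int  # 0 = backed by DRAM, 1 = backed by NVM (assumed encoding)
    modify_bit: int    # 0 = clean, 1 = dirty

def select_victim(lru_list: list) -> Optional[CacheLine]:
    """Pick a Cache line to replace, scanning from the LRU end of the list.

    Step 1: among the first 5 LRU entries, prefer a DRAM-backed line
            (cheap to refetch later, per the patent's policy).
    Step 2: otherwise, among the first 3 entries, prefer a clean line
            (Modify bit 0), which can be dropped without a write-back.
    Step 3: fall back to the overall least-recently-used line.
    """
    for line in lru_list[:5]:
        if line.location_bit == 0:      # DRAM-backed: evict preferentially
            return line
    for line in lru_list[:3]:
        if line.modify_bit == 0:        # clean: no write-back needed
            return line
    return lru_list[0] if lru_list else None

# Mirrors Example 1: no DRAM-backed line among the first 5 entries,
# but the second entry is clean, so it is chosen as the victim.
lru = [
    CacheLine(tag=0x1, location_bit=1, modify_bit=1),
    CacheLine(tag=0x2, location_bit=1, modify_bit=0),
    CacheLine(tag=0x3, location_bit=1, modify_bit=1),
    CacheLine(tag=0x4, location_bit=1, modify_bit=1),
    CacheLine(tag=0x5, location_bit=1, modify_bit=1),
]
victim = select_victim(lru)
print(hex(victim.tag))  # the second Cache line is selected
```

Note the design choice this sketch reflects: eviction cost is asymmetric, so a DRAM-backed line is a cheaper victim than an NVM-backed one, and a clean line is cheaper than a dirty one.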



Abstract

Embodiments of the present invention provide a data caching method, cache, and computer system. When an access request misses and the Cache must select a Cache line to replace, it considers not only the historical access frequency of each Cache line but also the memory type corresponding to it, so that Cache lines corresponding to the DRAM memory type are replaced preferentially. This reduces the amount of DRAM-backed data the Cache holds, allowing it to cache more NVM-backed data, so that access requests for data stored in NVM can be served from the Cache as often as possible. As a result, reads from NVM occur less often, the delay of reading data from NVM is reduced, and access efficiency is effectively improved.
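The policy above requires the Cache to know which memory type an access address maps to, as in Example 1, where address 0x08000000 is determined to lie in DRAM. A simple way to model this is an address-range check; the boundary address below is a hypothetical assumption for illustration, as the text does not specify the physical memory layout.

```python
# Hypothetical physical address map: the boundary 0x10000000 is an
# assumption for illustration; the patent does not specify the layout.
DRAM_BASE, DRAM_LIMIT = 0x00000000, 0x10000000  # assumed DRAM region

def memory_type(addr: int) -> str:
    """Classify an access address by the memory region it falls in."""
    return "DRAM" if DRAM_BASE <= addr < DRAM_LIMIT else "NVM"

# Example 1's address 0x08000000 falls in the assumed DRAM region, so
# the newly fetched Cache line would get location bit 0 and be inserted
# at the LRU tail, making it a preferred eviction candidate and leaving
# more room for NVM-backed data.
print(memory_type(0x08000000))
```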

Description

Technical Field

[0001] Embodiments of the present invention relate to storage technology, and in particular to a data caching method, cache, and computer system.

Background

[0002] More and more applications, such as Internet applications and big data applications, are data-centric. These applications require powerful storage support.

[0003] In the prior art, dynamic random-access memory (Dynamic Random-Access Memory, hereinafter: DRAM) is generally used as the memory of a computer system. However, limited by its manufacturing process, DRAM has small capacity and high energy consumption, and it is difficult to meet application requirements for large capacity and low energy consumption. In recent years, non-volatile memory (Non-Volatile Memory, hereinafter: NVM) has been widely adopted; it offers large storage capacity and low energy consumption. Using NVM to replace DRAM as the memory of a computer system can meet the needs of applications ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/06; G06F12/121
CPC: G06F3/061; G06F3/0631; G06F3/068; G06F3/0685; G06F12/08; G06F12/0802; G06F3/0619; G06F3/065; G06F12/0868; G06F12/122; G06F12/128; G06F2212/604
Inventors: 魏巍, 张立新, 熊劲, 蒋德钧
Owner: HUAWEI TECH CO LTD