
DRAM/NVM hierarchical heterogeneous memory access method and system with software-hardware cooperative management

A hierarchical heterogeneous memory and access method technology, applied in the field of cache performance optimization, which addresses the problems of large hardware cost and relatively high read/write delay in existing systems, so as to reduce memory access delay and eliminate hardware cost.

Active Publication Date: 2017-09-28
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Benefits of technology

The present invention provides a new DRAM/NVM hierarchical memory system with software-hardware cooperative management. It aims to reduce the hardware cost and memory access delay of a conventional system. The invention monitors memory accesses using reserved bits in the TLB, eliminating the hardware cost of conventional systems that monitor page access frequency in the memory controller. It also provides a utility-based cache fetching method that dynamically adjusts the fetching threshold according to application memory access locality and DRAM cache utilization, improving the utilization of the DRAM cache and the bandwidth usage between the NVM main memory and the DRAM cache. Overall, the invention eliminates large hardware costs, reduces memory access delay, improves cache utilization, and enhances bandwidth usage between NVM and DRAM.
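As a rough illustration of how such a utility-based fetching policy might adjust its threshold, the following C sketch is provided. All identifiers (adjust_fetch_threshold, dram_utilization, locality_score) and the concrete heuristics are assumptions made for illustration and are not taken from the patent text.

```c
/* Minimal sketch, assuming a utility-based fetch policy as described above.
 * Names and heuristics are illustrative assumptions, not the patent's. */
#include <stdint.h>
#include <stdio.h>

struct fetch_policy {
    uint32_t threshold;      /* accesses required before a page is cached */
    uint32_t min_threshold;
    uint32_t max_threshold;
};

/* dram_utilization: fraction of DRAM cache pages in use (0.0 .. 1.0)
 * locality_score:   fraction of recent accesses hitting already-cached pages */
static void adjust_fetch_threshold(struct fetch_policy *p,
                                   double dram_utilization,
                                   double locality_score)
{
    if (dram_utilization > 0.9 && locality_score < 0.5) {
        /* Cache is nearly full and reuse is poor: become more selective. */
        if (p->threshold < p->max_threshold)
            p->threshold++;
    } else if (dram_utilization < 0.5 && locality_score > 0.5) {
        /* Plenty of free cache and good reuse: fetch more aggressively. */
        if (p->threshold > p->min_threshold)
            p->threshold--;
    }
}

int main(void)
{
    struct fetch_policy p = { .threshold = 2, .min_threshold = 1, .max_threshold = 8 };
    adjust_fetch_threshold(&p, 0.95, 0.3);   /* pressured cache -> raise threshold */
    printf("new fetch threshold: %u\n", p.threshold);
    return 0;
}
```

The underlying idea is simply that a nearly full cache with poor reuse should raise the bar for fetching, while a lightly used cache with good reuse can afford to fetch more aggressively.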

Problems solved by technology

With the development of multi-core and multi-threading technology, Dynamic Random Access Memory (DRAM) can no longer meet the growing memory demand of applications due to restrictions in power consumption and technology scaling.
However, compared with DRAM, these new non-volatile memories still have several disadvantages: (1) relatively high read/write latency, where reads are roughly twice as slow as DRAM reads and writes are almost five times slower than DRAM writes; (2) high write power consumption; and (3) limited write endurance.
Therefore, it is infeasible to use these emerging non-volatile memories directly as the computer's main memory.
Moreover, one page migration operation may involve four page-copy operations, and the time cost of migration is relatively large because the reading phase and the writing phase are performed sequentially.
Besides, if a memory system supports 2 MB or 4 MB superpages to reduce the TLB miss rate, the hot-page migration mechanism can lead to tremendous time and space overhead.
This implies that the hierarchical heterogeneous memory system has a relatively long access delay when a DRAM cache miss occurs.
In big data environments, many applications have poor temporal/spatial locality, and such a data fetching mechanism would aggravate cache pollution.




Embodiment Construction

[0029] To illustrate the objectives, technical solutions, and advantages of the present invention more clearly, the following further describes the details of this invention with figures and case studies. It should be noted that the specific cases described here are only used to illustrate the present invention rather than to limit its application scenarios.

[0030] FIG. 1 shows the system architecture of a DRAM/NVM hierarchical heterogeneous memory system with software-hardware cooperative management. The hardware layer includes a modified TLB, and the software layer includes an extended page table, a utility-based data fetching module, and a DRAM cache management module. The extended page table is mainly used to manage the mapping from virtual pages to physical pages and the mapping from NVM memory pages to DRAM cache pages. The modified TLB caches page table entries that are frequently accessed in the extended page table, thereby improving the efficiency of address translation.
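A hypothetical sketch of what one such extended page table entry might look like is shown below: otherwise-reserved bits are reused to track page accesses and to record the NVM-page to DRAM-cache-page mapping. Field names and widths are assumptions for illustration and are not taken from the patent.

```c
/* Illustrative extended page table entry, assuming the description in [0030].
 * Field names and widths are assumptions, not the patent's actual layout. */
#include <stdbool.h>
#include <stdint.h>

struct extended_pte {
    uint64_t nvm_pfn;      /* physical frame number of the NVM page           */
    uint32_t dram_frame;   /* index of the DRAM cache page, valid if cached   */
    uint8_t  access_cnt;   /* small saturating counter kept in reserved bits, */
                           /* maintained by software instead of extra HW      */
    bool     cached;       /* does this page currently have a DRAM copy?      */
};

/* On a TLB refill for this page, software bumps the counter; this is how the
 * design can avoid dedicated access-monitoring hardware in the memory
 * controller. */
static void on_tlb_refill(struct extended_pte *pte)
{
    if (pte->access_cnt < UINT8_MAX)
        pte->access_cnt++;
}
```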



Abstract

The present invention provides a DRAM/NVM hierarchical heterogeneous memory system with software-hardware cooperative management schemes. In the system, NVM is used as large-capacity main memory, and DRAM is used as a cache to the NVM. Reserved bits in the data structures of the TLB and the last-level page table are employed to eliminate the hardware costs of a conventional hardware-managed hierarchical memory architecture, and cache management in the heterogeneous memory system is pushed to the software level. Moreover, the invention is able to reduce memory access latency in the case of last-level cache misses. Considering that many applications have relatively poor data locality in big data environments, the conventional demand-based data fetching policy for the DRAM cache can aggravate cache pollution. In the present invention, a utility-based data fetching mechanism is adopted in the DRAM/NVM hierarchical memory system; it determines whether data in the NVM should be cached in the DRAM according to the current DRAM cache utilization and the application's memory access patterns. This improves the efficiency of the DRAM cache and the bandwidth usage between the NVM main memory and the DRAM cache.
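A minimal sketch of the per-access decision implied by the abstract follows: on a DRAM cache miss the request is served directly from NVM, and the page is copied into the DRAM cache only once its access count reaches the current fetch threshold. The identifiers here are assumptions for illustration, not an API defined by the patent.

```c
/* Minimal sketch, assuming a threshold-based fetch decision on DRAM cache
 * misses. Identifiers are illustrative assumptions, not the patent's API. */
#include <stdbool.h>
#include <stdint.h>

struct page_state {
    uint32_t access_cnt;   /* software-maintained access counter   */
    bool     in_dram;      /* page currently has a DRAM cache copy */
};

/* Returns true if the page should now be fetched into the DRAM cache. */
static bool should_fetch(struct page_state *pg, uint32_t fetch_threshold)
{
    if (pg->in_dram)
        return false;          /* already cached; nothing to decide  */
    pg->access_cnt++;          /* count this access served from NVM  */
    return pg->access_cnt >= fetch_threshold;
}
```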

Description

TECHNICAL FIELD
[0001] The present invention belongs to the field of cache performance optimization in a DRAM/NVM heterogeneous memory environment. In particular, a DRAM/NVM hierarchical heterogeneous memory access method and system with software-hardware cooperative management schemes are designed, and a utility-based data fetching mechanism is proposed for this system.
BACKGROUND ART
[0002] With the development of multi-core and multi-threading technology, Dynamic Random Access Memory (DRAM) can no longer meet the growing memory demand of applications due to restrictions in power consumption and technology scaling. Emerging Non-Volatile Memories (NVMs), such as Phase Change Memory (PCM), Spin Transfer Torque Magnetoresistive Random Access Memory (STT-MRAM), and Magnetic Random Access Memory (MRAM), have features such as byte-addressability, read speed comparable to DRAM, near-zero standby power consumption, high density (storing more data per chip), and high scalability, and m...


Application Information

Patent Type & Authority: Applications (United States)
IPC(8): G06F12/1045; G06F12/0862
CPC: G06F12/1054; G06F12/0862; G06F2212/1024; G06F2212/68; G06F2212/22; G06F2212/602; G06F2212/202; G06F12/0238; G06F12/1009; G06F2212/1041; G06F2212/1056; G06F2212/222; G06F12/1027
Inventor: JIN, HAI; LIAO, XIAOFEI; LIU, HAIKUN; CHEN, YUJIE; GUO, RENTONG
Owner: HUAZHONG UNIV OF SCI & TECH