
High-performance-oriented intelligent cache replacement strategy adaptive to prefetching

A high-performance cache replacement technology, applied to memory systems, memory address allocation/relocation, instruments, etc. It addresses the problem of declining performance gains under prefetching, and achieves the effects of reducing interference, avoiding cache pollution, and delivering a clear performance advantage.

Pending Publication Date: 2021-08-24
BEIJING UNIV OF TECH
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0007] Hardware prefetchers are widely used in modern high-performance processors, yet state-of-the-art intelligent cache replacement strategies show declining performance gains in the presence of prefetching. To solve this problem, the present invention proposes an intelligent cache replacement strategy adapted to prefetching, which performs reuse prediction at the granularity of request type (demand or prefetch).

Method used




Embodiment Construction

[0029] In order to make the purpose, technical solutions, and advantages of the present invention clearer, the embodiments of the present invention are described in detail below in conjunction with the accompanying drawings.

[0030] The present invention relates to an intelligent cache replacement strategy adapted to prefetching. As shown in figure 1, its main components are a demand request predictor and a prefetch request predictor, which make reuse-interval predictions, and DMINgen, which simulates the Demand-MIN algorithm to provide the labels for training the predictors. The design comprises two parts: training and prediction for the demand predictor and for the prefetch predictor. The input to the demand predictor includes the PC address of the load instruction that generates the demand access and the past PC addresses stored in the PCHR; the input to the prefetch predictor includes the PC address of the load instruction that tri...
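The per-request-type predictor design described above can be sketched as follows. This is an illustrative sketch only: the constants (`PCHR_LEN`, `THRESHOLD`, the weight bounds) and all names are assumptions, not values from the patent, and the integer-weight update is a generic perceptron-style ISVM approximation.

```python
# Hypothetical sketch of per-request-type ISVM reuse predictors:
# one integer-weight table for demand loads, one for prefetches.
# All constants below are illustrative assumptions, not patent values.
from collections import defaultdict, deque

PCHR_LEN = 5           # length of the PC history register (assumed)
THRESHOLD = 0          # reuse decision threshold (assumed)
W_MAX, W_MIN = 8, -8   # saturating integer weight bounds (assumed)

class ISVMPredictor:
    """Integer-SVM-style predictor: one weight per (trigger PC, history PC) pair."""
    def __init__(self):
        self.weights = defaultdict(int)

    def predict(self, pc, pchr):
        # Sum the weights selected by the current PC and each history PC.
        score = sum(self.weights[(pc, h)] for h in pchr)
        return score >= THRESHOLD  # True => line predicted to be reused

    def train(self, pc, pchr, reused):
        # DMINgen-style label: increment weights if Demand-MIN would have
        # cached the line, decrement otherwise, with saturation.
        delta = 1 if reused else -1
        for h in pchr:
            w = self.weights[(pc, h)] + delta
            self.weights[(pc, h)] = max(W_MIN, min(W_MAX, w))

# Separate predictors per request type, as the design specifies.
demand_isvm, prefetch_isvm = ISVMPredictor(), ISVMPredictor()
pchr = deque(maxlen=PCHR_LEN)  # PC history register

def on_access(pc, is_prefetch, reused_label):
    """Train the matching predictor, predict reuse, then update the PCHR."""
    predictor = prefetch_isvm if is_prefetch else demand_isvm
    predictor.train(pc, list(pchr), reused_label)
    prediction = predictor.predict(pc, list(pchr))
    pchr.append(pc)
    return prediction
```

Keeping two weight tables means a PC that prefetches dead lines but demands useful ones is not forced into a single averaged prediction, which is the point of request-type granularity.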



Abstract

The invention discloses a high-performance-oriented intelligent cache replacement strategy adaptive to prefetching. In the presence of a hardware prefetcher, prefetch requests and demand requests are distinguished: a prefetch predictor based on an ISVM (Integer Support Vector Machine) performs re-reference interval prediction for cache lines loaded by prefetch accesses, and an ISVM-based demand predictor performs re-reference interval prediction for cache lines loaded by demand accesses. Taking as input the PC address of the current load instruction and the PC addresses of past load instructions in the access history, separate ISVM predictors are designed for prefetch and demand requests, and reuse prediction is performed on loaded cache lines at the granularity of request type. This improves the accuracy of cache-line reuse prediction under prefetching and better combines the performance gains of hardware prefetching and cache replacement.

Description

technical field

[0001] The invention belongs to the field of computer cache architecture, and in particular relates to a high-performance-oriented intelligent cache replacement strategy adapted to prefetching.

Background technique

[0002] The performance of computer memory has improved far more slowly than that of processors, forming a "memory wall" that hinders processor performance and makes the memory system one of the performance bottlenecks of the entire computer system. The last-level cache (LLC) bridges the large latency and bandwidth gap between the CPU and DRAM, and improving the processor's memory subsystem is key to alleviating the "memory wall" problem. One approach relies on well-designed cache replacement strategies to efficiently manage the on-chip last-level cache, dynamically adjusting cache insertion decisions to prioritize data by reusability and importance...
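As a minimal illustration of the replacement-strategy idea in the background above, the toy sketch below models one set of a set-associative cache whose eviction policy prefers lines predicted not to be reused before falling back to plain LRU. It is a generic teaching example, not the patented design; the class and flag names are invented.

```python
# Toy set-associative cache set: reuse-prediction-aware eviction with
# LRU fallback. Illustrative only; names and policy are assumptions.
class CacheSet:
    def __init__(self, ways):
        self.ways = ways
        self.lines = []  # ordered MRU -> LRU; entries are (tag, reuse_friendly)

    def access(self, tag, reuse_friendly):
        """Return True on hit; on miss, insert the line, evicting if full."""
        for i, (t, _) in enumerate(self.lines):
            if t == tag:                          # hit: promote to MRU
                self.lines.insert(0, self.lines.pop(i))
                return True
        if len(self.lines) >= self.ways:          # miss in a full set: evict
            # Prefer the LRU-most line predicted NOT to be reused.
            for i in range(len(self.lines) - 1, -1, -1):
                if not self.lines[i][1]:
                    self.lines.pop(i)
                    break
            else:
                self.lines.pop()                  # all friendly: LRU victim
        self.lines.insert(0, (tag, reuse_friendly))
        return False
```

The point of the example: with accurate reuse predictions, a dead line (e.g. one brought in by a useless prefetch) is evicted ahead of still-live data, which is exactly the lever an intelligent replacement policy pulls.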

Claims


Application Information

IPC (IPC8): G06F12/0811; G06F12/0871; G06K9/62; G06F12/06
CPC: G06F12/0811; G06F12/0871; G06F12/0646; G06F18/214; Y02D10/00
Inventor 方娟杨会静滕自怡蔡旻
Owner BEIJING UNIV OF TECH