Method for prefetching CPU cache data based on merged address difference value sequence

A CPU cache data prefetching technology, applied in the fields of electrical digital data processing, memory systems, and instruments, which improves prediction accuracy, reduces the cache miss rate, and improves coverage.

Active Publication Date: 2021-11-16
SHANGHAI ADVANCED RES INST CHINESE ACADEMY OF SCI

Problems solved by technology

[0004] The purpose of the present invention is to provide a CPU cache data prefetching method based on merged address difference value sequences, to solve the prior-art problem that multi-table cascading is required to store difference value sequences of multiple lengths: only one table is used for storage, which simplifies the storage and query logic of the memory access pattern.
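For illustration only, the following C++ sketch shows one way a single flat table could hold complete difference value sequence segments at a fixed maximum length, with shorter query sequences resolved by prefix matching against the stored segments instead of cascading one table per sequence length. The type names (SequenceSegment, SingleSegmentTable) and the maximum length are assumptions for this sketch, not structures taken from the patent.

```cpp
// Illustrative sketch only: one flat table of complete difference value
// sequence segments (fixed maximum length), queried by prefix matching,
// in place of a cascade of per-length tables. Names/sizes are assumptions.
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

constexpr int kMaxSegLen = 8;  // assumed upper bound on segment length

struct SequenceSegment {
    std::array<int64_t, kMaxSegLen> deltas{};  // address difference values
    int length = 0;                            // number of valid deltas
};

struct SingleSegmentTable {
    std::vector<SequenceSegment> entries;  // the single table

    // A query of any length up to kMaxSegLen is answered by comparing it
    // against the head (prefix) of each stored complete segment.
    static bool prefixMatches(const std::vector<int64_t>& query,
                              const SequenceSegment& seg) {
        if (static_cast<int>(query.size()) > seg.length) return false;
        for (std::size_t i = 0; i < query.size(); ++i)
            if (query[i] != seg.deltas[i]) return false;
        return true;
    }
};
```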

Examples

Embodiment Construction

[0038] The present invention will be described in further detail below through specific examples and accompanying drawings.

[0039] Figure 1 shows the applicable environment and design architecture of the CPU cache data prefetching method based on the merged address difference value sequence of the present invention. The method is applicable to the L1 data cache of a core in a conventional CPU (in this embodiment, core No. 0, Core0); it can also be applied to the L2 and L3 data caches, with a corresponding reduction in performance. Furthermore, each core in a conventional CPU can adopt its own independent instance of the method to realize data prefetching, so that data prefetching is achieved for every core in the CPU.
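As a purely illustrative software framing of this per-core arrangement, the sketch below instantiates one independent prefetcher object per core, attached to that core's L1 data-cache access stream. The class and member names are assumptions; the prediction logic itself is sketched after the Abstract below.

```cpp
// Illustrative software framing (not the patent's hardware description):
// each core owns an independent prefetcher instance attached to its own
// L1 data cache; attachment at L2/L3 would look the same.
#include <cstdint>
#include <vector>

class MergedDeltaPrefetcher {
public:
    // Observes one L1 data-cache access of the owning core and returns
    // candidate prefetch addresses; placeholder body in this sketch.
    std::vector<uint64_t> onAccess(uint64_t /*pc*/, uint64_t /*addr*/) {
        return {};
    }
};

struct Core {
    MergedDeltaPrefetcher l1dPrefetcher;  // one private instance per core
};

int main() {
    std::vector<Core> cores(4);  // e.g. a 4-core CPU: four independent prefetchers
    cores[0].l1dPrefetcher.onAccess(0x400123, 0x7fff0040);
    return 0;
}
```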

[0040] The CPU cache data prefet...

Abstract

The invention provides a CPU cache data prefetching method based on a merged address difference value sequence, comprising the following steps: collecting the current memory access information of the data cache to be prefetched and, when it is collected, looking up the historical information table and updating it to obtain the current difference value sequence segment; updating the historical information table, the difference value mapping array, and the difference value sequence segment sub-table according to the current difference value sequence segment, and removing the first difference value to obtain the difference value sequence to be predicted; using the difference value sequence to be predicted to perform multiple matches against the prefix subsequences of the complete sequence segments stored in the dynamic mapping pattern table, obtaining the best-matching complete sequence segment and the corresponding predicted target difference value; and adding the predicted target difference value to the memory access address in the current memory access information to obtain the predicted target address. This solves the prior-art problem that multi-table cascading is needed to store difference value sequences of multiple lengths: only one table is used for storage, which simplifies the storage and query logic of the memory access pattern.
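The following C++ sketch approximates the prediction pipeline summarized above under simplified assumptions: a per-PC history replaces the historical information table, a single vector of stored segments stands in for the dynamic mapping pattern table, the difference value mapping array and sequence segment sub-table are omitted, and "best match" is reduced to a full prefix match. All identifiers and sizes are illustrative, not the patent's.

```cpp
// Simplified, illustrative approximation of the pipeline in the Abstract.
// The patent's historical information table, difference value mapping
// array, sequence segment sub-table and dynamic mapping pattern table are
// collapsed into a per-PC history map and one vector of stored segments.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <deque>
#include <optional>
#include <unordered_map>
#include <vector>

constexpr std::size_t kSegLen = 6;  // assumed length of a complete segment

struct PatternEntry {
    std::vector<int64_t> deltas;  // a stored complete sequence segment
};

class MergedDeltaPrefetcher {
public:
    // One call per memory access of the cache to be prefetched.
    std::optional<uint64_t> onAccess(uint64_t pc, uint64_t addr) {
        // (1) Update the history for this PC with the new address difference.
        History& hist = history_[pc];
        if (hist.valid)
            hist.deltas.push_back(static_cast<int64_t>(addr - hist.lastAddr));
        hist.lastAddr = addr;
        hist.valid = true;
        if (hist.deltas.size() < kSegLen) return std::nullopt;

        // (2) Store the complete segment, then drop its first difference to
        //     form the sequence to be predicted; keep a sliding window.
        std::vector<int64_t> segment(hist.deltas.begin(), hist.deltas.end());
        patterns_.push_back({segment});
        std::vector<int64_t> toPredict(segment.begin() + 1, segment.end());
        hist.deltas.pop_front();

        // (3) Prefix-match the to-be-predicted sequence against stored
        //     complete segments (a full prefix match stands in for "best").
        const PatternEntry* best = nullptr;
        for (const PatternEntry& p : patterns_) {
            if (p.deltas.size() <= toPredict.size()) continue;
            if (std::equal(toPredict.begin(), toPredict.end(), p.deltas.begin()))
                best = &p;
        }
        if (!best) return std::nullopt;

        // (4) Predicted target address = current address + predicted difference.
        return addr + static_cast<uint64_t>(best->deltas[toPredict.size()]);
    }

private:
    struct History {
        bool valid = false;
        uint64_t lastAddr = 0;
        std::deque<int64_t> deltas;
    };
    std::unordered_map<uint64_t, History> history_;
    std::vector<PatternEntry> patterns_;
};

int main() {
    MergedDeltaPrefetcher pf;
    // A simple 64-byte strided stream: once a full segment has been seen,
    // the sketch predicts the next address in the stride.
    for (uint64_t a = 0x1000; a < 0x1000 + 64 * 16; a += 64)
        if (auto target = pf.onAccess(/*pc=*/0x400500, a))
            std::printf("access 0x%llx -> prefetch 0x%llx\n",
                        (unsigned long long)a, (unsigned long long)*target);
    return 0;
}
```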

Description

Technical Field

[0001] The invention belongs to the technical field of CPU chip hardware architecture, and in particular relates to a method for prefetching CPU cache data based on a merged address difference value sequence.

Background Technology

[0002] The high latency of CPU memory access is one of the main bottlenecks hindering CPU performance improvement. A cache data prefetcher predicts the data addresses the CPU will need for computation and loads the corresponding data from memory into the cache in advance, thereby reducing the average memory access latency and improving the overall performance of the CPU. Coverage, accuracy, and timeliness are the three main criteria for measuring prefetcher performance.

[0003] Currently, mainstream prefetching method designs fall into two main categories: spatial prefetching and temporal prefetching. Among them, temporal prefetching has not been widely adopted in industry because it requires too much hardwar...


Application Information

IPC (8): G06F12/0862; G06F12/0882; G06F12/0811; G06F12/0842
CPC: G06F12/0862; G06F12/0882; G06F12/0811; G06F12/0842; G06F2212/1016; Y02D10/00
Inventors: 蒋实知, 慈轶为, 杨秋松, 李明树
Owner: SHANGHAI ADVANCED RES INST CHINESE ACADEMY OF SCI