
Cache replacement system and method based on instruction stream and memory access mode learning

A cache replacement and pattern-learning technology, applied in memory systems, machine learning, and computing models; it addresses the difficulty and uncertainty of learning memory access behavior patterns, with the effect of reducing uncertainty and avoiding interference

Pending Publication Date: 2022-01-28
SHANGHAI ADVANCED RES INST CHINESE ACADEMY OF SCI

AI Technical Summary

Problems solved by technology

It is difficult to learn memory access behavior patterns from such a complex and variable memory access sequence, and great uncertainty must be confronted
[0005] None of the existing cache replacement methods can effectively reduce this uncertainty




Embodiment Construction

[0058] The present invention will be described in further detail below through specific embodiments and the accompanying drawings.

[0059] The cache replacement system and method based on instruction stream and memory access pattern learning of the present invention are applicable to the processor microarchitecture shown in figure 1. The processor microarchitecture includes at least five stages: instruction fetch, decode, execute, memory access, and write-back, which correspond to the instruction fetch unit 100, the decoding unit 200, the execution unit 300, the memory access unit 400, and the write-back unit 500, respectively. The invention is equally applicable to more complex processor microarchitectures with the functions shown in figure 1; in such a microarchitecture a given stage may be further subdivided, for example the execution stage may be split into three sub-stages: renaming, scheduling, and execution.
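The five-stage organization above can be sketched as a simple enumeration. This is an illustrative sketch only: the mapping of stage names to the unit numbers 100–500 follows the description, while the sub-stage tuple is an assumption based on the renaming/scheduling/execution split mentioned in [0059].

```python
from enum import Enum

class PipelineStage(Enum):
    """The five pipeline stages named in [0059], valued by unit number."""
    FETCH = 100        # instruction fetch unit 100
    DECODE = 200       # decoding unit 200
    EXECUTE = 300      # execution unit 300
    MEM_ACCESS = 400   # memory access unit 400
    WRITE_BACK = 500   # write-back unit 500

# In a more complex microarchitecture a stage may be subdivided; per [0059],
# the execute stage may split into renaming, scheduling, and execution.
EXECUTE_SUBSTAGES = ("rename", "schedule", "execute")

print([stage.name for stage in PipelineStage])
# → ['FETCH', 'DECODE', 'EXECUTE', 'MEM_ACCESS', 'WRITE_BACK']
```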

[0060] Such as figure 2 A...



Abstract

The invention provides a cache replacement system based on instruction stream and memory access pattern learning. The system comprises: a branch prediction module for predicting the instruction stream and writing it into an instruction fetch address queue; a memory access instruction recording module for recording information about committed memory access instructions in order, writing it into its memory access instruction buffer, and querying the buffer to obtain a memory access instruction sequence; a memory access pattern learning module that records the memory access instruction sequence in a memory access history buffer, learns a memory access pattern from the sequence, predicts the memory access physical addresses of upcoming memory access instructions, and writes them into a memory access address queue; and a cache replacement decision module that receives the physical addresses of the cache replacement candidates, searches the instruction fetch address queue or the memory access address queue with each physical address, selects one candidate as the kick-out item according to the obtained reuse distance, and feeds it back to the first-level cache. The invention further provides a corresponding method. The cache replacement system avoids the interference of out-of-order execution and cache prefetching and improves the accuracy of memory access sequence prediction.
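The decision module's selection step can be illustrated with a minimal Python sketch. The queue contents, function names, and tie-breaking policy here (evict the candidate whose predicted reuse is farthest away, treating an address absent from both queues as beyond the prediction horizon) are assumptions for illustration, not the patented implementation.

```python
from collections import deque

def reuse_distance(addr, predicted_queue):
    """Position of addr's next predicted use in a predicted address queue.

    Returns len(predicted_queue) ("beyond the prediction horizon") when the
    address does not appear, so such candidates are preferred for eviction.
    """
    for i, a in enumerate(predicted_queue):
        if a == addr:
            return i
    return len(predicted_queue)

def pick_victim(candidates, fetch_queue, access_queue):
    """Choose the replacement candidate with the largest reuse distance.

    Each candidate physical address is looked up in both the predicted
    instruction-fetch address queue and the predicted memory-access address
    queue; the nearer of the two predicted uses counts for that candidate.
    """
    def nearest_reuse(addr):
        return min(reuse_distance(addr, fetch_queue),
                   reuse_distance(addr, access_queue))
    return max(candidates, key=nearest_reuse)

# Hypothetical predicted queues and replacement candidates.
fetch_q = deque([0x400, 0x404, 0x408])
access_q = deque([0x1000, 0x2000, 0x1000, 0x3000])
victim = pick_victim([0x1000, 0x2000, 0x9999], fetch_q, access_q)
print(hex(victim))  # → 0x9999: no predicted reuse, so it is kicked out
```

Under this sketch, 0x1000 and 0x2000 both appear in the predicted access queue, so the candidate with no predicted reuse is fed back to the first-level cache as the kick-out item.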

Description

Technical field

[0001] The invention relates to the technical field of computer architecture, in particular to a cache replacement system and method based on instruction stream and memory access pattern learning.

Background technique

[0002] Cache is an important mechanism in modern processors. Commonly used data is copied from memory into the cache, and subsequent accesses to that data can be served directly from the cache, reducing the number of accesses to slow DRAM memory and improving processor performance. Cache capacity is limited, so replacement of cache contents is inevitable in practice, and cache performance is greatly affected by the cache replacement strategy.

[0003] Cache access behavior is complex: the memory access sequence seen by the cache system is disturbed by the processor's out-of-order execution and by the cache prefetch mechanism, which further increases the difficulty of predicting ...
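The sensitivity of cache performance to the replacement strategy noted in [0002] can be seen in a small simulation. LRU and the looping trace below are stand-in illustrations, not the method of the invention: a loop over three distinct lines defeats a two-line LRU cache entirely, while one extra line of capacity recovers most of the hits.

```python
from collections import OrderedDict

def lru_hit_rate(accesses, capacity):
    """Simulate an LRU cache and return the hit rate over an access trace."""
    cache = OrderedDict()
    hits = 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)        # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the least recently used
            cache[addr] = None
    return hits / len(accesses)

# Hypothetical trace: a loop over 3 distinct cache lines, repeated 4 times.
trace = [0xA, 0xB, 0xC] * 4
print(lru_hit_rate(trace, capacity=2))  # → 0.0  (every access misses)
print(lru_hit_rate(trace, capacity=3))  # → 0.75 (only the first pass misses)
```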

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F12/02, G06F9/30, G06N20/00, G06F12/0862
CPC: G06F12/0862, G06F12/0897, G06F9/3806, G06F9/3842
Inventor: 王玉庆, 杨秋松, 李明树
Owner SHANGHAI ADVANCED RES INST CHINESE ACADEMY OF SCI