Cache memory replacement method and system based on use popularity

A cache memory technology applied in memory systems, instruments, and electrical digital data processing. It addresses the problems that existing replacement strategies cannot be changed, occupy large resources, or achieve only a low hit rate, and achieves increased replacement flexibility, low resource usage, and high versatility.

Active Publication Date: 2020-04-10
TIH MICROELECTRONIC TECH CO LTD +1


Problems solved by technology

However, an algorithm with a high cache hit rate is generally complex to implement and consumes many resources, while a simple algorithm with a small resource occupation generally achieves only a low hit rate.



Examples


Embodiment 1

[0040] The present disclosure provides a cache memory replacement method based on use popularity, including:

[0041] S1: As shown in Figure 1, the cache memory (Cache) is divided into n Cache blocks, and the n Cache blocks together with their corresponding popularity values form a popularity comparison group;

[0042] In this comparison group there are n blocks in total for popularity comparison; the comparison blocks are named B1 to Bn, and their corresponding count values are C1 to Cn.

[0043] S2: Define parameter values

[0044] a. Initial popularity value S: the initial count value of Cx assigned to data block Bx after a data replacement.

[0045] b. Popularity decay factor b: the amount by which a block's popularity count decreases after the block is not hit; this value can be fixed, or it can vary with the current popularity value.

[0046] c. Popularity enhancement factor i: the amount by which a block's popularity count increases after the block is hit; this value can be fixed, or it can vary with the current popularity value.
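Taken together, S, b, and i define a simple per-access counter update: the hit block's counter rises by i while every other block's counter falls by b, and a newly replaced block starts at S. A minimal Python sketch, with the parameter values and the clamp-at-zero behavior chosen purely for illustration (the patent allows b and i to vary with the current popularity value):

```python
# Illustrative parameter values (assumptions, not taken from the patent).
S = 8   # initial popularity value assigned to a block after replacement
b = 1   # popularity decay factor, applied to each un-hit block
i = 2   # popularity enhancement factor, applied to the hit block

def update_counters(counters, hit_index):
    """Update popularity counters C1..Cn after one Cache access.

    counters: list of popularity values for blocks B1..Bn.
    hit_index: index of the hit block, or None on a miss.
    """
    for idx in range(len(counters)):
        if idx == hit_index:
            counters[idx] += i
        else:
            # Clamping at zero is an assumption; the patent text only
            # says the count decreases by the decay factor.
            counters[idx] = max(0, counters[idx] - b)
    return counters

print(update_counters([S, S, S, S], hit_index=1))  # → [7, 10, 7, 7]
```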

Embodiment 2

[0059] The present disclosure provides a cache memory replacement system based on use popularity, comprising:

[0060] A block-division module, used to divide the cache memory Cache into n Cache blocks, the n Cache blocks and their corresponding popularity values forming a popularity comparison group;

[0061] A data reading module, used to judge, according to a received CPU data read request, whether the CPU data to be read exists in the Cache; on a hit, the hit Cache block and its corresponding popularity value are found in the popularity comparison group, that popularity value is increased according to a preset popularity enhancement factor, the popularity values of the remaining un-hit Cache blocks are decayed according to the popularity decay factor, and the CPU data is read from the hit Cache block;

[0062] A replacement module, used, on a miss, to find the Cache block in the popularity comparison group whose popularity value is the smallest and is less than or equal to the replacement threshold, replace the CPU data to be read into that Cache block, and decay the popularity values of the remaining Cache blocks according to the popularity decay factor.
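Combining the two modules above, the read path and the replacement path can be sketched as follows. The class name, the tag-matching scheme, and the concrete values of S, b, i, and the replacement threshold T are assumptions for illustration; on a miss the victim is the block with the smallest popularity value, and it is replaced only if that value is at or below T:

```python
S, b, i, T = 8, 1, 2, 6   # initial value, decay, enhancement, threshold (illustrative)

class PopularityCache:
    """n-block Cache; a miss replaces the lowest-popularity block <= T."""

    def __init__(self, n):
        self.tags = [None] * n    # address held by each Cache block Bx
        self.counts = [0] * n     # popularity counters C1..Cn

    def access(self, addr):
        n = len(self.counts)
        if addr in self.tags:
            # Hit: boost the hit block by i, decay every other block by b.
            hit = self.tags.index(addr)
            for idx in range(n):
                if idx == hit:
                    self.counts[idx] += i
                else:
                    self.counts[idx] = max(0, self.counts[idx] - b)
            return True
        # Miss: pick the block with the smallest popularity value.
        victim = min(range(n), key=lambda idx: self.counts[idx])
        if self.counts[victim] <= T:
            # Replace its data, reset its counter to S, decay the rest.
            self.tags[victim] = addr
            self.counts[victim] = S
            for idx in range(n):
                if idx != victim:
                    self.counts[idx] = max(0, self.counts[idx] - b)
        # If no block is at or below T, nothing is replaced (one plausible
        # reading; the patent text shown does not cover this case).
        return False

cache = PopularityCache(2)
cache.access(0xA0)        # miss: an empty block takes the data at value S
print(cache.access(0xA0)) # → True (hit on the second read)
```

Because every miss decays the surviving blocks while the replaced block restarts at S, blocks that keep being hit stay above the threshold and are never evicted, which is the popularity-based behavior the modules describe.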

Embodiment 3

[0064] The present disclosure provides an electronic device comprising a memory, a processor, and computer instructions stored in the memory and run on the processor; when the computer instructions are executed by the processor, the steps of the cache memory replacement method described above are performed.



Abstract

The invention discloses a cache memory replacement method and system based on use popularity. The method comprises the steps of: dividing a Cache into n Cache blocks and forming a popularity comparison group from the n Cache blocks and their corresponding popularity values; judging, according to a received CPU data read request, whether the CPU data to be read exists in the Cache; in a hit case, searching for the hit Cache block and its corresponding popularity value, increasing that popularity value according to a preset popularity enhancement factor, attenuating the popularity values of the other un-hit Cache blocks according to a popularity attenuation factor, and reading the CPU data from the hit Cache block; and in a miss case, searching for the Cache block whose popularity value is minimal and less than or equal to the replacement threshold, replacing the CPU data to be read into that Cache block, and attenuating the popularity values of the remaining un-hit Cache blocks according to the popularity attenuation factor. The access frequency of the code in the Cache is counted, and replacement is carried out after the access popularity has decreased; through the popularity enhancement factor and popularity attenuation factor parameters, the method adapts to different execution codes while keeping a high hit rate.

Description

Technical Field

[0001] The present disclosure relates to the technical field of data storage and reading, and in particular to a cache memory replacement method and system based on use popularity.

Background

[0002] The statements in this section merely provide background information related to the present disclosure and do not necessarily constitute prior art.

[0003] Cache memory (Cache) is a storage system located between the CPU (Central Processing Unit) and DRAM (Dynamic Random Access Memory) or flash memory. Generally, a Cache has a smaller capacity and a faster speed than DRAM or flash. Because the CPU is much faster than main memory, the CPU has to wait for a certain period of time when it accesses data directly from memory, whereas the Cache can hold a part of the data that the CPU has just used or recycled. If the CPU needs that data again, it can be fetched directly from the Cache, avoiding repeated memory accesses and reducing the CPU's waiting time.


Application Information

IPC(8): G06F12/0808
CPC: G06F12/0808
Inventors: 刘超, 张洪柳, 于秀龙
Owner: TIH MICROELECTRONIC TECH CO LTD