
Cache data heat management method and system based on probability-based access frequency count

A technology relating to access-frequency counting and cached data, applied in the fields of electrical digital data processing, memory systems, and computing. It addresses the problem of surging storage demand, has wide application prospects, and saves memory occupation and CPU computation.

Active Publication Date: 2022-07-08
INSPUR SUZHOU INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

[0013] Aiming at the defect that the above-mentioned prior-art cache scheduling methods cannot cope with the current situation of limited storage resources and surging storage demand, the present invention provides a cache data heat management method and system based on probabilistic counting of access frequency to solve the above technical problems.



Examples


Embodiment 1

[0075] As shown in Figure 1, the present invention provides a cache data heat management method that counts access frequency based on probability, comprising the following steps:

[0076] S1. Store the data in the cache in units of data blocks, and set up a heat-statistics data structure in each data block; the heat-statistics data structure includes an access-time record bit and an access-frequency record bit;

[0077] S2. When a data block in the cache is accessed, record the current access time of the data block, count the block's access frequency based on probability, and update the access frequency and access time of the data block in the heat-statistics data structure;

[0078] S3. When there is a new cache request, determine whether the remaining cache space has reached the set threshold; when it has, randomly select a set number of data blocks from the cache, and then select a set number of data block...
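The heat-statistics data structure of step S1 can be pictured as a single packed word carrying the two record fields alongside each data block. The sketch below is a minimal illustration, assuming the 24-bit access-time / 8-bit access-frequency split given in Embodiment 2; the function names and the idea of a low-resolution timestamp are assumptions for illustration, not taken from the patent text.

```python
# Bit layout of a packed 32-bit heat-statistics word, assuming the
# 24-bit access-time / 8-bit access-frequency split of Embodiment 2.
TIME_BITS = 24
FREQ_BITS = 8
TIME_MASK = (1 << TIME_BITS) - 1
FREQ_MASK = (1 << FREQ_BITS) - 1

def pack_heat(access_time: int, freq: int) -> int:
    """Pack a low-resolution access-time stamp (e.g. minutes) and the
    access-frequency counter into one word stored with the data block."""
    return ((access_time & TIME_MASK) << FREQ_BITS) | (freq & FREQ_MASK)

def unpack_heat(word: int) -> tuple[int, int]:
    """Recover (access_time, freq) from a packed heat-statistics word."""
    return (word >> FREQ_BITS) & TIME_MASK, word & FREQ_MASK
```

Packing both fields into one word keeps the per-block metadata overhead to a few bytes, which is the "saving occupation" benefit the summary mentions.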

Embodiment 2

[0080] As shown in Figure 2, the present invention provides a cache data heat management method that counts access frequency based on probability, comprising the following steps:

[0081] S1. Store the data in the cache in units of data blocks, and set up a heat-statistics data structure in each data block; the heat-statistics data structure includes an access-time record bit and an access-frequency record bit;

[0082] Set the access-time record bit in the heat-statistics structure to M bits and the access-frequency record bit to N bits; here, the access-time record bit is 24 bits and the access-frequency record bit is 8 bits;

[0083] S2. When a data block in the cache is accessed, record the current access time of the data block, count the block's access frequency based on probability, and update the access frequency and access time of the data block in the heat-statistics data structure; the specific steps are as follows:...
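The excerpt does not spell out the probability function used in step S2, so the sketch below illustrates the general technique with a Morris/Redis-LFU-style logarithmic counter: the larger the current count, the smaller the chance of incrementing, which is what lets an 8-bit access-frequency record bit track a much larger true access count. `LOG_FACTOR` and the exact formula are assumptions, not from the patent.

```python
import random

FREQ_MAX = 255          # ceiling of the 8-bit access-frequency record bit
LOG_FACTOR = 10         # tuning constant; an assumption, not from the patent

def probabilistic_increment(counter: int) -> int:
    """Increment the counter with probability 1/(counter*LOG_FACTOR + 1),
    so the stored value grows roughly logarithmically in the access count."""
    if counter >= FREQ_MAX:
        return counter
    if random.random() < 1.0 / (counter * LOG_FACTOR + 1):
        counter += 1
    return counter

# 100,000 accesses to one block: the counter grows only logarithmically
# and stays comfortably inside the 8-bit record bit.
c = 0
for _ in range(100_000):
    c = probabilistic_increment(c)
```

Because each access costs at most one random draw and one compare, the counting adds almost no CPU work on the hot path.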

Embodiment 3

[0107] As shown in Figure 3, the present invention provides a cache data heat management system that counts access frequency based on probability, comprising:

[0108] The heat-statistics-structure setting module 1, used to store the data in the cache in units of data blocks and to set up the heat-statistics data structure in each data block, the structure including the access-time record bit and the access-frequency record bit; in the heat-statistics structure, the access-time record bit is set to M bits and the access-frequency record bit to N bits;

[0109] The access-frequency probability statistics module 2, used to record the access time of a data block when it is accessed, to count the block's access frequency based on probability, and to update the access frequency and access time of the data block in the heat-statistics data structure; the access-frequency probability statistics module 2 includes:

[01...



Abstract

The present invention provides a cache data heat management method and system that count access frequency based on probability. The method: S1. Store data in the cache in units of data blocks and set up a heat-statistics data structure in each data block, the structure including access-time record bits and access-frequency record bits. S2. When a data block in the cache is accessed, record its current access time, count its access frequency based on probability, and update the heat-statistics data structure. S3. When there is a cache request, determine whether the remaining cache space has reached the set threshold; when it has, randomly select a set number of data blocks from the cache, perform lazy attenuation on the selected blocks according to their respective access frequencies and access times, and delete data blocks from the selected ones according to the lazy-attenuation results until the remaining cache space is greater than the set threshold.
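Step S3's combination of random sampling and lazy attenuation might look like the following sketch. The sample size, the halve-per-elapsed-period decay rule, and all names are assumptions for illustration; the patent text states only that decay is computed from each sampled block's recorded access frequency and access time.

```python
import random

SAMPLE_SIZE = 5         # number of blocks sampled per eviction; assumed
DECAY_PERIOD = 60.0     # seconds per halving of the counter; assumed

def decayed_freq(freq: int, last_access: float, now: float) -> int:
    """Lazy attenuation: halve the 8-bit counter once per elapsed period,
    computed only when the block is examined, never by a background timer."""
    periods = int((now - last_access) // DECAY_PERIOD)
    return freq >> min(periods, 8)

def evict_one(cache: dict, now: float) -> str:
    """Randomly sample blocks and delete the one with the lowest decayed heat."""
    sample = random.sample(list(cache), min(SAMPLE_SIZE, len(cache)))
    victim = min(sample, key=lambda k: decayed_freq(*cache[k], now))
    del cache[victim]
    return victim

# cache maps block-id -> (access_frequency, last_access_time)
cache = {"a": (200, 0.0), "b": (3, 90.0), "c": (50, 100.0)}
evict_one(cache, now=120.0)   # "a" decays 200 -> 50; "b" is coldest
```

Because attenuation is evaluated lazily on sampled blocks only, no periodic scan of the whole cache is needed, which accounts for the CPU savings the summary claims.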

Description

technical field

[0001] The invention belongs to the technical field of storage cache management, and in particular relates to a cache data heat management method and system for counting access frequency based on probability.

Background technique

[0002] FIFO is the abbreviation of First In, First Out: the page that entered the cache earliest is eliminated first, and newly entered pages are eliminated last, exactly matching queue behavior.

[0003] LRU is the abbreviation of Least Recently Used: the page that has not been used for the longest time is eliminated.

[0004] LFU is the abbreviation of Least Frequently Used: the page that is used least often is eliminated.

[0005] Driven by the two waves of the Internet and the mobile Internet, storage technology has developed rapidly. Mobile Internet users have increased tenfold in the past ten years, and this user growth has led to exponential growth in the amount of data. Beca...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F12/0877; G06F12/121
CPC: G06F12/0877; G06F12/121; Y02D10/00
Inventors: 于猛, 孟祥瑞
Owner INSPUR SUZHOU INTELLIGENT TECH CO LTD