
Cache elimination method and system

A technology for caching and cached data, applied in the field of data processing to achieve the effect of improving elimination accuracy

Active Publication Date: 2022-08-02
SHANGHAI BILIBILI TECH CO LTD
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] The main purpose of this application is to propose a cache elimination method, system, electronic device, and computer-readable storage medium, aiming to solve the problem of how to effectively eliminate cached data without requiring a large number of data accesses.

Method used


Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0049] As shown in Figure 2, which is a flowchart of the cache elimination method proposed in the first embodiment of the present application. It should be understood that the flowchart in this method embodiment does not limit the order in which the steps are executed; steps may be added to or deleted from the flowchart as needed.

[0050] The method includes the following steps:

[0051] S200, set a first queue and a second queue.

[0052] In this embodiment, the first queue is used to maintain cached data and the unique code (Key) corresponding to each data item, and the second queue is used to maintain each code and the queries-per-second rate of the corresponding data item. The cached data is held locally or within the service (the first queue), so that requests can be answered quickly.

[0053] As shown in Figure 3, which is a schematic diagram of the first queue and the second queue. The first queue and the second queue are both LRU queues. When inserting (Insert), data will be pl...
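The two-queue layout described in paragraphs [0052]-[0053] can be sketched as follows. This is a minimal illustration, not the patented implementation: the class name `TwoQueueCache`, the fixed `capacity`, and the way the QPS value is supplied on insert are all assumptions made for the example. Both queues are kept in LRU order (head = most recent, tail = eviction candidate).

```python
from collections import OrderedDict

class TwoQueueCache:
    """Illustrative sketch of the two LRU queues described above:
    a first queue mapping Key -> cached data, and a second queue
    mapping Key -> a queries-per-second estimate for that data.
    """

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.first = OrderedDict()   # Key -> cached data (LRU order)
        self.second = OrderedDict()  # Key -> QPS estimate (LRU order)

    def insert(self, key, data, qps=1.0):
        # Insert places the entry at the head of each LRU queue.
        self.first[key] = data
        self.first.move_to_end(key, last=False)
        self.second[key] = qps
        self.second.move_to_end(key, last=False)
        # Eliminate cold data from the tail of the first queue
        # when capacity is exceeded, keeping both queues in sync.
        while len(self.first) > self.capacity:
            cold_key, _ = self.first.popitem(last=True)
            self.second.pop(cold_key, None)
```

With `capacity=2`, inserting keys "a", "b", "c" in order leaves "c" and "b" in the cache and evicts "a" from the tail.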

Embodiment 2

[0072] As shown in Figure 5, which is a schematic diagram of the hardware architecture of an electronic device 2 provided by the third embodiment of the present application. In this embodiment, the electronic device 2 may include, but is not limited to, a memory 21, a processor 22, and a network interface 23 that can be communicatively connected to each other through a system bus. It should be pointed out that Figure 5 shows the electronic device 2 only with components 21-23, but it should be understood that implementing all of the shown components is not required; more or fewer components may be implemented instead.

[0073] The memory 21 includes at least one type of readable storage medium, and the readable storage medium includes flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), Electrically Erasable Programmable Read Only M...

Embodiment 3

[0077] As shown in Figure 6, which is a schematic block diagram of a cache elimination system 60 provided by the third embodiment of the present application. The cache elimination system 60 may be divided into one or more program modules, which are stored in a storage medium and executed by one or more processors to implement the embodiments of the present application. A program module in the embodiments of the present application refers to a series of computer program instruction segments capable of accomplishing a specific function. The following description introduces the function of each program module in this embodiment.

[0078] In this embodiment, the cache elimination system 60 includes:

[0079] The setting module 600 is used to set the first queue and the second queue.

[0080] In this embodiment, the first queue is used to maintain the cached data and the unique code corresponding to each data item, a...


PUM

No PUM

Abstract

The present application discloses a cache elimination method, which includes: setting a first queue and a second queue, where the first queue is used to maintain cached data and the unique code corresponding to each data item, and the second queue is used to maintain each code and the queries-per-second rate of the corresponding data item; receiving the code of the data to be queried; initiating a query operation on the first queue according to the code; and, if the code is not found, updating the first queue and the second queue and eliminating the cold data in the first queue. The present application also discloses a cache elimination system, an electronic device, and a computer-readable storage medium. In this way, hot, warm, and cold data sets in the queue can be effectively distinguished, the accuracy of elimination can be improved, and cached data with historical access records can be eliminated without requiring a large number of data accesses.
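The query-and-eliminate flow summarized in the abstract can be sketched as a single lookup routine. This is a hedged illustration only: the function name `lookup`, the `load_data` callback, and the simple hit counter standing in for a true queries-per-second rate are assumptions, not details from the patent.

```python
from collections import OrderedDict

def lookup(first, second, key, load_data, capacity):
    """Sketch of the abstract's flow: query the first queue by code (key);
    on a miss, update both queues and eliminate cold data from the tail.
    `first` maps key -> data, `second` maps key -> access counter."""
    if key in first:
        first.move_to_end(key, last=False)    # hit: promote to the head
        second[key] = second.get(key, 0) + 1  # bump the access counter
        return first[key]
    # Miss: fetch the data, insert it at the head of the first queue,
    # record it in the second queue, then evict cold tail entries.
    data = load_data(key)
    first[key] = data
    first.move_to_end(key, last=False)
    second[key] = 1
    while len(first) > capacity:
        cold, _ = first.popitem(last=True)
        second.pop(cold, None)
    return data
```

Repeated lookups of the same key keep it at the head of the first queue while its counter grows, so rarely queried keys drift to the tail and are eliminated first.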

Description

Technical Field

[0001] The present application relates to the technical field of data processing, and in particular to a cache elimination method, system, electronic device, and computer-readable storage medium.

Background Technology

[0002] A cache elimination algorithm is an eviction algorithm that makes full use of cached data. To maximize the page hit rate, most operating systems widely use the LRU (Least Recently Used) elimination algorithm. The LRU algorithm maintains a queue: an Insert operation places data at the head of the queue, elimination starts from the tail of the queue, and a Key (the unique ID number of a data resource) hit by a Lookup operation is also updated by moving that Key to the head of the queue. The K in the LRU-K elimination algorithm represents a threshold on the number of recent uses; it is an elimination algorithm in which data only enters the LRU qu...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Patent (China)
IPC(8): G06F12/123
CPC: G06F12/123
Inventors: 蔡尚志, 王盛
Owner SHANGHAI BILIBILI TECH CO LTD