Dynamically adjusted cache data management and elimination method

A dynamically adjusted cache data management technology, applied in the computer field, that addresses problems such as cache pollution and the crowding out of hot data, achieving the effects of improving the cache hit rate, avoiding cache pollution, and optimizing performance.

Pending Publication Date: 2020-05-15
HANGZHOU DIANZI UNIV


Problems solved by technology

[0005] Both LFU and LRU suffer from cache pollution. LFU cannot handle cache items that were frequently used in the past but are now rarely accessed, so historical hot data pollutes current hot data. LRU is vulnerable to sporadic batch operations: a large amount of temporary data floods into the cache and crowds out hot data, causing the hot-data hit rate to drop sharply and severe cache pollution.
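The LRU weakness described above can be demonstrated with a minimal sketch (a plain illustrative example, not taken from the patent; the cache size and access pattern are invented):

```python
from collections import OrderedDict

class LRUCache:
    """Plain LRU: evict the least recently used key when full."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key in self.data:
            self.data.move_to_end(key)  # mark as most recently used
            return self.data[key]
        return None

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the LRU entry

cache = LRUCache(4)
for k in ("hot1", "hot2"):       # frequently used hot items
    cache.put(k, k.upper())
for k in range(100):             # sporadic batch scan of temporary data
    cache.put(f"tmp{k}", k)

# The one-off scan has crowded out the hot items:
print(cache.get("hot1"))  # → None
print(cache.get("hot2"))  # → None
```

A single sequential scan of 100 temporary keys is enough to evict every hot item, which is exactly the pollution scenario the patent's two-area design is meant to prevent.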




Embodiment Construction

[0016] To make the purpose, technical solutions, and beneficial effects of the present invention clearer, the invention is further described in detail below in conjunction with the accompanying drawings and embodiments.

[0017] As shown in Figure 1, the overall mechanism of the dynamically adjusted cache data management and elimination method is as follows: the memory space is divided into a hot data area and a cold data area, each organized and managed as a doubly linked list, referred to as linked list H and linked list C respectively. When a new cache item arrives, it is placed at the head of linked list C and its hitTimes attribute is set to 1. Linked list C performs cache elimination according to the LRU rule. Each time a cache item is hit, its hitTimes is incremented by 1; when the hitTimes of a cache item reaches the set value K, the item is deleted from linked list C and placed at the head of linked list H.
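The mechanism above can be sketched as follows. This is a simplified illustration under stated assumptions, not the patent's exact implementation: the class name `ColdHotCache` and the fixed cold-area capacity are invented, and Python's `OrderedDict` stands in for the doubly linked lists C and H.

```python
from collections import OrderedDict

class ColdHotCache:
    """Sketch of the two-area cache: cold list C (LRU) and hot list H.

    New entries go to the head of C with hitTimes = 1; each hit
    increments hitTimes, and once it reaches K the entry is promoted
    to the head of H. When C is full, its tail (LRU entry) is evicted.
    """
    def __init__(self, cold_capacity, K):
        self.cold_capacity = cold_capacity
        self.K = K
        self.cold = OrderedDict()  # linked list C: first item = head
        self.hot = OrderedDict()   # linked list H: first item = head
        self.hits = {}             # hitTimes per key

    def put(self, key, value):
        self.cold[key] = value
        self.cold.move_to_end(key, last=False)   # insert at head of C
        self.hits[key] = 1
        if len(self.cold) > self.cold_capacity:
            old, _ = self.cold.popitem()         # evict tail of C (LRU)
            del self.hits[old]

    def get(self, key):
        if key in self.hot:
            self.hits[key] += 1
            self.hot.move_to_end(key, last=False)    # refresh position in H
            return self.hot[key]
        if key not in self.cold:
            return None
        self.hits[key] += 1
        value = self.cold[key]
        if self.hits[key] >= self.K:                 # promote to head of H
            del self.cold[key]
            self.hot[key] = value
            self.hot.move_to_end(key, last=False)
        else:
            self.cold.move_to_end(key, last=False)   # refresh position in C
        return value

cache = ColdHotCache(cold_capacity=3, K=2)
cache.put("a", 1)     # enters head of C with hitTimes = 1
cache.get("a")        # second hit: hitTimes reaches K, promoted to H
```

Because only items that accumulate K hits ever enter H, a one-off batch scan churns through the cold area without displacing established hot data.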



Abstract

The invention discloses a dynamically adjusted cache data management and elimination method. The position of a cache item in memory is dynamically adjusted according to the item's access time and hit-frequency attributes. The memory is divided into a hot data area and a cold data area: cache items with high hit frequency and recent access times are kept at the front of the hot data area, while cache items with low hit frequency and old access times are kept at the tail of the cold data area. When the cache capacity reaches a threshold and data needs to be eliminated, the cache items at the tail of the cold data area are deleted directly. Accurate elimination of data is realized through dynamic adjustment of the cold and hot data areas, the proportion of hot data in the cache is increased, the cache pollution problem is relieved, and the cache hit rate is improved.

Description

Technical Field

[0001] The invention relates to the field of computer technology, and in particular to a method for managing and replacing cached data.

Background Technique

[0002] A software system performs a large number of data read operations. The data is stored in a database: the user sends a request and the server processes it, which requires three steps: establishing a database connection, executing the SQL command, and returning the query results. This brings additional resource consumption and increases system response time. In an actual business system there are many repeated read operations. If the data can be cached in the server's memory, the next user request can be served directly from memory, avoiding the overhead of connecting to the database and improving system response speed.

[0003] Caching is of great significance for system optimization, but the memory space of the server is limited, and too much memory us...
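The read-through caching pattern described in [0002] can be sketched as follows; `query_database` is a hypothetical stand-in for the connect/execute/return round trip, not an API from the patent:

```python
# Hypothetical stand-in for the three-step database round trip
# (establish connection, execute SQL, return results) described above.
def query_database(sql):
    print("executing:", sql)          # simulated expensive round trip
    return {"rows": [("alice",), ("bob",)]}

cache = {}

def cached_query(sql):
    """Read-through cache: serve repeated reads from memory."""
    if sql not in cache:              # miss: pay the database cost once
        cache[sql] = query_database(sql)
    return cache[sql]                 # hit: no connection overhead

cached_query("SELECT name FROM users")  # first call reaches the database
cached_query("SELECT name FROM users")  # repeat is served from memory
```

The unbounded dict here is exactly what motivates paragraph [0003]: server memory is finite, so such a cache needs the elimination policy the patent proposes.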


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F 12/123; G06F 12/122
CPC: G06F 12/123; G06F 12/122; G06F 2212/1021; G06F 2212/163
Inventors: 陈科明, 周雪梅, 张河东, 虞盛峥, 乔冠
Owner: HANGZHOU DIANZI UNIV