
Cache data processing method, server and configuration device

A cache-data processing technology, applied in the fields of electronic digital data processing, memory systems, and memory address allocation/relocation. It addresses the problems that the existing approach reduces server system performance, cannot meet the need to dynamically adjust the cache processing strategy, and cannot adjust the elimination algorithm in real time, achieving the effect of a dynamic cache processing strategy.

Active Publication Date: 2016-10-12
TENCENT TECH (SHENZHEN) CO LTD

Problems solved by technology

However, the existing LRU elimination algorithm is embedded in the code in a fixed form, so the cache elimination algorithm cannot be dynamically adjusted in real time; that is, the existing cache processing strategy applies the same fixed elimination strategy to all cached data. It therefore cannot meet the requirement of dynamically adjusting the cache processing strategy, which reduces the performance of the server system.
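The problem the patent targets can be illustrated with a small sketch: a cache whose elimination strategy is a replaceable function rather than a hard-coded LRU policy. This is a minimal illustration of the idea of a "dynamic cache processing strategy"; all class and function names are illustrative assumptions, not code from the patent.

```python
from typing import Callable, Dict

# Hypothetical eviction policies: each picks a victim key from the
# per-key usage counts. A fixed LRU would be wired into the cache;
# here the policy is a plain function that can be swapped at runtime.
def evict_least_frequent(usage: Dict[str, int]) -> str:
    return min(usage, key=usage.get)

def evict_most_frequent(usage: Dict[str, int]) -> str:
    return max(usage, key=usage.get)

class DynamicCache:
    def __init__(self, capacity: int,
                 strategy: Callable[[Dict[str, int]], str]):
        self.capacity = capacity
        self.strategy = strategy          # replaceable elimination policy
        self.data: Dict[str, object] = {}
        self.usage: Dict[str, int] = {}   # usage frequency per key

    def set_strategy(self, strategy: Callable[[Dict[str, int]], str]):
        # Dynamic adjustment: change the elimination policy at runtime
        # without redeploying code.
        self.strategy = strategy

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            victim = self.strategy(self.usage)   # policy chooses the victim
            del self.data[victim]
            del self.usage[victim]
        self.data[key] = value
        self.usage.setdefault(key, 0)

    def get(self, key):
        if key in self.data:
            self.usage[key] += 1
        return self.data.get(key)

cache = DynamicCache(2, evict_least_frequent)
cache.put("x", 1)
cache.put("y", 2)
cache.get("x")            # x now has higher usage than y
cache.put("z", 3)         # evicts y, the least frequently used key
```

The design point is that the eviction decision is data (a function value), so the server can choose a different policy per data class or per workload, which a compiled-in LRU cannot do.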



Examples


Embodiment 1

[0047] Figure 1 is a schematic flow chart of the cache data processing method in an embodiment of the present invention; as shown in Figure 1, the method includes:

[0048] Step 101: the server acquires usage characteristic information of historical cache data;

[0049] In this embodiment, the usage feature information includes at least the usage frequency of the historical cache data; it may further include a priority set for the historical cache data. Of course, those skilled in the art will appreciate that, in practical applications, the usage feature information can be set as needed.

[0050] In practical applications, the step of obtaining the usage characteristic information of the historical cache data may specifically be:

[0051] The server receives the service request sent by the terminal, counts the calling feature information of the historical cache data called by...
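The counting step described above — the server receiving service requests and tallying how often each piece of historical cache data is called — might look like the following minimal sketch. The class and method names are hypothetical; the patent does not specify data structures.

```python
from collections import Counter

class UsageStats:
    """Sketch of the statistics step in Embodiment 1: the server counts
    how often each historical cache key is called by incoming service
    requests, and can report the usage feature information for a key."""
    def __init__(self):
        self.frequency = Counter()   # usage frequency per cache key
        self.priority = {}           # optional priority set per key

    def record_request(self, key):
        # Invoked for every service request that touches cached data.
        self.frequency[key] += 1

    def usage_feature_info(self, key):
        # Usage feature information: at least the usage frequency,
        # optionally a priority set for the historical cache data.
        return {
            "frequency": self.frequency[key],
            "priority": self.priority.get(key, 0),
        }

stats = UsageStats()
for key in ["user:42", "user:42", "item:7"]:
    stats.record_request(key)
print(stats.usage_feature_info("user:42"))  # {'frequency': 2, 'priority': 0}
```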

Embodiment 2

[0070] Based on the method described in Embodiment 1, in this embodiment, to avoid the server's determination of the target cache processing strategy imposing an excessive load on itself, the server may formulate a cache processing strategy for only part of the cached data. Specifically, as shown in Figure 4, the server classifies the historical cache data into first-type cache data and second-type cache data based on the usage frequency in the usage feature information of the historical cache data, where the usage frequency of the first-type cache data is higher than that of the second-type cache data; for example, the usage frequency of the first-type cache data is higher than a preset frequency, and the usage frequency of the second-type cache data is lower than the preset frequency. Furthermore, the server generates a target cache processing ...
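The classification by a preset frequency threshold can be sketched as follows. The treatment of keys exactly at the threshold is an assumption on my part, since the text only says "higher" and "lower" than the preset frequency; the function name is hypothetical.

```python
def classify_cache_data(freq_by_key, preset_frequency):
    """Split historical cache data into first-type (usage frequency
    above the preset frequency) and second-type (at or below it),
    as described in Embodiment 2. Keys at exactly the threshold are
    placed in the second type here; the patent leaves this open."""
    first_type = {k for k, f in freq_by_key.items() if f > preset_frequency}
    second_type = {k for k, f in freq_by_key.items() if f <= preset_frequency}
    return first_type, second_type

hot, cold = classify_cache_data({"a": 10, "b": 1, "c": 5}, preset_frequency=4)
# hot == {"a", "c"}, cold == {"b"}
```

The server would then spend effort determining a target cache processing strategy only for one class (e.g. the hot set), which is the load-reduction idea of this embodiment.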

Embodiment 3

[0076] Figure 5 is the third schematic flow chart of the cache data processing method in an embodiment of the present invention; as shown in Figure 5, the method includes:

[0077] Step 501: the server acquires usage characteristic information of historical cache data;

[0078] In this embodiment, the usage feature information includes at least the usage frequency of the historical cache data; it may further include a priority set for the historical cache data. Of course, those skilled in the art will appreciate that, in practical applications, the usage feature information can be set as needed.

[0079] In practical applications, the step of obtaining the usage characteristic information of the historical cache data may specifically be:

[0080] The server receives the service request sent by the terminal, counts the calling feature information of the historical cache data called by the...



Abstract

The embodiment of the invention discloses a cache data processing method, comprising: a server acquires usage feature information of historical cache data, the usage feature information including at least the usage frequency of the historical cache data; the server determines a target cache processing strategy for the historical cache data according to the usage feature information, the target cache processing strategy being used at least to update at least part of the cache data in the historical cache data; and the historical cache data is processed at least according to the target cache processing strategy, thereby updating the historical cache data corresponding to the server. The embodiment of the invention further discloses the server and configuration equipment.

Description

Technical Field

[0001] The invention relates to cache technology, and in particular to a cache data processing method, server, and configuration device.

Background Technique

[0002] At present, in Internet application scenarios with many reads and few writes, a cache combined with back-end services is often used to improve the response performance and throughput of the server system; at the same time, the cache also protects the back-end services to a certain extent. However, limited by the bottleneck of memory resources, a local cache often uses the Least Recently Used (LRU) elimination algorithm to ensure that the volume of cached data does not grow without bound. Moreover, the LRU elimination algorithm ensures that the most frequently accessed requests are kept in memory as much as possible. However, the existing LRU elimination algorithm exists in the code in a fixed form, and it is impossible to dynam...
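The fixed LRU elimination the background section refers to is the standard textbook algorithm; a minimal sketch (not the patent's code) using Python's `OrderedDict`:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal capacity-bounded LRU cache: every access moves a key to
    the most-recently-used position; inserting past capacity evicts
    the least recently used entry. Standard textbook LRU."""
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key, default=None):
        if key not in self._data:
            return default
        self._data.move_to_end(key)       # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")        # "a" is now most recently used
cache.put("c", 3)     # evicts "b"
```

Note the policy here is baked into `put`; the patent's complaint is precisely that such a policy "exists in the code in a fixed form" and applies identically to all cached data.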

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08
CPC: G06F12/0871
Inventor: 王佳
Owner: TENCENT TECH (SHENZHEN) CO LTD