
Dynamic cache processing method and device, storage medium and electronic equipment

A dynamic cache processing technology applied in the field of data processing, which addresses the problem of reduced data processing efficiency and achieves the effects of improving user experience, saving cache resources, and improving the accuracy of cache clearing.

Pending Publication Date: 2020-01-14
CHINA PING AN PROPERTY INSURANCE CO LTD

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present disclosure is to provide a dynamic cache processing method, device, storage medium and electronic equipment that, at least to a certain extent, overcome the problem of decreased data processing efficiency caused by cache resources being occupied by useless data.



Examples


Example 1

[0066] Example 1: The writing time of data A is 9:21:05 and the current time is 12:21:05, so the time difference Δt between the writing time of data A and the current time is 3 hours. The parameter a1 is configured as 10 and the parameter b1 as 1; the first score s1 calculated by Formula 2 is 2.5.
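Formula 2 is not reproduced in this extract, but the numbers above (a1 = 10, b1 = 1, Δt = 3 hours, s1 = 2.5) are consistent with a decay of the form s1 = a1 / (Δt + b1), since 10 / (3 + 1) = 2.5. The following Python sketch assumes that form; the function name, the date used in the demo call and the formula itself are inferences for illustration, not the patent's verbatim definition.

    from datetime import datetime

    def first_score(write_time: datetime, now: datetime, a1: float = 10.0, b1: float = 1.0) -> float:
        # First score from write time; assumed form of Formula 2: s1 = a1 / (dt_hours + b1).
        dt_hours = (now - write_time).total_seconds() / 3600.0
        return a1 / (dt_hours + b1)

    # Reproduces Example 1 (the date is chosen arbitrarily; only the clock times are given):
    # data A written at 9:21:05, current time 12:21:05 -> dt = 3 h, s1 = 2.5
    print(first_score(datetime(2019, 5, 15, 9, 21, 5), datetime(2019, 5, 15, 12, 21, 5)))  # 2.5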

[0067] In another exemplary implementation, in step S120, determining the first score of the data based on the writing time of the data may include steps S310-S330. Referring to Figure 3:

[0068] In step S310, calculate the time difference between the writing time of the data and the current time;

[0069] In step S320, the correspondence between time differences (between the writing time of the data and the current time) and first scores is stored in advance in a first score relationship table;

[0070] In step S330, the first score relationship table is queried to determine the first score corresponding to the time difference between the writing time and the current time.

Example 2

[0074] Example 2: The current time is 13:30:21 on May 15, 2019. Data B was written to the cache at 13:21:21 on May 15, 2019, so the time difference between the writing time of data B and the current time is 9 minutes, and the first score of data B obtained by querying the first score relationship table (see below) is 40. Data C was written to the cache at 13:30:21 on February 15, 2019, so the time difference between the writing time of data C and the current time is 3 months, and by querying the first score relationship table (see below) the first score of data C is obtained as 0.5.

[0075] The first score relationship table is as follows:

[0076] (first score relationship table not reproduced in this extract)
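Since the relationship table itself is not reproduced above, the sketch below assumes a small illustrative table whose only entries known from the text are the two points in Example 2 (about 9 minutes maps to 40, about 3 months maps to 0.5); the other thresholds and scores, like the names, are hypothetical placeholders.

    from datetime import datetime, timedelta

    # Pre-stored first score relationship table (step S320): pairs of
    # (upper bound of time difference, first score), ordered by time difference.
    # Only the two entries marked "Example 2" come from the text; the rest are hypothetical.
    FIRST_SCORE_TABLE = [
        (timedelta(minutes=10), 40.0),  # data B: 9 min      -> 40  (Example 2)
        (timedelta(hours=1),    20.0),  # hypothetical
        (timedelta(days=1),     10.0),  # hypothetical
        (timedelta(days=30),     2.0),  # hypothetical
        (timedelta(days=92),     0.5),  # data C: ~3 months  -> 0.5 (Example 2)
    ]

    def first_score_from_table(write_time: datetime, now: datetime) -> float:
        dt = now - write_time                          # step S310: compute the time difference
        for upper_bound, score in FIRST_SCORE_TABLE:   # step S330: query the pre-stored table
            if dt <= upper_bound:
                return score
        return 0.0                                     # older than the last bucket: hypothetical floor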

[0077] In step S122, a second score of the data is determined based on the writer information of the data. A survey of how a large number of users use data shows that data written by certain users is used more frequently, so data written by such a user is of high importance; conversely, data written by other users is used less frequently and is of lower importance...

Example 3

[0086] Example 3: In a company, the relevant data released by the director will be accessed by most managers and employees of all departments; the number of visits is high and its importance is high. The relevant data released by a manager will be accessed by the personnel of that department; the number of visits is relatively high and its importance is relatively high. The relevant data recorded by an employee will be accessed by the employee or their manager and rarely by other employees; the number of accesses is small and its importance is low. Therefore, the director can be assigned the third level, the manager the second level, and the employee the first level. Assume that the value of the parameter R is configured as 3 and a2 as 0.5. Calculated by Formula 3, the second score s2 of the relevant data written by the director is 13.5, and the second score of the relevant data written by the manager s...
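Formula 3 is likewise not reproduced, but the figures in Example 3 (R = 3, a2 = 0.5, director at level 3 giving s2 = 13.5) are consistent with s2 = a2 · R^level, since 0.5 × 3³ = 13.5. The sketch below assumes that form; the formula and the manager/employee values it prints are inferences, not figures stated in the extract.

    # Second score from writer level; assumed form of Formula 3: s2 = a2 * R ** level.
    def second_score(writer_level: int, R: float = 3.0, a2: float = 0.5) -> float:
        return a2 * R ** writer_level

    print(second_score(3))  # director, level 3 -> 13.5 (matches Example 3)
    print(second_score(2))  # manager,  level 2 ->  4.5 (under the assumed formula)
    print(second_score(1))  # employee, level 1 ->  1.5 (under the assumed formula)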



Abstract

The invention discloses a dynamic cache processing method and device, a storage medium and electronic equipment, and belongs to the technical field of information processing. The method comprises the steps of: obtaining the writing time, writer information and historical access information of data in a cache when the data is written into the cache; determining a first score of the data based on the writing time of the data, a second score of the data based on the writer information of the data, and a third score of the data based on the historical access information of the data; determining an importance score of the data based on the first score, the second score and the third score of the data; and deleting from the cache, according to the importance score of the data, the data whose importance score is lower than a preset threshold. By deleting data of low importance in the cache in a targeted manner, the method improves the accuracy of clearing data in the cache, improves data processing efficiency, and improves user experience.
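To tie the abstract together, here is a minimal end-to-end sketch of the described flow, reusing the first_score and second_score sketches above and assuming a simple linear third score, a plain sum for the importance score, and an arbitrary threshold; none of those choices, nor any of the names, are given in the extract.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class CacheEntry:
        value: object
        write_time: datetime      # recorded when the data is written into the cache
        writer_level: int         # writer information, e.g. 1 = employee, 2 = manager, 3 = director
        access_count: int = 0     # historical access information

    def third_score(access_count: int, a3: float = 1.0) -> float:
        # Third score from historical access information (assumed linear form).
        return a3 * access_count

    def importance_score(entry: CacheEntry, now: datetime) -> float:
        # Combine the three scores; a plain sum is an assumption, the extract only says
        # the importance score is determined from the first, second and third scores.
        return (first_score(entry.write_time, now)
                + second_score(entry.writer_level)
                + third_score(entry.access_count))

    def clean_cache(cache: dict, now: datetime, threshold: float = 5.0) -> None:
        # Delete entries whose importance score is below the preset threshold.
        for key in [k for k, e in cache.items() if importance_score(e, now) < threshold]:
            del cache[key]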

Description

Technical Field

[0001] The present disclosure relates to the technical field of data processing, and in particular to a dynamic cache processing method, device, storage medium, and electronic equipment.

Background Technique

[0002] The cache is a buffer storage area used to speed up the processor's reading of data, and its capacity is very limited. With the development of computer technology, in particular the upgrading of system software, the growth of application software and its functions, and the enrichment of content, the amount of data that processors need to process keeps increasing, and the cache capacity cannot meet the demand of such large data exchange. In related technologies, the cache is cleared by deleting the first-written data, but this also deletes data that some users often use, so important data is lost while useless data still occupies the cache, reducing data processing efficiency and user experience...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/2455; G06Q10/06
CPC: G06F16/24552; G06Q10/06393; Y02D10/00
Inventor: 林勇
Owner: CHINA PING AN PROPERTY INSURANCE CO LTD