
A memory cache management method, system, storage medium and electronic device

A memory cache management technology in the field of computer data processing that addresses problems such as slow system response, wasted resources, and poor user experience, achieving the effects of reducing the frequency of memory reclamation and allocation, improving response time, and improving the user experience.

Active Publication Date: 2022-03-29
智器云南京信息科技有限公司

AI Technical Summary

Problems solved by technology

However, when resources are limited and multiple users operate on large amounts of data, memory cache resources become insufficient and the system continually reclaims and reallocates memory, resulting in slow system response and a poor user experience.
[0003] Since each user processes the same batch of data at a different frequency and time, a unified memory cache solution treats all data and usage indistinguishably and applies uniform memory management, which greatly wastes resources.




Embodiment Construction

[0048] In order to more clearly illustrate the embodiments of the present invention and the technical solutions in the prior art, specific implementations of the present invention are described below with reference to the accompanying drawings. Apparently, the accompanying drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings and other implementations based on them.

[0049] To keep the drawings concise, each drawing only schematically shows the parts related to the present invention and does not represent the actual structure of the product. In addition, to keep the drawings concise and easy to understand, where several components share the same structure or function, only one of them is schematically shown or labeled in some drawings. Herein, "a" means not only "exactly one" but also "more than one".

[0050] An embod...



Abstract

The present invention provides a memory cache management method, system, storage medium and electronic device. The method includes: when a data analysis request is received, analyzing the request to obtain a set of data feature values and a set of behavior feature values; querying the cache record; if there is no cache record and the requested data needs to be cached, querying the historical cache time; when a historical cache time exists, obtaining the historical cache life cycles; calculating the time interval between the current access time and the most recent access time; if every historical cache life cycle is less than or equal to the time interval, using the time interval as the new cache life cycle; and if at least one historical cache life cycle is greater than the time interval, calculating the new cache life cycle from the historical cache life cycles and the time interval. The present invention adopts a dynamically optimized memory cache management approach, thereby optimizing the resource allocation strategy, reducing the frequency of resource reclamation and allocation, improving system response time, and improving the user experience.
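The flow described in the abstract can be illustrated with a minimal Python sketch. This is not the patented implementation: the cache key below stands in for the combination of data and behavior feature-value sets derived from the request, the in-memory stores and DEFAULT_TTL fallback are assumptions, and the blend used when a historical life cycle exceeds the access interval is a placeholder, since the abstract does not disclose that formula.

```python
import time
from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class CacheHistory:
    last_access: float                                       # most recent access timestamp (seconds)
    life_cycles: List[float] = field(default_factory=list)   # historical cache life cycles (seconds)


# Hypothetical in-memory stores standing in for the patent's cache records and history table.
cache_records: Dict[str, Any] = {}
cache_histories: Dict[str, CacheHistory] = {}

DEFAULT_TTL = 300.0  # assumed fallback life cycle when no history exists (seconds)


def new_life_cycle(key: str, now: float) -> float:
    """Compute a new cache life cycle following the rule described in the abstract."""
    history = cache_histories.get(key)
    if history is None or not history.life_cycles:
        # No historical cache time recorded: fall back to a default TTL (assumed value).
        return DEFAULT_TTL
    # Time interval between this access and the most recent previous access.
    interval = now - history.last_access
    if all(lc <= interval for lc in history.life_cycles):
        # Every historical life cycle is <= the interval: adopt the interval itself.
        return interval
    # At least one historical life cycle exceeds the interval: combine both.
    # The abstract does not specify the formula; a simple blend is assumed here.
    return 0.5 * max(history.life_cycles) + 0.5 * interval


def handle_request(key: str, data: Any, needs_cache: bool = True) -> Any:
    """Serve a data analysis request, caching the result with a per-key life cycle."""
    now = time.time()
    if key in cache_records:
        return cache_records[key]          # cache hit: return the cached data
    if needs_cache:
        ttl = new_life_cycle(key, now)     # dynamically derived life cycle
        cache_records[key] = data          # a real system would also store and enforce the ttl
        hist = cache_histories.setdefault(key, CacheHistory(last_access=now))
        hist.life_cycles.append(ttl)
        hist.last_access = now
    return data
```

Under this rule, a key that is revisited at roughly steady intervals ends up with a life cycle close to that interval, so frequently revisited data is kept cached longer while rarely revisited data is assigned a shorter life cycle, reducing unnecessary reclamation and reallocation.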

Description

Technical field

[0001] The invention relates to the technical field of computer data processing, and in particular to a memory cache management method, system, storage medium and electronic device.

Background technique

[0002] A memory cache is a solution commonly used by computer software to improve response speed, and its management mechanism is generally fixed. However, when resources are limited and multiple users operate on large amounts of data, memory cache resources become insufficient and the system continually reclaims and reallocates memory, resulting in slow system response and a poor user experience.

[0003] Since each user processes the same batch of data at a different frequency and time, a unified memory cache solution treats all data and usage indistinguishably and applies uniform memory management, which greatly wastes resources.

Contents of the invention

[0004] The purpose of the present invention is to provide ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/02
CPC: G06F12/0253
Inventors: 王海波, 何涛
Owner: 智器云南京信息科技有限公司