Multi-entry data caching method and storage medium

A multi-entry data caching technology, applied in electrical digital data processing, special data processing applications, digital data information retrieval, etc. It achieves effects such as reduced code intrusion, improved efficiency, and reduced database access pressure.

Active Publication Date: 2020-05-01
FUJIAN YIRONG INFORMATION TECH +2

Problems solved by technology

At present, there are many cache systems, including memcached, redis, ehcache, etc. Access to and use of these various cache systems must be implemented separately according to their different requirements; without a unified method, it is difficult to quickly access and manage them.
[0003] At present, the widely used cache processing methods include directly calling the SDK of each cache system for direct cache operations, and using AOP for cache processing. Of these, the AOP-based method can reduce code intrusion, but it still cannot solve the problem of fast switching between various cache systems.
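To illustrate the AOP-style approach the paragraph above refers to, here is a minimal sketch in Python, where a decorator plays the role of the caching "aspect": the business function contains no cache code (low intrusion), while the wrapper handles lookup and write-back. All names here are illustrative and not from the patent.

```python
import functools

def cached(store: dict):
    """AOP-style caching aspect as a decorator (illustrative sketch).

    The decorated function stays free of cache logic; the wrapper
    checks the store first and writes results back on a miss.
    """
    def aspect(fn):
        @functools.wraps(fn)
        def wrapper(*args):
            key = (fn.__name__, args)      # simple key: function name + args
            if key in store:
                return store[key]          # cache hit
            result = fn(*args)             # cache miss: run the real call
            store[key] = result            # write back for next time
            return result
        return wrapper
    return aspect
```

Note that the decorator is still tied to one `store` backend, which mirrors the limitation the patent points out: AOP reduces intrusion but does not by itself let you switch cache systems quickly.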
[0004] Some existing cache service implementations divide the cache into a private cache and a public cache, set a prefix for each cache key, and provide a framework with a set of rules for managing cache keys to prevent key conflicts. This approach is mainly aimed at the permission-access problem between public and private caches; it solves neither fast switching between cache systems nor code intrusion. Moreover, because the original data is stored directly in the cache system, data anomalies can occur.



Embodiment Construction

[0020] In order to explain in detail the technical content, structural features, objectives, and effects of the technical solution, a detailed description is given below in conjunction with specific embodiments and the accompanying drawings.

[0021] Referring to figure 1, which is a flowchart of the scheme of the present invention, the specific working principle is as follows:

[0022] A multi-entry data caching method comprises the steps of intercepting an original cache call request and generating a key KEY according to the cache call request.

[0023] The generation rule of the key KEY is: entry platform code + namespace + class name + namespace version number + identification code.
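The key rule above could be sketched as a simple concatenation in the stated order. The separator and the example field values below are assumptions for illustration; the patent only specifies the components and their order.

```python
def build_cache_key(platform_code: str, namespace: str, class_name: str,
                    namespace_version: str, identification_code: str,
                    separator: str = ":") -> str:
    """Build a cache key per the rule:
    entry platform code + namespace + class name
    + namespace version number + identification code.

    The ':' separator is an assumption, not specified by the patent.
    """
    return separator.join([platform_code, namespace, class_name,
                           namespace_version, identification_code])

# Hypothetical example: a "web" entry platform requesting order record 12345
key = build_cache_key("web", "orders", "OrderService", "v2", "12345")
```

Including the namespace version number in the key means that bumping the version naturally invalidates all old entries without an explicit flush.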

[0024] The above key is used to call the data in the cache server. If the data is obtained, the data is returned; if the data is not hit, the original cache call request is used directly to obtain the data from the database, and the data together with the corresponding key KEY is stored in the cache server.
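The lookup flow in [0022]-[0024] is essentially the cache-aside pattern. A minimal sketch, assuming an in-memory dict stands in for the real cache server (e.g. redis) and a `loader` callable stands in for the original database call:

```python
from typing import Any, Callable, Dict

class CacheInterceptor:
    """Sketch of the described flow: intercept the original call, look the
    key up in the cache server, and on a miss fall through to the database
    and write the result back under the same KEY.

    The dict backend and the `loader` callback are illustrative stand-ins,
    not part of the patent's specification.
    """

    def __init__(self, loader: Callable[[str], Any]):
        self._cache: Dict[str, Any] = {}   # stand-in for the cache server
        self._loader = loader              # stand-in for the original DB call

    def get(self, key: str) -> Any:
        hit = self._cache.get(key)
        if hit is not None:                # cache hit: return cached data
            return hit
        data = self._loader(key)           # miss: use original request on the DB
        self._cache[key] = data            # store data under the same KEY
        return data
```

With this interceptor in front of the database, repeated calls for the same data from multiple entry platforms resolve against the cache after the first miss, which is the access-pressure reduction the patent claims.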


Abstract

The invention discloses a multi-entry data caching method and a storage medium. The method comprises the following steps: intercepting an original cache call request and generating a key KEY according to the cache call request, the generation rule of the key KEY being entry platform code + namespace + class name + namespace version number + identification code; using the key to call data in a cache server; if data is obtained, returning the data; and if data is not hit, directly using the original cache call request to obtain the data from the database, and storing the data and the corresponding key KEY in the cache server. Because the data call request is intercepted and an identification KEY with its corresponding data is created in the cache server, requests that call the same data from multiple entrances multiple times do not need to access the database each time, code intrusion into the database is reduced, and entry data calling efficiency is further improved.

Description

technical field
[0001] The invention relates to the field of data storage, in particular to a data caching method for accelerating multi-platform intercommunication.
Background technique
[0002] Caching technology is a common technique for speeding up data reading, and it is often used in various software systems to help them run faster. At present, there are many cache systems, including memcached, redis, ehcache, etc. Access to and use of these various cache systems must be implemented separately according to their different requirements; without a unified method, it is difficult to quickly access and manage them.
[0003] At present, the widely used cache processing methods include directly calling the SDK of each cache system for direct cache operations, and using AOP for cache processing. Of these, the AOP-based method can reduce code intrusion, but it still cannot solve the problem of fast switching between various cache systems.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/2455, G06F16/2453
CPC: G06F16/24552, G06F16/2453
Inventor: 郑耀松, 苏江文, 庄莉, 梁懿, 王秋琳
Owner: FUJIAN YIRONG INFORMATION TECH