
General caching method

A cache and cache-segment technology in the field of data processing. It addresses problems such as high system resource consumption, deviation between cached data and real data, and high cache-synchronization cost, and achieves high read performance, clear management, and high flexibility.

Inactive Publication Date: 2008-02-27
ZTE CORP
Cites: 0 · Cited by: 32

AI Technical Summary

Problems solved by technology

[0003] At present, most existing cache systems use the LRU (least recently used) algorithm. LRU is a relatively simple eviction policy, and it cannot truly reflect how the content in the cache is actually used.
[0004] Some existing cache systems, for the sake of simplicity, provide no cache-synchronization function at all, which leads to deviations between cached data and real data. Other cache systems do provide cache synchronization, but at too high a cost, because synchronization consumes a large amount of system resources.
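To make the LRU limitation concrete, here is a textbook LRU cache (not part of the patent; the class and method names are illustrative). Because eviction looks only at recency, an item that is used heavily overall can still be evicted after a single quiet period:

```python
from collections import OrderedDict

class LRUCache:
    """Textbook LRU cache, shown to illustrate the limitation cited in
    [0003]: eviction considers only recency, not overall usage."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # insertion order tracks recency

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)   # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        elif len(self.items) >= self.capacity:
            self.items.popitem(last=False)  # evict least recently used
        self.items[key] = value
```

With capacity 2, inserting `a` and `b`, touching `a`, then inserting `c` evicts `b` purely because it is the least *recent*, regardless of how often it was used before.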



Examples


Example 1

[0120] Case: the currently requested data is not in the specified cache segment, and that segment has enough free space to store the requested data.

[0121] When the system receives a new user request, it determines that the data is not in the specified cache segment. It then allocates a row-level lock for the requested data's key, obtains the requested data from the external data source, acquires the segment-level lock to add the content to the corresponding cache segment, releases the locks, and finally returns the content to the user.
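The flow above (row-level lock on the key, fetch from the external source, segment-level lock on insert) can be sketched as follows. This is a minimal illustration, not the patent's actual implementation; the class and the `fetch_external` callback are assumptions:

```python
import threading

class SegmentedCache:
    """Sketch of the segmented cache flow in [0121]: per-key ("row-level")
    locks guard fetching, per-segment locks guard insertion."""

    def __init__(self, segment_names, fetch_external):
        self.segments = {name: {} for name in segment_names}
        self.segment_locks = {name: threading.Lock() for name in segment_names}
        self.key_locks = {}                    # row-level locks, one per key
        self.key_locks_guard = threading.Lock()
        self.fetch_external = fetch_external   # external data source interface

    def _key_lock(self, key):
        with self.key_locks_guard:
            return self.key_locks.setdefault(key, threading.Lock())

    def get(self, segment, key):
        seg = self.segments[segment]
        if key in seg:                         # cache hit (Example 2 below)
            return seg[key]
        with self._key_lock((segment, key)):   # row-level lock on the key
            if key in seg:                     # re-check under the lock
                return seg[key]
            value = self.fetch_external(key)   # fetch from external source
            with self.segment_locks[segment]:  # segment-level lock to insert
                seg[key] = value
            return value
```

The row-level lock ensures concurrent requests for the same key trigger only one external fetch, while the segment-level lock serializes modifications to the segment itself.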

Example 2

[0123] When the requested data is already in the cache segment, the cache system fetches it and returns it to the user.

Example 3

[0125] When the requested data is not in the cache segment and the segment does not have enough remaining space, the cache system allocates a row-level lock for the requested data's key, obtains the requested data from the external data source through the external-data interface, frees enough space using the touch-count algorithm, adds the requested data to the corresponding cache segment, releases the locks, and finally returns the requested data to the user.
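The patent names a "touch count" algorithm for freeing space but this excerpt does not spell out its details. The sketch below assumes an LFU-style interpretation, where each access bumps a counter and the least-touched entry is evicted when the segment is full; the class and its policy are this editor's assumptions, not the patented algorithm:

```python
class TouchCountSegment:
    """Hedged sketch of a touch-count eviction policy: assumes each hit
    increments a counter and the least-touched entry is evicted first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}
        self.touches = {}   # touch count per key

    def get(self, key):
        if key in self.data:
            self.touches[key] += 1   # every hit bumps the touch count
            return self.data[key]
        return None

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            # evict the entry with the fewest touches to free space
            victim = min(self.touches, key=self.touches.get)
            del self.data[victim]
            del self.touches[victim]
        self.data[key] = value
        self.touches.setdefault(key, 0)
```

Unlike plain LRU, this keeps an entry that has been touched many times even if it has not been accessed very recently, which matches the motivation stated in [0003].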



Abstract

This invention provides a general caching method for fetching data stored in a cache. It includes: step 1, set up in the cache an external-data interface that connects to the external data source; step 2, divide the cache area into multiple cache segments, store each kind of data in only one cache segment, and address every cache segment by its name and the data's primary-key value; step 3, when the cache system receives a data-access request, locate the corresponding cache segment according to the requested segment name and primary-key value, then check whether that segment holds the requested data: if it does, output the requested data; if not, fetch the requested data from the external data source through the interface, store it in the cache segment, and then output it.
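The three steps of the abstract reduce to a simple lookup discipline, sketched below without the locking details of the examples. The function names are assumptions for illustration only:

```python
# Minimal sketch of the abstract's three steps (names are assumptions).

def make_cache(segment_names):
    # Step 2: one sub-dictionary ("segment") per kind of data
    return {name: {} for name in segment_names}

def get_data(cache, segment_name, primary_key, fetch_external):
    # Step 3: address the request by segment name + primary-key value
    segment = cache[segment_name]
    if primary_key in segment:
        return segment[primary_key]        # hit: output cached data
    value = fetch_external(primary_key)    # Step 1: external-data interface
    segment[primary_key] = value           # store in the segment, then output
    return value
```

Keeping each kind of data in exactly one named segment is what gives the "clear management" property claimed above: every request maps deterministically to one segment.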

Description

Technical field

[0001] The invention relates to a data processing method, in particular to a data caching method.

Background technique

[0002] In business software applications, the requirements on system processing performance keep rising, so frequent database operations and file reads and writes should be avoided. To solve the problem of inefficient data reading, caching technology must be considered. Caching stores certain data in memory; when the system needs that data, it can read it directly from memory, which reduces the time spent on database operations and file I/O. In a real environment, with a large amount of data, the system cannot cache all of it in memory, so a certain strategy is needed to evict cache items, and the cache needs to be updated when its items change.

[0003] At present, most of the existi...

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Application (China)
IPC(8): G06F12/08, G06F12/0868, G06F12/121
Inventors: 唐鲲鹏, 吕吉, 单良
Owner ZTE CORP