
Local data cache management method and device

A local data cache management technology, applied in electrical digital data processing, memory systems, and memory address allocation/relocation. It addresses the problem that the one-to-one storage correspondence between keywords and their values is inflexible, and achieves fast data access, high efficiency, and high cache performance.

Active Publication Date: 2014-03-26
TCL CORPORATION


Problems solved by technology

[0005] The embodiments of the present invention provide a local data cache management method and device, aiming to solve the problems that existing local cache management methods lack stability, performance, and extensibility, and that the one-to-one storage correspondence between keywords and their values is not flexible enough.



Examples


Embodiment 1

[0022] Figure 1 shows the implementation process of the local data cache management method provided by the first embodiment of the present invention; the details are as follows:

[0023] It should be noted that the present invention is especially suitable for scenarios that run on a single server where memory space is extremely limited.

[0024] In step S101, a contiguous large block of memory is allocated, the block is divided into several small units of equal size, and a pointer is made to point to the first free small unit.

[0025] In this embodiment, the large block of memory is a contiguous memory space whose size is set according to the server environment: the larger the server's memory, the larger the block. Preferably, the size of the block is in the range of 1–10 MB. The large block of memory is equivalent to a slab in Memcached. The small units are several equal-sized sub...
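The allocation step above (step S101), together with the fill-and-advance behaviour described later, can be simulated in a few lines of Python. This is a minimal illustrative sketch, not the patented implementation; the class name, sizes, and method names are all invented for illustration.

```python
class SlabCache:
    """Sketch of step S101: one contiguous block divided into equal-size
    units, with a pointer to the first free unit. store() fills the unit
    the pointer refers to and then advances the pointer."""

    def __init__(self, block_size=1024 * 1024, unit_size=256):
        assert block_size % unit_size == 0, "block must divide evenly into units"
        self.memory = bytearray(block_size)      # one contiguous large block
        self.unit_size = unit_size
        self.num_units = block_size // unit_size
        self.free_ptr = 0                        # index of the first free small unit

    def store(self, data: bytes) -> int:
        """Fill the free unit under the pointer; return its index as the 'address'."""
        if len(data) > self.unit_size:
            raise ValueError("data larger than one small unit")
        if self.free_ptr >= self.num_units:
            raise MemoryError("no free unit left (an eviction step would run here)")
        unit = self.free_ptr
        off = unit * self.unit_size
        self.memory[off:off + len(data)] = data
        self.free_ptr += 1                       # point to the next free small unit
        return unit

cache = SlabCache()                              # 1 MB block, 256-byte units
addr = cache.store(b"hello")
print(addr, cache.free_ptr)                      # 0 1
```

With a 1 MB block and 256-byte units this yields 4096 units, matching the document's idea that the block size (1–10 MB) is tuned to the server while the unit size stays fixed.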

Embodiment 2

[0041] Figure 3 shows the implementation process of the local data cache management method provided by the second embodiment of the present invention; the details are as follows:

[0042] In step S301, a contiguous large block of memory is allocated, the block is divided into several small units of equal size, and a pointer is made to point to the first free small unit.

[0043] In this embodiment, step S301 is executed similarly to step S101 in the first embodiment described above; for details, please refer to the description of that embodiment.

[0044] In step S302, the data is received or read through a variable-parameter function.

[0045] Specifically, since the allocation of the large block of memory is controllable and the boundaries of each small unit are clear, once the data type is known, its location and the size it occupies in the memory space are also known; according t...
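The idea that a known data type fixes both the location and the size of a value inside a fixed-size unit can be illustrated with Python's `struct` module. This is an assumption-laden sketch: `struct` format strings stand in for the patent's type information, and none of the names below come from the patent.

```python
import struct

UNIT_SIZE = 64                       # assumed fixed small-unit size
memory = bytearray(4 * UNIT_SIZE)    # four small units in one contiguous block

def write_value(unit_index, fmt, *values):
    """Variable-parameter write: the format string determines the layout."""
    assert struct.calcsize(fmt) <= UNIT_SIZE, "value must fit in one unit"
    struct.pack_into(fmt, memory, unit_index * UNIT_SIZE, *values)

def read_value(unit_index, fmt):
    """Knowing the type (fmt), the byte offset and size follow directly:
    offset = unit_index * UNIT_SIZE, size = struct.calcsize(fmt)."""
    return struct.unpack_from(fmt, memory, unit_index * UNIT_SIZE)

write_value(2, "<ih", 1234, 56)      # an int and a short stored in unit 2
print(read_value(2, "<ih"))          # (1234, 56)
```

Because unit boundaries are fixed, no per-item size header is needed; the type alone tells the reader where the value ends.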

Embodiment 3

[0054] Figure 4 shows a specific structural block diagram of the local data cache management apparatus provided in Embodiment 3 of the present invention. For convenience of description, only the parts related to the embodiment of the present invention are shown. In this embodiment, the local data cache management apparatus includes: an allocation unit 41, a storage unit 42, an index unit 43, an elimination unit 44, a space calculation unit 45, a judgment unit 46, a reading unit 47, and an interface unit 48.

[0055] The allocation unit 41 is used to allocate a contiguous large block of memory, divide the block into several small units of equal size, and make a pointer point to the first free small unit;

[0056] The storage unit 42 is used, when there is data smaller than the small-unit size to be cached, to fill the data into the free small unit pointed to by the pointer, and then point the p...


Abstract

The invention belongs to the technical field of computer cache management and provides a local data cache management method and device. The method comprises: allocating a contiguous large block of memory, dividing the block into a plurality of small units of equal size, and making a pointer point to the first free small unit; when data smaller than the small-unit size needs to be cached, filling the free small unit pointed to by the pointer with the data, and then pointing the pointer to the next free small unit; and obtaining an index from the keyword in the data and the keyword's length, adding the data's address into a singly linked list according to the index, and inserting the head pointer of the singly linked list into a hash table. The method and device allocate contiguous memory in the manner of Memcached, use fixed-size small units as the smallest cache units, and use a hash table to store the linked lists of data addresses, so that data is stored and retrieved conveniently, flexibly, and quickly, and caching performance is high.
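The indexing scheme in the abstract — an index derived from the keyword and its length, a singly linked list of data addresses per index, with list heads stored in a hash table — can be sketched as follows. The bucket count and the particular hash function are illustrative assumptions, not details from the patent.

```python
NBUCKETS = 1024                      # assumed number of hash-table buckets

class Node:
    """One singly-linked-list entry: a keyword and its data address."""
    def __init__(self, key, addr, next_node):
        self.key, self.addr, self.next = key, addr, next_node

buckets = {}                         # hash table: index -> head of list

def index_of(key: bytes) -> int:
    # Index computed from the keyword and its length, as the abstract states.
    return hash((key, len(key))) % NBUCKETS

def put(key: bytes, addr: int):
    i = index_of(key)
    buckets[i] = Node(key, addr, buckets.get(i))   # new head goes into the table

def get(key: bytes):
    node = buckets.get(index_of(key))
    while node is not None:          # walk the chain; colliding keys share a list
        if node.key == key:
            return node.addr
        node = node.next
    return None

put(b"user:42", 7)                   # e.g. data cached in small unit 7
print(get(b"user:42"))               # 7
```

Chaining via singly linked lists means two keywords that hash to the same index coexist in one bucket, which is one way the scheme avoids a rigid one-to-one key/value layout.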

Description

Technical Field

[0001] The invention belongs to the technical field of computer cache management, and in particular relates to a local data cache management method and device.

Background Technique

[0002] With the rapid development of computers, the requirements for the speed of CPU access to data are getting higher and higher; correspondingly, various caching technologies have appeared one after another and are widely used.

[0003] Among them, Memcached is a high-performance distributed memory object caching system for dynamic web applications that reduces database load. It reduces the number of database reads by caching data and objects in memory, thereby improving the speed of database-driven websites. Memcached also considers factors such as distribution, clustering, and network transmission of data, and designs corresponding memory management methods and data structures, using contiguous memory allocation, the SLAB memory management method, hash...

Claims


Application Information

IPC(8): G06F12/08; G06F12/0871
Inventor 谭兰春
Owner TCL CORPORATION