Data caching method

A data caching method, applied in the field of electrical digital data processing, memory systems, instruments, etc. It addresses the problems of memory jitter, sudden memory reduction, and low efficiency, and achieves the effect of avoiding memory jitter and improving comparison efficiency and speed.

Publication Date: 2016-01-13 (Inactive)
Applicant: RUN TECH CO LTD BEIJING

AI Technical Summary

Problems solved by technology

[0004] In the above scheme, incoming data must be compared with all of the data already in memory, so efficiency is low. In addition, the regular cleaning of the memory causes memory usage to drop suddenly, and when the amount of data to be accessed is large, memory usage suddenly increases again, which leads to memory jitter.

Method used



Examples


Embodiment 1

[0025] Figure 1 is a flow chart of the data caching method provided in Embodiment 1 of the present invention. This embodiment is applicable to any data caching scenario. The method can be executed by a data processing device configured on a terminal and specifically includes the following steps:

[0026] S110. Initialize the memory to establish a first linked list, a hash table, and a second linked list, where the nodes of the first linked list are used to store the storage addresses of the storage units in the memory, the nodes of the hash table are used to store the hash values of the data, and the nodes of the second linked list are used to store the storage addresses of the corresponding storage units according to the storage order of the data;

[0027] The storage order of the storage addresses in the nodes of the second linked list may be the read-in time order: the address of the data read in first is stored in the head node of the second linked list, and the other nodes are stored analogously in the stor...
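To make the structures of S110 concrete, here is a minimal sketch in Python. The names memory, free_list, hash_table, and used_list and the memory size are illustrative assumptions, and Python deques merely stand in for the patent's linked lists.

```python
from collections import deque

# Illustrative sketch of S110; names and sizes are assumptions, not from the patent.
MEMORY_SIZE = 4  # number of storage units in the simulated memory

# "Memory": a list of storage units, addressed by index.
memory = [None] * MEMORY_SIZE

# First linked list: storage addresses of free storage units.
free_list = deque(range(MEMORY_SIZE))

# Hash table: hash value of stored data -> storage address.
hash_table = {}

# Second linked list: storage addresses in read-in (time) order;
# the head node holds the address of the earliest stored data.
used_list = deque()
```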

Embodiment 2

[0039] Figure 2 is a flow chart of the data caching method provided in Embodiment 2 of the present invention. This embodiment builds on the foregoing embodiment and provides a specific implementation of data caching; that is, step S130 of the data caching process specifically includes:

[0040] S131. Determine whether the first linked list contains the storage address of a free storage unit; if so, perform the storage operation, and if not, obtain the head node of the second linked list;

[0041] The head node of the second linked list stores the storage address of the stored data with the earliest read-in time.

[0042] S132. Clear the storage address in the head node, and delete the hash value of the stored data corresponding to the cleared storage address from the hash table;

[0043] S133. Store the data in the storage unit corresponding to the cleared storage address, and store the hash value of the data in the hash table; ...
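Putting S131 to S133 together, the following is a minimal sketch that continues the illustrative structures (memory, free_list, hash_table, used_list) from the Embodiment 1 sketch above. The function name store_data and the use of Python's built-in hash are assumptions for illustration, not part of the patent.

```python
def store_data(data):
    """Sketch of S131-S133: store new data, evicting the oldest entry if needed."""
    h = hash(data)  # stand-in for the patent's hash-value calculation

    # S131: look for the storage address of a free storage unit in the first linked list.
    if free_list:
        addr = free_list.popleft()
    else:
        # No free unit: take the head node of the second linked list,
        # which holds the address of the earliest stored data.
        addr = used_list.popleft()
        # S132: drop the old data's hash value and clear the storage unit.
        hash_table.pop(hash(memory[addr]), None)
        memory[addr] = None

    # S133: store the data and record its hash value and address.
    memory[addr] = data
    hash_table[h] = addr
    used_list.append(addr)
    return addr
```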



Abstract

The invention discloses a data caching method, which comprises the following steps: initializing the memory to establish a first linked list, a hash table, and a second linked list; when access data is received, calculating the hash value of the data and comparing it with the hash values in the hash table; if no consistent hash value exists, obtaining the storage address of a free storage unit from the first linked list, storing the data in the free storage unit, storing the storage address of the free storage unit in the second linked list in order, and storing the hash value of the data in the hash table; and if a consistent hash value exists, stopping the write operation of the access data to the memory. Through the arrangement of the first linked list and the second linked list, the earliest stored data is deleted in a circular manner, so the memory jitter phenomenon is avoided; and when access data is stored, only the hash values of the data are compared, which improves the comparison efficiency and speed.
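Read end to end, the abstract's flow might look like the following self-contained sketch. The class name DataCache, its access method, and the use of Python's built-in hash are illustrative assumptions modeling one plausible reading of the method, not the patent's own code.

```python
from collections import deque

class DataCache:
    """Illustrative sketch of the abstract's flow; structure names are assumptions."""

    def __init__(self, size):
        self.memory = [None] * size          # storage units
        self.free_list = deque(range(size))  # first linked list: free addresses
        self.hash_table = {}                 # hash value -> storage address
        self.used_list = deque()             # second linked list: addresses in read-in order

    def access(self, data):
        h = hash(data)  # stand-in for the patent's hash-value calculation
        # Compare only the hash value with those already in the hash table.
        if h in self.hash_table:
            # A consistent hash value exists: stop the write and reuse the stored data.
            return self.memory[self.hash_table[h]]
        # No consistent hash value: obtain a free address, evicting the oldest if needed.
        if self.free_list:
            addr = self.free_list.popleft()
        else:
            addr = self.used_list.popleft()  # head node: earliest stored data
            self.hash_table.pop(hash(self.memory[addr]), None)
        self.memory[addr] = data
        self.used_list.append(addr)
        self.hash_table[h] = addr
        return data
```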

Description

Technical field

[0001] Embodiments of the present invention relate to data storage technologies, and in particular, to a data caching method.

Background technique

[0002] To increase the speed of data access, existing computers need to write the data to be accessed into memory and then access it there. Memory space is limited, so expired data must be cleaned up regularly to keep sufficient memory space available.

[0003] The caching method in the prior art is as follows: when data to be accessed is received, it is first compared with the data already stored in memory; if the data already exists, the incoming data is discarded and the existing data in memory is accessed directly; if not, the data is written into memory. Memory data is scanned at regular intervals, and expired data whose storage time exceeds a set value is deleted. The above caching scheme can be applied to various scenarios, such as traffic analysis and statistics, that is, the data to be accessed is traffic, and when ...
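For contrast, the prior-art scheme described in [0003] can be sketched roughly as follows. The names access_prior_art and periodic_cleanup and the expiry threshold are illustrative assumptions; the linear comparison against all stored data and the bulk deletion during cleanup are what the background attributes the low efficiency and memory jitter to.

```python
import time

# Rough sketch of the background scheme in [0003]; names and values are assumptions.
EXPIRY_SECONDS = 60.0   # assumed expiry threshold
store = []              # list of (data, stored_at) pairs kept in memory

def access_prior_art(data):
    # Compare the incoming data with all data already in memory (linear scan, low efficiency).
    for stored, _ in store:
        if stored == data:
            return stored  # data already exists: discard the incoming copy
    store.append((data, time.time()))
    return data

def periodic_cleanup():
    # Regular cleanup removes every expired entry in one pass, so memory usage
    # drops sharply and later refills as new data arrives (memory jitter).
    now = time.time()
    store[:] = [(d, t) for d, t in store if now - t <= EXPIRY_SECONDS]
```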


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/0871
Inventor: 尧津来
Owner: RUN TECH CO LTD BEIJING