
Method and device for processing node cache data in distributed system

A distributed-system cache-data technology, applied in the Internet field, that addresses problems such as slow running speed, heavy occupation of system disk resources, and long processing time, and achieves the effects of an optimized processing method, faster loading, and reduced time cost.

Active Publication Date: 2019-03-01
BEIJING QIHOO TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] However, when a node is restarted, any cache that has not been stored persistently is lost, and that loss causes the system to run more slowly for a period of time, giving users a poor experience. Therefore, in the prior art, to avoid cache loss, the cached data is generally stored persistently. When the amount of cached data is large, however, persisting it not only takes a great deal of time but also occupies substantial system disk resources.

Method used




Embodiment Construction

[0021] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided for a more thorough understanding of the present disclosure and to fully convey its scope to those skilled in the art.

[0022] Figure 1 is a schematic flowchart of a method for processing node cache data in a distributed system according to an embodiment of the present invention. As shown in Figure 1, the method includes the following steps:

[0023] Step S100, read at least one piece of cached data of the node, where the cached data is in the form of a data key-value pair.

[0024] In step S100, at least one piece of cached data of th...
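The key-extraction and persistence step that the abstract describes (for each cached key-value pair, persist only the data key, not the value) can be sketched as follows. This is a minimal illustration, not the patented implementation; all names, the use of JSON, and the file layout are assumptions.

```python
import json
import os
import tempfile

def persist_cache_keys(cache, path):
    """Persist only the data keys (not the values) of the node cache.

    Storing the keys alone is what the method credits with cutting
    persistence time and disk usage versus dumping the full cache.
    """
    with open(path, "w", encoding="utf-8") as f:
        json.dump(sorted(cache.keys()), f)

# Hypothetical cache of data key-value pairs (contents are illustrative).
cache = {"user:1": {"name": "alice"}, "user:2": {"name": "bob"}}
keys_path = os.path.join(tempfile.mkdtemp(), "cache_keys.json")
persist_cache_keys(cache, keys_path)
```

Because only the keys are written, the persisted file stays small even when the cached values are large, which is the saving the summary attributes to the scheme.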


Abstract

The invention discloses a method and apparatus for processing node cache data in a distributed system. The method comprises: reading at least one piece of cached data of a node, wherein each piece of cached data is in the form of a data key-value pair; for each piece of cached data, extracting the corresponding data key and persistently storing it; and, when the node is restarted, loading the data keys into the cache of the node. According to this scheme, the time for persistently storing the cached data is greatly shortened, the occupation of system disk resources is reduced, and the loading speed is increased.
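The restart step in the abstract (loading the persisted data keys back into the node's cache) might look like the sketch below. The storage format and the placeholder-value approach are assumptions: the text says only that the keys are loaded into the cache, so we leave the values empty to be refilled from the database on first access.

```python
import json
import os
import tempfile

# Simulate a previously persisted key file (assumption: a JSON list of
# keys; the patent does not specify the storage format).
keys_path = os.path.join(tempfile.mkdtemp(), "keys.json")
with open(keys_path, "w", encoding="utf-8") as f:
    json.dump(["user:1", "user:2"], f)

def load_keys_into_cache(path):
    """On node restart, load the persisted data keys into an empty
    in-memory cache. Values start as None placeholders, to be filled
    from the database on first read."""
    with open(path, encoding="utf-8") as f:
        return {key: None for key in json.load(f)}

cache = load_keys_into_cache(keys_path)
```

Loading keys alone is fast because the key set is tiny compared with the full cache contents, while still telling the node which entries are worth re-fetching.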

Description

technical field

[0001] The invention relates to the technical field of the Internet, and in particular to a method and device for processing node cache data in a distributed system.

Background technique

[0002] A cache is a buffer for data exchange. When the system needs to read a piece of data, it first looks it up in the cache; if the data is found in the cache, it is used directly, and only if it is not found there is it looked up in the database. Since reading from the cache is much faster than searching the database, the cache helps the system run quickly and reduces system latency.

[0003] However, when a node is restarted, any cache that has not been stored persistently is lost, and that loss causes the system to run more slowly for a period of time, giving users a poor experience. Therefore, in the prior art, in...
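The lookup order described in the background (cache first, then database, then populate the cache) is the standard cache-aside pattern. A minimal sketch, with toy dictionaries standing in for the real cache and database:

```python
def read(key, cache, db):
    """Cache-aside read as described in the background section:
    check the cache first; on a miss, fall back to the (slower)
    database and populate the cache for later reads."""
    if key in cache:
        return cache[key]          # cache hit: fast path
    value = db.get(key)            # cache miss: slower database lookup
    if value is not None:
        cache[key] = value         # warm the cache for the next read
    return value

# Toy stand-ins for the database and the node cache (illustrative only).
db = {"user:1": "alice"}
cache = {}
read("user:1", cache, db)   # miss: fetched from db and cached
read("user:1", cache, db)   # hit: served from cache
```

The second call never touches the database, which is the speedup the background paragraph attributes to caching.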

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F16/2455, G06F16/2458
CPC: G06F16/24552, G06F16/2471
Inventor: 许瑞亮, 陈宗志
Owner: BEIJING QIHOO TECH CO LTD