
Client caching method and system based on submodular optimization algorithm

An optimization-algorithm and caching-system technology, applied to file systems, computing, and resource allocation, which addresses the difficulty of efficiently serving frequent accesses to small-scale data and achieves improved performance by optimizing the storage and network communication performance of the client cache

Pending Publication Date: 2020-05-19
Applicant: 江苏鸿程大数据技术与应用研究院有限公司

AI Technical Summary

Problems solved by technology

Relying on server-side caching alone makes it difficult to efficiently serve the emerging application scenarios that access small-scale data frequently.
In addition, existing client caches are organized in units of fixed-size file blocks or objects, and there is currently no related technology or research that addresses the caching of variable-length file fragments under complex access patterns involving coincidence, coverage, and intersection.
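
To make these access patterns concrete, the following is a minimal sketch, not the patent's implementation, of how the relationship between a requested byte range and an already cached variable-length fragment might be classified; the Fragment type and classify function are illustrative names only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fragment:
    """A variable-length file fragment addressed as a byte range [start, end)."""
    start: int
    end: int

def classify(request: Fragment, cached: Fragment) -> str:
    """Classify how a requested range relates to an already-cached fragment."""
    if request.start == cached.start and request.end == cached.end:
        return "coincide"   # identical ranges: a full cache hit
    if cached.start <= request.start and request.end <= cached.end:
        return "covered"    # the cached fragment covers the request: a full hit
    if request.start <= cached.start and cached.end <= request.end:
        return "covers"     # the request covers the cached fragment: merge candidate
    if request.start < cached.end and cached.start < request.end:
        return "intersect"  # partial overlap: only the overlapping bytes can hit
    return "disjoint"

# Example: a request for bytes [2048, 8192) against a cached fragment [0, 4096)
# overlaps only partially, so just [2048, 4096) can be served from the client cache.
print(classify(Fragment(2048, 8192), Fragment(0, 4096)))  # -> "intersect"
```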




Embodiment Construction

[0028] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0029] Referring to Figures 1-5, the present invention provides the following technical solution:

[0030] A client caching method based on a submodular optimization algorithm: after the client inputs an access unit, it is processed by the submodular optimization algorithm to determine whether it needs to be cached; the access units that need to be cached are assembled into the to-be-cached input set to form a batch input set, and the cache space is updated ...
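
The paragraph above does not spell out the submodular objective, so the sketch below only illustrates the general shape of such a step: a greedy pass that admits the access units of a batch in order of marginal gain under a toy coverage-style utility. The utility, marginal_gain, and capacity names are assumptions for illustration, not the patent's model.

```python
from typing import List, Set

def utility(cache: Set[str]) -> int:
    # A toy submodular set function: the number of distinct fragments the cache holds.
    return len(cache)

def marginal_gain(cache: Set[str], unit: str) -> int:
    # f(S ∪ {u}) - f(S), the quantity that greedy submodular maximization ranks by.
    return utility(cache | {unit}) - utility(cache)

def admit_batch(cache: Set[str], batch: List[str], capacity: int) -> Set[str]:
    """Greedily admit the highest-gain access units while cache capacity remains."""
    for unit in sorted(batch, key=lambda u: marginal_gain(cache, u), reverse=True):
        if len(cache) >= capacity:
            break
        if marginal_gain(cache, unit) > 0:  # skip units the cache already covers
            cache = cache | {unit}
    return cache

cache = {"fileA:0-4096"}
batch = ["fileA:0-4096", "fileB:0-1024", "fileC:512-2048"]
print(admit_batch(cache, batch, capacity=3))
```

A one-pass greedy of this kind is the standard approach to monotone submodular maximization under a cardinality constraint, where it carries a (1 − 1/e) approximation guarantee, which is the usual motivation for casting cache admission as a submodular optimization problem.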



Abstract

The invention discloses a client caching method and system based on a submodular optimization algorithm. The method comprises the following steps: after the client inputs an access unit, submodular optimization algorithm processing is carried out to judge whether the access unit needs to be cached, and the set of access units that need to be cached forms a batch input set within the to-be-cached input set. The cache space is updated according to the data of the batch input set, a cache management mechanism is established on the basis of a three-layer index management unit, and a series of operators oriented to scenarios such as file fragment overlapping, covering, and crossing are formulated, so that cache units under complex access patterns can be managed efficiently. The model abstracts the caching problem into a submodular function optimization problem and applies the submodular optimization algorithm to the cache migration strategy; it provides synchronous / asynchronous cache replacement / promotion strategies for different application program running modes, and in addition includes multiple system optimization methods, so that the storage and network communication performance of the client cache is optimized.
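
The abstract names a three-layer index management unit without detailing its layers. Under the assumption that the layers are file id, ordered fragment offsets, and cached fragment bytes, a minimal sketch of such an index might look as follows; only the fully covered hit case is shown, and the ClientCache class and its methods are hypothetical.

```python
from bisect import bisect_right, insort
from typing import Dict, Optional

class ClientCache:
    """Hypothetical three-layer index: file id -> ordered fragment offsets -> cached bytes."""

    def __init__(self) -> None:
        self._files: Dict[str, dict] = {}              # layer 1: per-file entries

    def put(self, file_id: str, start: int, data: bytes) -> None:
        entry = self._files.setdefault(file_id, {"starts": [], "frags": {}})
        if start not in entry["frags"]:
            insort(entry["starts"], start)             # layer 2: sorted fragment offsets
        entry["frags"][start] = data                   # layer 3: the cached fragment bytes

    def get(self, file_id: str, start: int, length: int) -> Optional[bytes]:
        entry = self._files.get(file_id)
        if not entry:
            return None
        i = bisect_right(entry["starts"], start) - 1   # last fragment starting at or before `start`
        if i < 0:
            return None
        frag_start = entry["starts"][i]
        frag = entry["frags"][frag_start]
        offset = start - frag_start
        if offset + length <= len(frag):               # the "covered" case: a full hit
            return frag[offset:offset + length]
        return None                                    # intersecting / partial cases omitted here

cache = ClientCache()
cache.put("fileA", 0, b"x" * 4096)
print(cache.get("fileA", 1024, 512) is not None)       # True: the request is fully covered
```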

Description

Technical Field
[0001] The invention relates to the technical field of distributed file system caching, and in particular to a client caching method and system based on a submodular optimization algorithm.
Background Technique
[0002] The upper-layer application of a distributed file system usually needs to obtain file metadata through RPC communication before using the file system client to perform I/O data transmission. In scenarios such as frequent random reading of repeated file blocks and repeated reading of small files, this generates a large amount of RPC communication overhead, which increases the overall file access time and seriously degrades read performance. When the file system client used by the application and the process that actually stores the data are located on different nodes, the network overhead caused by remote reads further reduces I/O performance. Even if the data requested by the client proce...
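
As a hypothetical illustration of the RPC overhead described above, the sketch below caches file metadata on the client so that repeated reads of the same small file pay the metadata round-trip only once; fetch_metadata_rpc is a placeholder, not a real file system API.

```python
import time
from typing import Dict

def fetch_metadata_rpc(path: str) -> dict:
    # Placeholder for the metadata RPC a distributed file system client must issue
    # before data I/O; the sleep stands in for one network round-trip.
    time.sleep(0.01)
    return {"path": path, "size": 4096, "blocks": ["blk_0"]}

class MetadataCache:
    """Tiny client-side metadata cache: repeated lookups of the same file skip the RPC."""

    def __init__(self) -> None:
        self._entries: Dict[str, dict] = {}

    def lookup(self, path: str) -> dict:
        if path not in self._entries:       # miss: pay the round-trip once
            self._entries[path] = fetch_metadata_rpc(path)
        return self._entries[path]          # hit: served locally, no RPC

cache = MetadataCache()
start = time.perf_counter()
for _ in range(100):
    cache.lookup("/data/small_file")        # 1 RPC instead of 100
print(f"elapsed: {time.perf_counter() - start:.3f}s")
```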

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/172, G06F16/182, G06F16/14, G06F9/50
CPC: G06F16/172, G06F16/182, G06F16/148, G06F9/5016
Inventor: 麦丞程
Owner: 江苏鸿程大数据技术与应用研究院有限公司