
A data caching method of a distributed storage system

A distributed storage and data caching technology, applied to input/output to record carriers and related fields. It addresses the inaccuracy of prior per-application block-size strategies and their inability to prevent cache pollution, and achieves the effects of improving service quality, optimizing the use of cache resources, and avoiding cache pollution.

Active Publication Date: 2019-06-28
SHENZHEN POWER SUPPLY BUREAU +1

AI Technical Summary

Problems solved by technology

That prior solution emphasizes allocating different cache partitions, and therefore different data block sizes, to different applications, rather than defining a caching strategy or changing the flushing strategy. In a distributed storage system, however, an application's data may be split and stored across multiple hard disks on multiple servers, so selecting a data block size per application model may be inaccurate, and it cannot solve the cache pollution caused by certain types of data in a distributed storage system (such as redundant data or replica data).

Method used



Embodiment Construction

[0036] The following descriptions of various embodiments refer to the accompanying drawings to illustrate specific embodiments in which the present invention can be implemented.

[0037] In a storage system, caching is often used to improve performance. Caches exist widely in computer systems, for example between the CPU and main memory, or between main memory and an external hard disk. The capacity of a cache is generally small, but its speed is higher than that of the low-speed device it fronts. Placing a cache in the system increases the read and write speed of the low-speed device and improves the performance of the entire system.

[0038] Because the capacity of the cache is much smaller than that of the low-speed device, the data in the cache must be swapped in and out. Taking an HDD as the low-speed device and a small amount of SSD as the cache as an example, the data read from the HDD and stored in the SSD is the same as th...
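To make the swap-in/swap-out behaviour in paragraph [0038] concrete, here is a minimal Python sketch of a read-through cache in front of a slow device, assuming a dict-like backing store and an LRU replacement policy. The class name ReadThroughCache and the LRU choice are illustrative assumptions; the patent text does not prescribe a particular replacement policy.

```python
from collections import OrderedDict

class ReadThroughCache:
    """Small, fast cache (e.g. SSD) in front of a large, slow device (e.g. HDD).
    On a miss the block is read from the slow device and swapped into the cache;
    when the cache is full, the least recently used block is swapped out."""

    def __init__(self, slow_device, capacity_blocks):
        self.slow_device = slow_device      # dict-like: block_id -> data
        self.capacity = capacity_blocks
        self.cache = OrderedDict()          # LRU order: oldest entry first

    def read(self, block_id):
        if block_id in self.cache:          # cache hit: fast path
            self.cache.move_to_end(block_id)
            return self.cache[block_id]
        data = self.slow_device[block_id]   # cache miss: read from slow device
        self.cache[block_id] = data         # swap the block in
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # swap the oldest block out
        return data

# Example: a 2-block cache over a 4-block "HDD"
hdd = {f"blk{i}": f"data{i}" for i in range(4)}
cache = ReadThroughCache(hdd, capacity_blocks=2)
cache.read("blk0"); cache.read("blk1"); cache.read("blk2")  # blk0 is evicted
```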



Abstract

The invention provides a data caching method for a distributed storage system. The method comprises the steps of: classifying the data issued to each cache unit in the distributed storage system that requires caching according to different dimensions, defining a label for each data type, and having the data carry its label; identifying the label of data requested to be stored to obtain its data type; and caching the data according to the data type and a preset storage strategy. With this method, disk-flushing efficiency can be greatly improved, the use of cache resources is optimized, cache pollution is avoided, and the caching technology improves the service quality of the whole distributed storage system.
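The classify-label-identify-cache flow of the abstract could be sketched roughly as below. This is an illustrative Python sketch under assumed labels and a simple cache-or-bypass strategy; the actual dimensions, labels, and per-type strategies are defined by the patent's embodiments, not by this code.

```python
from enum import Enum

# Illustrative labels; the patent only says data is classified along
# "different dimensions" and that each data type is given a label.
class DataLabel(Enum):
    HOT_APP_DATA = 1   # frequently re-read application data
    REPLICA = 2        # copy data kept for redundancy
    REDUNDANT = 3      # e.g. parity / erasure-coded blocks

class LabelAwareCacheUnit:
    """Sketch of a cache unit that reads the label carried by incoming
    data and applies a preset per-label storage strategy."""

    def __init__(self, strategy=None):
        self.cache = {}  # block_id -> data (eviction omitted for brevity)
        # Preset strategy: labels mapped to False bypass the cache, which is
        # one way to keep replica/redundant data from polluting it.
        self.strategy = strategy or {
            DataLabel.HOT_APP_DATA: True,
            DataLabel.REPLICA: False,
            DataLabel.REDUNDANT: False,
        }

    def write(self, block_id, data, label):
        if self.strategy.get(label, False):
            self.cache[block_id] = data   # cached now, flushed to disk later
            return "cached"
        return "bypassed"                 # written straight to the backing disk

# Example: hot data is cached, a replica block bypasses the cache
unit = LabelAwareCacheUnit()
unit.write("blk1", b"...", DataLabel.HOT_APP_DATA)  # -> "cached"
unit.write("blk2", b"...", DataLabel.REPLICA)       # -> "bypassed"
```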

Description

Technical Field

[0001] The invention relates to the field of distributed storage, and in particular to a data caching method for a distributed storage system.

Background Technique

[0002] The patent with publication number CN103279429A discloses an application-aware distributed global shared cache partition method. Cache resources are partitioned and managed per application, and each independent cache partition selects an appropriate data block size according to the application's load characteristics to improve cache resource utilization and hit rate. During system operation, a cache partition can reclaim cache resources through an application-aware cache recovery strategy, realizing application-level differentiated cache services and allocating more cache resources to key applications. The on-demand cache allocation mechanism, combined with priority-based reclamation, makes it possible to use different cache partition sizes for differen...
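For context, the per-application partitioning that the cited patent CN103279429A describes might be sketched roughly as follows. The weighting scheme, the block-size heuristic, and all names here are assumptions made for illustration; they are not taken from that patent.

```python
class AppCachePartition:
    """Illustrative per-application cache partition: its own share of the
    shared cache and a block size chosen from the application's load."""

    def __init__(self, app_name, partition_bytes, block_bytes):
        self.app_name = app_name
        self.block_bytes = block_bytes
        self.num_blocks = partition_bytes // block_bytes
        self.blocks = {}

def partition_cache(total_bytes, apps):
    """Split a shared cache among applications by priority weight and pick
    a block size per application (larger blocks for sequential workloads)."""
    total_weight = sum(a["priority"] for a in apps)
    partitions = []
    for a in apps:
        share = total_bytes * a["priority"] // total_weight
        block = 1 << 20 if a["sequential"] else 4 << 10  # 1 MiB vs 4 KiB
        partitions.append(AppCachePartition(a["name"], share, block))
    return partitions

# Example: a 1 GiB cache shared by a database and a backup job
parts = partition_cache(1 << 30, [
    {"name": "db", "priority": 3, "sequential": False},
    {"name": "backup", "priority": 1, "sequential": True},
])
```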

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/06
Inventors: 冷迪, 黄建华, 陈瑞, 吕志宁, 庞宁, 花瑞, 邱尚, 高文, 刘飞
Owner: SHENZHEN POWER SUPPLY BUREAU