
Cache acceleration management method, system and device for volumes in storage, and storage medium

A management method and management system technology, applied in the fields of electrical digital data processing, data processing input/output processes, instruments, etc., which can solve problems such as the inability to apply cache acceleration, reduced management efficiency, and the inability to select a single cache partition for all volumes.

Active Publication Date: 2018-11-20
ZHENGZHOU YUNHAI INFORMATION TECH CO LTD
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

When multiple volumes need to be accelerated in batches, different volumes may belong to different IO groups, or to different nodes within the same IO group. It is therefore usually impossible to select one cache partition capable of accelerating all of the volumes to be accelerated; in other words, these volumes cannot all be cache-accelerated by a single cache partition.
Therefore, for such batches of volumes to be accelerated, the solution in the prior art is to filter the cache partitions for each volume in turn and determine the cache partition corresponding to each volume. Because the number of volumes is large, this approach greatly reduces the management efficiency of batch cache acceleration of volumes.
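For contrast, a minimal sketch of this prior-art flow, in Python. The helper names and the exact eligibility rule are assumptions made for illustration, not taken from the patent text; the point is simply that the partition-filtering work is repeated once per volume.

    def eligible(partition, volume):
        # Assumed rule: same IO group, and either a dual-node partition or a
        # partition created on the volume's owning node.
        return (partition["io_group_id"] == volume["io_group_id"]
                and (partition["mode"] == "dual-node"
                     or partition["node_id"] == volume["node_id"]))

    def accelerate_in_batches_prior_art(volumes, partitions):
        # Prior-art flow: each volume filters the partition list on its own,
        # so the filtering work is repeated once per volume to be accelerated.
        bindings = {}
        for volume in volumes:                                # one pass per volume
            match = next((p for p in partitions if eligible(p, volume)), None)
            if match is not None:
                bindings[volume["id"]] = match["id"]          # bind volume to a partition
        return bindings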




Detailed Description of the Embodiments

[0045] The core of the present invention is to provide a cache acceleration management method for volumes in storage, which determines all the volumes whose cache acceleration can be managed by a cache partition, thereby improving the management efficiency of batch cache acceleration of volumes.

[0046] In order to enable those skilled in the art to better understand the solution of the present invention, the present invention will be further described in detail below with reference to the accompanying drawings and specific embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0047] Please refer to Figure 1, which is an implementation flowchart of a cache acceleration management metho...



Abstract

The invention discloses a cache acceleration management method for volumes in storage. The method comprises the steps of: selecting a cache partition; acquiring data information of the cache partition, wherein the data information comprises the mode of the cache partition, the node ID of the cache partition and the IO group ID of the cache partition; traversing all the volumes in the storage and determining the node ID and the IO group ID of each volume; determining, according to the data information, the node ID of each volume and the IO group ID of each volume, a volume set serving as the acceleration object of the cache partition from all the traversed volumes; and sending acceleration start instructions to any number of the volumes in the volume set so as to achieve cache acceleration of the volumes that receive the acceleration start instructions. By applying the method, all the volumes whose cache acceleration can be managed by the cache partition are determined, so that the management efficiency of batch cache acceleration of the volumes is improved. The invention further discloses a cache acceleration management system and device for the volumes in storage, and a storage medium, which have the corresponding technical effects.
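A minimal sketch of the partition-centric flow summarized in the abstract, in Python. The CachePartition and Volume records, the send_start_instruction callback, and the exact matching rule per partition mode are assumptions made for illustration (a dual-node partition is taken to cover its whole IO group, a single-node partition only the volumes owned by its node); this is not the patented implementation itself.

    from dataclasses import dataclass
    from typing import Callable, Iterable, List

    # Placeholder records standing in for the storage system's objects; the
    # field names mirror the data information listed in the abstract.
    @dataclass
    class CachePartition:
        id: str
        mode: str          # e.g. "single-node" or "dual-node" (assumed values)
        node_id: str
        io_group_id: str

    @dataclass
    class Volume:
        id: str
        node_id: str
        io_group_id: str

    def volume_set_for(partition: CachePartition, volumes: Iterable[Volume]) -> List[Volume]:
        # Traverse all volumes once and keep those the selected partition can
        # serve. The matching rule is an assumption: a dual-node partition
        # covers its whole IO group, a single-node partition only the volumes
        # owned by its node.
        selected = []
        for v in volumes:
            if v.io_group_id != partition.io_group_id:
                continue
            if partition.mode == "dual-node" or v.node_id == partition.node_id:
                selected.append(v)
        return selected

    def start_batch_acceleration(partition: CachePartition,
                                 volumes: Iterable[Volume],
                                 send_start_instruction: Callable[[str, str], None]) -> None:
        # One traversal yields every volume the partition can manage;
        # acceleration start instructions can then be sent to any number of
        # them in a single batch.
        for v in volume_set_for(partition, volumes):
            send_start_instruction(partition.id, v.id)

In this inverted arrangement, the volume list is traversed once per selected cache partition, instead of the cache-partition list being filtered once per volume as in the prior art.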

Description

Technical Field

[0001] The present invention relates to the technical field of storage, and in particular to a cache acceleration management method, system, device and storage medium for volumes in storage.

Background Technique

[0002] A volume, usually also called a disk or partition, is the basic container unit for file management. According to its function, it can be divided into a production volume, mirror volume, snapshot volume, clone volume, copy volume, etc.

[0003] In order to improve the performance of volumes on storage, cache partitions can be created to accelerate the volumes, for example by creating cache partitions on SSDs (Solid State Drives), and one cache partition can accelerate multiple volumes. When creating a cache partition, it can be created through one or more nodes of an IO group. For example, a common IO group has two nodes, node A and node B, and a cache partition can be created through node A or node B alon...
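To make the node A / node B example above concrete, a small illustration of how a cache partition's recorded data information (mode, node ID, IO group ID) could depend on which nodes it was created through; the field names, mode values and identifiers are assumptions, not taken from the patent text.

    # Illustration of the node A / node B example above: a cache partition can
    # be created through one node of an IO group or through both, and the data
    # information recorded for it differs accordingly.
    def describe_partition(io_group_id, created_on_nodes):
        if len(created_on_nodes) == 1:
            return {"mode": "single-node",
                    "node_id": created_on_nodes[0],
                    "io_group_id": io_group_id}
        return {"mode": "dual-node", "node_id": None, "io_group_id": io_group_id}

    # An IO group with two nodes, node A and node B:
    print(describe_partition("iogrp0", ["nodeA"]))            # created through node A alone
    print(describe_partition("iogrp0", ["nodeA", "nodeB"]))   # created through both nodes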

Claims


Application Information

IPC(8): G06F3/06
CPC: G06F3/0611; G06F3/0656; G06F3/0665
Inventor: 郭坤
Owner: ZHENGZHOU YUNHAI INFORMATION TECH CO LTD