
Cache management method of single-carrier multi-target cache system

A cache system and cache management technique, applicable to memory systems, input/output to record carriers, and electrical digital data processing. It addresses the large amount of cache space wasted when disks are used at different frequencies, and achieves good IO performance.

Active Publication Date: 2014-07-23
JIANGSU DAWN INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0008] To eliminate the large amount of cache space wasted when disks are used at different frequencies, the present invention provides a cache management method for a single-carrier multi-target cache system.



Embodiment Construction

[0021] In implementation, the single cache device is artificially divided into multiple shares; the number of shares and the size of each share are determined by the number and capacity of the disk devices to be cached. Each disk device is then assigned one share, and the system saves all of the mapping relationships. When an IO request is received, the system first determines which disk device the IO operation belongs to. From the mapping between that disk device and the cache device, the offset address of the corresponding cache region is obtained; since the size of that region is fixed, the IO address can be mapped into the cache device's address space according to the rules of set-associative placement. Read and write operations then proceed using the resulting cache device address.
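The address translation described in [0021] can be sketched as follows. This is a minimal illustration, not the patent's implementation: the block size, associativity, region table, and all function names are assumptions chosen for clarity.

```python
# Illustrative sketch of mapping an IO request to a disk's private cache
# region using set-associative placement. All constants are assumed.

BLOCK_SIZE = 4096          # cache block size in bytes (assumed)
SET_ASSOCIATIVITY = 4      # ways per set (assumed)

# Mapping table saved by the system: disk id -> (offset, size) of the
# share it owns inside the single shared cache device (assumed layout).
cache_regions = {
    "disk0": (0,                 64 * 1024 * 1024),
    "disk1": (64 * 1024 * 1024,  32 * 1024 * 1024),
}

def map_io_to_cache(disk_id, disk_addr):
    """Translate a disk IO address into the candidate cache-device
    addresses (one per way) of the set it maps to."""
    region_offset, region_size = cache_regions[disk_id]
    num_blocks = region_size // BLOCK_SIZE
    num_sets = num_blocks // SET_ASSOCIATIVITY
    block_index = disk_addr // BLOCK_SIZE
    set_index = block_index % num_sets       # set-associative placement
    set_base = region_offset + set_index * SET_ASSOCIATIVITY * BLOCK_SIZE
    return [set_base + way * BLOCK_SIZE for way in range(SET_ASSOCIATIVITY)]
```

Because each disk's region has a fixed offset and size, the lookup is a pure arithmetic mapping; no search over the whole cache device is needed.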

[0022] The whole process is as shown in Figure 4.

[0023] Wh...


Abstract

The invention provides a cache management method for a single-carrier multi-target cache system, comprising the following steps: when a user's read or write operation misses in the corresponding cache device, request allocation of a new cache block to store the data; if an idle cache block exists in the available cache address space, allocate it directly; if no idle cache block exists, increment the insufficiency count of that cache region by 1; judge whether the count exceeds a preset threshold, and if not, perform the cache replacement operation; if the count exceeds the threshold, check the cache block information of the other disks; if the other disks have no idle cache blocks, perform the cache replacement operation; otherwise write all cached data back to the respective hard disks and reallocate the cache block space according to usage. The invention allocates an independent cache space to each disk and applies set-associative mapping to each disk separately; when the system detects that a disk needs more or less cache space, the cache device can be automatically repartitioned, so that the whole system obtains better IO (input/output) performance.
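The allocation flow in the abstract can be sketched as the following decision routine. This is a hedged illustration under assumed names: the threshold value, the `DiskCache` class, and the returned action strings are inventions for this sketch, not the patent's code.

```python
# Sketch of the miss-handling flow from the abstract. Names and the
# threshold value are assumptions for illustration only.

SHORTAGE_THRESHOLD = 16  # assumed preset threshold

class DiskCache:
    """Per-disk view of its private region in the shared cache device."""
    def __init__(self, free_blocks):
        self.free_blocks = free_blocks   # idle cache blocks in this region
        self.shortage_count = 0          # "insufficiency" counter

def allocate_block(target, other_disks):
    """Handle a cache miss on `target`; return the action taken."""
    if target.free_blocks > 0:           # idle block exists: allocate directly
        target.free_blocks -= 1
        return "allocated"
    target.shortage_count += 1           # no idle block: count the shortage
    if target.shortage_count <= SHORTAGE_THRESHOLD:
        return "replace"                 # below threshold: normal replacement
    # Threshold exceeded: inspect the other disks' regions.
    if any(d.free_blocks > 0 for d in other_disks):
        # Another disk has spare capacity: flush all cached data back to
        # the hard disks and repartition the cache device by usage.
        return "rebalance"
    return "replace"                     # no idle blocks anywhere
```

The counter keeps a single transient shortage from triggering an expensive full write-back and repartition; only a sustained imbalance does.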

Description

Technical Field

[0001] The invention relates to the field of storage system cache management, and in particular to a method for dynamically adjusting the sharing ratio among multiple target disks that share a single cache device, so as to ensure balanced performance improvement.

Background

[0002] Disks have long held a place in computer systems thanks to their large capacity and low price, but their internal mechanical components limit further speed improvement; especially for discontinuous IO access, disk performance lags far behind the development of memory. Disk access speed has therefore always been the bottleneck of IO-intensive applications. To improve disk read and write performance, disk manufacturers add a cache (cache memory) inside the disk. The cache is a memory chip on the hard disk controller with extremely fast access speed; it is the interface between the disk's internal storage and the external i...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/08, G06F3/06, G06F12/0871
Inventors: 袁清波, 邵宗有, 刘新春
Owner: JIANGSU DAWN INFORMATION TECH CO LTD