
Storage method and device for hybrid cache

A cache and data-block technology applied in the storage field. It solves the problem that randomness and throughput cannot be balanced, and achieves the effects of reducing FTL table-entry operations, improving CPU utilization, and improving command efficiency.

Pending Publication Date: 2022-03-08
尧云科技(西安)有限公司

AI Technical Summary

Problems solved by technology

[0004] The performance of a storage device is commonly measured by its random concurrency and its data throughput. The current common practice, however, is to map with data blocks of a single uniform size, usually matching the data block provided by the operating system (for example 4K, 8K, 16K, 32K, up to 128K). Under this mapping scheme the FTL architecture is simple and easy to manage, but randomness and throughput cannot both be satisfied.
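
To make the trade-off concrete, the short illustrative C program below compares the size of a flat logical-to-physical mapping table at 4K, 32K, and 128K mapping units. The 512 GiB capacity and 4-byte map entries are assumptions chosen for the example, not figures from the application: a fine 4K unit suits random 4K writes but needs 32 times more table entries (and 32 times more FTL updates per 128K sequential write) than a 128K unit.

    /* Illustrative only: mapping-table size at different fixed granularities. */
    #include <stdio.h>

    int main(void) {
        const unsigned long long capacity = 512ULL << 30;            /* 512 GiB device (assumed) */
        const unsigned long long entry_bytes = 4;                     /* assumed size of one map entry */
        const unsigned long long grains[] = { 4096, 32768, 131072 };  /* 4K, 32K, 128K mapping units */

        for (int i = 0; i < 3; i++) {
            unsigned long long entries = capacity / grains[i];
            printf("%4lluK mapping unit: %12llu entries, table ~%llu MiB\n",
                   grains[i] / 1024, entries, (entries * entry_bytes) >> 20);
        }
        return 0;
    }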


Image

  • Storage method and device for hybrid cache (three application drawings)

Examples


Embodiment 1

[0029] An embodiment of the present disclosure discloses a hybrid cache storage method, which includes the following process:

[0030] Divide the data into blocks of preset size;

[0031] Associate the data blocks in the form of a data linked list and write them into the queue cache; after the device receives a command interrupt, start reading and parsing the command cache;
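
As an illustration of the splitting, linking, and queue-cache step above, the following C sketch chains preset-size blocks into a linked list and stores the list head in a simple queue cache. All names, the 4K block size, and the queue depth are assumptions made for the example, not details taken from the application.

    #include <stdlib.h>
    #include <string.h>

    #define BLOCK_SIZE 4096   /* preset block size, assumed to match the OS data block */
    #define QUEUE_DEPTH 64    /* assumed depth of the queue cache */

    typedef struct data_block {
        unsigned long long lba;            /* starting logical address (512-byte sectors) */
        size_t len;                        /* payload length, at most BLOCK_SIZE */
        unsigned char payload[BLOCK_SIZE];
        struct data_block *next;           /* link to the next block of the same command */
    } data_block;

    typedef struct {
        data_block *head[QUEUE_DEPTH];     /* one linked list per queued command */
        int tail;                          /* next free queue slot */
    } queue_cache;

    /* Split `len` bytes at logical address `lba` into BLOCK_SIZE blocks,
     * chain them as a linked list, and store the list head in the queue cache. */
    static int enqueue_write(queue_cache *q, unsigned long long lba,
                             const unsigned char *buf, size_t len) {
        if (q->tail >= QUEUE_DEPTH)
            return -1;                                 /* queue cache full */
        data_block *head = NULL, **link = &head;
        for (size_t off = 0; off < len; off += BLOCK_SIZE) {
            data_block *b = calloc(1, sizeof *b);
            if (!b)
                return -1;                             /* allocation failure (cleanup omitted in sketch) */
            b->lba = lba + off / 512;                  /* advance in 512-byte sectors */
            b->len = (len - off < BLOCK_SIZE) ? len - off : BLOCK_SIZE;
            memcpy(b->payload, buf + off, b->len);
            *link = b;                                 /* append to the tail of the list */
            link = &b->next;
        }
        q->head[q->tail++] = head;
        return 0;
    }

    int main(void) {
        static queue_cache q;
        static unsigned char buf[3 * BLOCK_SIZE];
        memset(buf, 0xAB, sizeof buf);
        return enqueue_write(&q, 0, buf, sizeof buf);  /* chains three 4K blocks */
    }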

[0032] After the storage device acquires the command, it reconstructs the data linked list based on the command length;

[0033] The command reconstruction includes: when the data addresses are contiguous, the logic circuit merges the small data blocks into the largest data block that can be supported and uses the largest-data-block mapping, so that command transmission efficiency can be multiplied; when the data addresses are discrete, or the command itself is discrete, a cache block mapping of the same size as the operating system data block is used;
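
The reconstruction rule in the previous paragraph can be sketched in C as follows: walk the command's linked list, merge address-contiguous blocks up to an assumed 128K maximum mapping unit, and fall back to the OS-sized (assumed 4K) mapping for discrete blocks. In the application this merge is performed by a logic circuit; the code below is only a software illustration of the rule, with assumed names and sizes.

    #include <stdio.h>

    #define OS_BLOCK   4096u      /* assumed operating-system data block size */
    #define MAX_EXTENT 131072u    /* assumed largest data block the device can map (128K) */

    typedef struct blk {
        unsigned long long lba;   /* starting logical address, in 512-byte sectors */
        unsigned len;             /* block length in bytes */
        struct blk *next;
    } blk;

    /* Walk the command's linked list and emit one mapping per extent:
     * contiguous blocks are merged up to MAX_EXTENT, discrete blocks stay OS-sized. */
    static void reconstruct(blk *b) {
        while (b) {
            unsigned long long start = b->lba;
            unsigned extent = b->len;
            while (b->next &&
                   b->next->lba == b->lba + b->len / 512 &&   /* address-contiguous? */
                   extent + b->next->len <= MAX_EXTENT) {      /* still fits the largest unit? */
                b = b->next;
                extent += b->len;
            }
            printf("map LBA %llu -> %u-byte %s mapping\n", start, extent,
                   extent > OS_BLOCK ? "large-block" : "OS-block");
            b = b->next;
        }
    }

    int main(void) {
        /* Two address-contiguous 4K blocks followed by one discrete 4K block. */
        blk c  = { 4096, 4096, NULL };
        blk b2 = { 8,    4096, &c  };
        blk b1 = { 0,    4096, &b2 };
        reconstruct(&b1);
        return 0;
    }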

[0034] A cache class flag is set in the command r...
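
The paragraph above is truncated, so the following is only a speculative C sketch of how a cache class flag carried in a command descriptor might select between a large-block cache pool and an OS-block-sized pool, in line with the classified cache management mentioned in the abstract. Every name and size here is an assumption, not the applicant's design.

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical cache classes: one for merged, address-contiguous data and one
     * for discrete, OS-block-sized data. */
    typedef enum { CACHE_CLASS_LARGE, CACHE_CLASS_SMALL } cache_class;

    typedef struct {
        unsigned long long lba;
        unsigned len;
        cache_class cls;          /* set during command reconstruction */
    } command_desc;

    /* Pick a buffer by cache class; malloc stands in for the device's per-class
     * cache pools in this sketch. Keeping same-sized buffers together is one way
     * classified cache management can limit memory fragmentation. */
    static void *alloc_from_class(const command_desc *cmd) {
        if (cmd->cls == CACHE_CLASS_LARGE)
            return malloc(131072);    /* assumed largest mapping unit (128K) */
        return malloc(4096);          /* assumed OS data block size (4K) */
    }

    int main(void) {
        command_desc merged   = { 0,    8192, CACHE_CLASS_LARGE };
        command_desc discrete = { 4096, 4096, CACHE_CLASS_SMALL };
        void *a = alloc_from_class(&merged);
        void *b = alloc_from_class(&discrete);
        printf("large-class buffer %p, small-class buffer %p\n", a, b);
        free(a);
        free(b);
        return 0;
    }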

Embodiment 2

[0044] An embodiment of the present disclosure discloses a storage device that adopts the hybrid cache storage method. The storage device stores a computer program; when the storage device performs data storage operations, this computer program is executed by a processor and implements the hybrid cache storage method of Embodiment 1. First, the data is divided into data blocks of a preset size; the data blocks are associated in the form of a data linked list and written into the queue cache. After the device receives a command interrupt, it starts to read and parse the command cache.

[0045] After the storage device acquires the command, it reconstructs the data linked list based on the command length. The command reconstruction includes: when the data addresses are contiguous, the logic circuit merges the small data blocks into the largest data block that can be supported and uses the largest-data-block mapping, which can double the command t...


PUM

No PUM

Abstract

A storage method for a hybrid cache belongs to the technical field of storage and comprises the following steps: segmenting data into data blocks of preset sizes; associating the data blocks in the form of a data linked list and writing them into a queue cache. Faced with the complex and irregular data commands of an operating system, the storage method can dynamically cache and store system write commands, so that both the IOPS and the sequential-write performance of the storage device are taken into account. Because the device caches are managed by class, the cache space can be utilized to the maximum extent and memory fragmentation can be minimized. Reconstructing and transforming contiguous small-data commands improves command efficiency, reduces the table-entry operations performed when the device CPU builds the FTL, and improves CPU utilization. Both the high concurrency of random writes and the large data throughput of sequential writes are accommodated, and the overall performance of the storage device can be remarkably improved.

Description

Technical field

[0001] The invention belongs to the technical field of storage, and in particular relates to a hybrid cache storage method and device.

Background technique

[0002] As the performance requirements for data transmission and storage continue to increase, higher requirements are placed on the randomness of small data and the throughput of large data.

[0003] At present, storage devices such as solid-state drives mainly use NAND Flash as the main storage medium. This medium must be erased before it can be written, and erasure is performed per Block, with Block sizes ranging from 4k * 128 to 16k * 2048 and other configurations. The medium also uses the Page as its minimum write unit, and the Page size likewise varies from 4k to 16k. However, common operating systems currently read and write with a 512-byte logical unit, so an FTL mapping layer from logical addresses to physical addresses is required.

[0004] Measuring th...
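
To illustrate the logical-to-physical mapping the background refers to, here is a minimal C sketch of an FTL-style lookup that converts a 512-byte logical block address into a mapping-unit index and reads a flat translation table. The 4K mapping unit and the array-based table are assumptions chosen for the example, not the applicant's design.

    #include <stdint.h>
    #include <stdio.h>

    #define SECTOR    512u     /* logical unit used by common operating systems */
    #define MAP_UNIT  4096u    /* assumed mapping granularity (one 4K NAND page) */
    #define NUM_UNITS 1024u    /* tiny table, just for the example */

    static uint32_t l2p[NUM_UNITS];   /* logical unit index -> physical page number */

    /* Translate a 512-byte LBA into a physical page plus an in-page offset. */
    static void translate(uint64_t lba, uint32_t *phys_page, uint32_t *offset) {
        uint64_t byte_addr = lba * SECTOR;
        uint32_t unit = (uint32_t)(byte_addr / MAP_UNIT);
        *phys_page = l2p[unit];
        *offset = (uint32_t)(byte_addr % MAP_UNIT);
    }

    int main(void) {
        for (uint32_t i = 0; i < NUM_UNITS; i++)
            l2p[i] = 1000 + i;                       /* fake physical placement */
        uint32_t page, off;
        translate(17, &page, &off);                  /* LBA 17 -> byte 8704 -> unit 2, offset 512 */
        printf("LBA 17 -> physical page %u, offset %u\n", (unsigned)page, (unsigned)off);
        return 0;
    }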

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/06, G06F9/50
CPC: G06F3/0604, G06F3/061, G06F3/0622, G06F3/064, G06F3/0679, G06F9/5027
Inventors: 杨柱, 高明扬, 谷卫青, 唐先芝, 王剑立, 郝晨, 吴浚, 潘文洁, 刘艺楠, 马铭振
Owner: 尧云科技(西安)有限公司