High-speed memory pre-read method and device

A cache read-ahead technology applied in the storage field, which solves the problems that fixed read-ahead is inflexible, that the data actually about to be read is seldom fetched in advance, and that data the host never reads is often prefetched, thereby achieving the effect of improving system read performance.

Active Publication Date: 2008-10-15
HUAWEI TECH CO LTD

AI Technical Summary

Problems solved by technology

[0006] The disadvantage of prior art 1 is that the fixed read-ahead technique is inflexible: the data that will actually be read is seldom fetched in advance, while data that the host never reads is often prefetched.
[0008] The multiple read-ahead technique described in prior art 2 has a certain flexibility, but it still cannot truly pre-read the data that will be read next.

Method used



Examples


Embodiment Construction

[0042] The core idea of the present invention is to provide a high-speed memory (cache) pre-reading method and device. Corresponding pre-reading information is preset for each of several data access modes; the pre-reading operation is performed according to the preset information, and the pre-read data size is adjusted dynamically, so as to improve the cache read hit rate and the efficiency of read-cache operations.
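
As a rough illustration of this idea only (the data structures, field names, and thresholds below are assumptions for the sketch, not the patent's definitions), a table of preset read-ahead parameters can be keyed by access mode, with the read-ahead size adjusted at run time from the observed cache read hit rate:

```c
/* Minimal sketch, assuming a per-access-mode table of preset read-ahead
 * parameters and a hit-rate-driven adjustment step. Illustrative only. */
#include <stdint.h>

typedef enum {
    ACCESS_SEQUENTIAL,      /* e.g. streaming or backup workloads */
    ACCESS_RANDOM,          /* e.g. small random database reads   */
    ACCESS_MODE_COUNT
} access_mode_t;

typedef struct {
    uint32_t prefetch_blocks;   /* current read-ahead size, in blocks */
    uint32_t min_blocks;        /* lower bound for dynamic adjustment */
    uint32_t max_blocks;        /* upper bound for dynamic adjustment */
} prefetch_info_t;

/* Preset pre-reading information for each supported data access mode. */
static prefetch_info_t prefetch_table[ACCESS_MODE_COUNT] = {
    [ACCESS_SEQUENTIAL] = { .prefetch_blocks = 64, .min_blocks = 16, .max_blocks = 256 },
    [ACCESS_RANDOM]     = { .prefetch_blocks = 4,  .min_blocks = 0,  .max_blocks = 16  },
};

/* Grow the read-ahead size while prefetched data is being used (high hit
 * rate); shrink it when prefetched data goes unread (low hit rate). */
static void adjust_prefetch(access_mode_t mode, double hit_rate)
{
    prefetch_info_t *info = &prefetch_table[mode];

    if (hit_rate > 0.90 && info->prefetch_blocks * 2 <= info->max_blocks)
        info->prefetch_blocks *= 2;
    else if (hit_rate < 0.50 && info->prefetch_blocks / 2 >= info->min_blocks)
        info->prefetch_blocks /= 2;
}
```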

[0043] The cache read-ahead operation means that, while the current read request is being served from the cache, the data that the next read command will need is pre-read into the cache in advance, so as to improve the read performance and read efficiency of the cache.
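
Building on the sketch above, the flow might look as follows; the helper functions cache_read and cache_prefetch are assumed storage-layer primitives introduced only for this illustration, not the patent's API:

```c
/* Assumed storage-layer primitives (declarations only, for illustration). */
void cache_read(uint64_t lba, uint32_t blocks);
void cache_prefetch(uint64_t lba, uint32_t blocks);

/* Serve the current read request, then stage the data the next sequential
 * read command is expected to need, using the preset read-ahead size. */
void handle_read(uint64_t lba, uint32_t blocks, access_mode_t mode)
{
    cache_read(lba, blocks);                  /* serve the current request        */

    uint32_t ahead = prefetch_table[mode].prefetch_blocks;
    if (ahead > 0)
        cache_prefetch(lba + blocks, ahead);  /* pre-read the next command's data */
}
```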

[0044] The present invention provides a high-speed memory pre-reading device. The module schematic diagram of one embodiment of the device is shown in Figure 2; it includes a pre-reading information storage module and a pre-reading module.
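
One hypothetical way to express this two-module decomposition in code is sketched below; it reuses the types from the earlier sketches, and all names are illustrative assumptions rather than the patent's definitions:

```c
/* Pre-reading information storage module: holds the preset parameters. */
typedef struct {
    prefetch_info_t presets[ACCESS_MODE_COUNT];
} prefetch_info_store_t;

/* Pre-reading module: consults the storage module and issues the prefetch. */
typedef struct {
    prefetch_info_store_t *store;
} prefetch_module_t;

static void module_prefetch(prefetch_module_t *m, access_mode_t mode,
                            uint64_t lba, uint32_t blocks)
{
    uint32_t ahead = m->store->presets[mode].prefetch_blocks;
    if (ahead > 0)
        cache_prefetch(lba + blocks, ahead);
}
```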

[0045] The pre-reading information ...



Abstract

This invention relates to a high-speed memory pre-read method and device in the storage field. The method comprises the following steps: during a read operation, the preset pre-read information is looked up according to the current business type, the pre-read data size is determined from it, and the data is pre-read accordingly. The invention obtains a suitable pre-read data size for each specific business type with a simple implementation, without extra computation or additional memory space for the pre-read data.

Description

Technical field

[0001] The invention relates to the field of storage technology, and in particular to a high-speed memory pre-reading method and device.

Background technique

[0002] With the rapid development of electronic technology, the performance of processors roughly doubles every 2.25 years. In contrast, although the capacity of the disk, the main online storage device of a computer system, doubles about every three years, its read and write speed has improved little. As a result, the mismatch between disk and processor speeds has become increasingly prominent.

[0003] Cache (high-speed memory) technology is an effective way to improve disk access speed. The location of the Cache in the storage subsystem is shown in Figure 1. A read of the Cache may result in a read hit or a read miss. On a read hit, the data is returned to the host directly from the cache; on a read miss, the disk must be read. The corr...
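
The hit/miss behaviour described in [0003] can be illustrated by the minimal sketch below; the helpers cache_lookup, disk_read, and cache_insert are assumed primitives introduced only for this example, not part of the patent:

```c
/* Minimal read-hit / read-miss sketch: on a hit the data is returned
 * straight from the cache, on a miss it is fetched from disk and then
 * kept in the cache for future hits. */
#include <stdbool.h>
#include <stdint.h>

bool cache_lookup(uint64_t lba, uint32_t blocks, void *buf);        /* assumed */
void disk_read(uint64_t lba, uint32_t blocks, void *buf);           /* assumed */
void cache_insert(uint64_t lba, uint32_t blocks, const void *buf);  /* assumed */

void read_blocks(uint64_t lba, uint32_t blocks, void *buf)
{
    if (cache_lookup(lba, blocks, buf))
        return;                        /* read hit: served from the cache */

    disk_read(lba, blocks, buf);       /* read miss: go to the disk       */
    cache_insert(lba, blocks, buf);    /* keep a copy for future reads    */
}
```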

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06F12/08G06F12/0862
Inventor 张粤熊建刚
Owner HUAWEI TECH CO LTD