
Method and apparatus for pre-reading in cache

A cache read-ahead technology, applied in the field of computer technology, addressing problems such as IO streams that cannot be pre-read and the resulting performance degradation

Inactive Publication Date: 2016-02-03
北海市云盛科技有限公司


Problems solved by technology

[0004] However, in streaming media applications, multiple files are read and written simultaneously on the same LV volume, producing multiple sequential IO streams for that volume. The IO stream obtained by merging these sequential streams is itself not sequential, so it cannot be pre-read; the data must instead be read from disk, reducing performance.
[0005] In addition, streaming media applications use non-sequential playback operations such as fast-forward playback. The IO streams these operations issue are regular but not conventionally sequential, so pre-reading again cannot be performed.
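The problem in [0004] can be made concrete with a small illustration (not from the patent): two per-file sequential read streams, interleaved at the LV-volume level, no longer look sequential to a volume-level read-ahead detector. The block size and offsets below are illustrative assumptions.

```python
# Two files read sequentially in 8-block steps (hypothetical offsets).
stream_a = [0, 8, 16, 24]        # file A: sequential reads
stream_b = [1000, 1008, 1016]    # file B: sequential reads

# Round-robin interleaving, as a streaming-media server might issue them
# against the same LV volume.
combined = [off for pair in zip(stream_a, stream_b) for off in pair]
print(combined)  # [0, 1000, 8, 1008, 16, 1016]

def is_sequential(offsets, block=8):
    """Naive LV-level check: every offset follows the previous by one block."""
    return all(b - a == block for a, b in zip(offsets, offsets[1:]))

print(is_sequential(stream_a))   # True  -> each stream alone is sequential
print(is_sequential(combined))   # False -> merged stream defeats read-ahead
```

Each stream is sequential on its own, but the merged stream is not, which is why volume-level read-ahead is disabled exactly when streaming media would benefit from it most.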




Detailed Description of the Embodiments

[0037] To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.

[0038] In the embodiment of the present invention, when the first IO stream and the second IO stream are continuous, they form a continuous IO stream, and sequential pre-reading is performed according to the remaining capacity of the CACHE block. When the first IO stream and the second IO stream are not continuous but exhibit a regularity between them, they form a fourth IO stream, and sequential pre-reading is performed according to that regularity and the remaining capacity of the CACHE block. In the cache, multiple IO streams are thus combined into sequential IO streams wherever possible, so that pre-reading can be implemented in streaming media applications and its efficiency improved.
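The decision logic of [0038] can be sketched as follows. This is a hedged illustration only: the `Stream` class, the single-stride regularity check, and the pre-read sizing are assumptions, not the patent's exact data structures or claims.

```python
from dataclasses import dataclass, field

@dataclass
class Stream:
    """Tracks the offsets of one IO stream observed by the CACHE."""
    offsets: list = field(default_factory=list)

    def stride(self):
        """Return the constant gap between offsets, or None if irregular."""
        diffs = {b - a for a, b in zip(self.offsets, self.offsets[1:])}
        return diffs.pop() if len(diffs) == 1 else None

def handle_io(stream, offset, length, remaining_capacity):
    """Merge the new (first) IO into the tracked (second) IO stream and
    return how many blocks to pre-read (0 means no pre-read)."""
    last = stream.offsets[-1] if stream.offsets else None
    stream.offsets.append(offset)
    if last is not None and offset == last + length:
        # Continuous: forms the "third" IO stream; sequential pre-read is
        # bounded by the remaining capacity of the CACHE block.
        return min(len(stream.offsets) * length, remaining_capacity)
    if stream.stride() is not None:
        # Regular but non-contiguous: forms the "fourth" IO stream;
        # pre-read the next expected strided block within capacity.
        return min(length, remaining_capacity)
    return 0  # irregular stream: fall back to reading from disk

cache_stream = Stream()
for off in (0, 8, 16):
    n = handle_io(cache_stream, off, 8, remaining_capacity=64)
print(n)  # 24: three continuous 8-block IOs, within the 64-block capacity
```

The key design point, mirroring the embodiment, is that read-ahead is not abandoned when an IO is non-contiguous: a regular stride still triggers a (smaller) pre-read, and only truly irregular streams fall through to the disk path.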

[0039] Refer to figure 1, which is a s...



Abstract

The invention discloses a method for pre-reading in a cache. In the method, when a first IO stream received by the CACHE can form a continuous sequential IO stream (a third IO stream) with an existing sequential IO stream (a second IO stream), the first IO stream is added to the second IO stream, and sequential pre-reading is then performed according to the length of the third IO stream and the remaining capacity of the CACHE block. When the first IO stream and the second IO stream cannot form a continuous sequential IO stream, the regularity between them is determined, the first IO stream is added to the second IO stream to form a fourth IO stream, and sequential pre-reading is then performed according to that regularity and the remaining capacity of the CACHE block. The invention further discloses an apparatus for pre-reading in the cache. By applying embodiments of the invention, pre-reading can be realized in streaming media applications, thereby improving pre-reading efficiency.

Description

Technical Field

[0001] The present application relates to the field of computer technology, and more specifically, to a method and apparatus for pre-reading in a cache.

Background

[0002] Read-ahead means that the file system reads more file content at a time than the application has requested and caches it in the cache (CACHE), so that the next read request is served directly from the CACHE. This detail is transparent to the application; the only effect the application observes is that subsequent reads are faster.

[0003] At present, cache pre-reading is generally based on logical volumes (LV). When the IO stream on an LV volume is sequential, pre-reading is performed. In this way, when a front-end business IO command arrives, there is no need to access the disk; the data is fetched directly from the cache and returned, so performance is greatly improved.

[0004] However, in streaming media applications, multiple files...
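The baseline read-ahead behavior described in [0002] can be sketched in a few lines. The in-memory `DISK` dict, the `CACHE` dict, and the 4-block read-ahead window are illustrative assumptions, not details from the patent.

```python
DISK = {i: f"block{i}" for i in range(64)}  # stand-in for disk blocks
CACHE = {}
READ_AHEAD = 4  # extra blocks fetched beyond each requested block

def read(block):
    """Serve a block from CACHE if present; on a miss, fetch the block
    plus READ_AHEAD successors from disk into the CACHE."""
    if block in CACHE:
        return CACHE[block], "cache hit"
    for b in range(block, min(block + 1 + READ_AHEAD, 64)):
        CACHE[b] = DISK[b]
    return CACHE[block], "disk read + read-ahead"

print(read(0))   # ('block0', 'disk read + read-ahead')
print(read(1))   # ('block1', 'cache hit') -- served by the read-ahead
```

As [0002] notes, this is transparent to the caller: both calls return the requested data, and only the latency differs.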

Claims


Application Information

IPC(8): G06F12/0862
Inventor 庄建波
Owner 北海市云盛科技有限公司