Self-adaptive pre-reading method based on file system buffer

A file system pre-reading technique, applicable to memory systems, special-purpose data processing applications, memory address/allocation/relocation, etc. It requires only a small amount of historical information, reduces unnecessary reads, and uses simple calculation logic.

Publication Date: 2008-07-30 (status: Inactive)
ZTE CORP

AI Technical Summary

Problems solved by technology

This kind of prefetching requires no complicated algorithm design and is very efficient when reading contiguous files; however, when files are fragmented, it may pre-read a large amount of useless data.




Detailed Description of the Embodiments

[0025] The implementation of the present invention will be described in detail below in conjunction with the accompanying drawings and preferred embodiments.

[0026] The method of the present invention adaptively adjusts the pre-read data size based on the file system cache. It includes the following basic processing steps (a code sketch follows the list):

[0027] Create a pre-read sector usage record;

[0028] Each time a pre-read operation is performed, record the number of pre-read sectors in the pre-read sector usage record;

[0029] When pre-read sectors are used, record the number of pre-read sectors used in the pre-read sector usage record;

[0030] Calculate the usage rate of the previous pre-read from the number of pre-read sectors and the number of pre-read sectors used, and determine the number of sectors for the current pre-read operation according to that usage rate.
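The following C sketch illustrates one way the steps above could be realized. The two-field record mirrors the structure introduced in paragraph [0031] below; the 25%/75% thresholds, the halve/double adjustment policy, and all identifiers are illustrative assumptions, not parameters taken from the patent text.

/*
 * Minimal sketch of the adaptive pre-read bookkeeping described in the
 * steps above. The two-field record mirrors paragraph [0031]; the
 * 25%/75% thresholds and the halve/double policy are assumptions.
 */
#include <stdio.h>
#include <stddef.h>

struct preread_record {
    size_t sectors_preread;   /* sectors fetched in the previous pre-read  */
    size_t sectors_used;      /* how many of them later served user reads  */
};

/* Step 2: record the size of each pre-read operation. */
static void record_preread(struct preread_record *rec, size_t n_sectors)
{
    rec->sectors_preread = n_sectors;
    rec->sectors_used = 0;
}

/* Step 3: record every pre-read sector that is actually consumed. */
static void record_use(struct preread_record *rec, size_t n_sectors)
{
    rec->sectors_used += n_sectors;
}

/* Step 4: derive the next pre-read size from the previous usage rate. */
static size_t next_preread_size(const struct preread_record *rec,
                                size_t min_sectors, size_t max_sectors)
{
    if (rec->sectors_preread == 0)
        return min_sectors;                    /* nothing recorded yet */

    size_t rate = rec->sectors_used * 100 / rec->sectors_preread;
    size_t next = rec->sectors_preread;

    if (rate < 25)
        next /= 2;                             /* mostly wasted: shrink */
    else if (rate > 75)
        next *= 2;                             /* mostly consumed: grow */

    if (next < min_sectors) next = min_sectors;
    if (next > max_sectors) next = max_sectors;
    return next;
}

int main(void)
{
    struct preread_record rec = {0, 0};

    record_preread(&rec, 64);                  /* previous pre-read: 64 sectors */
    record_use(&rec, 60);                      /* 60 of them were actually used */

    /* usage rate 93% > 75%, so the window grows from 64 to 128 sectors */
    printf("next pre-read: %zu sectors\n", next_preread_size(&rec, 8, 256));
    return 0;
}

Compiled and run, this example prints "next pre-read: 128 sectors", because 60 of the 64 previously pre-read sectors (a 93% usage rate) were consumed.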

[0031] The data structure of the pre-read sector usage record of the present invention includes at least two fields...



Abstract

The invention discloses a self-adaptive pre-read method based on file system caching. The method comprises the following steps: first, a pre-read sector usage record is established; second, each time a pre-read operation is executed, the number of pre-read sectors is recorded in the pre-read sector usage record; third, when pre-read sectors are used, the number of used pre-read sectors is recorded in the pre-read sector usage record; fourth, the usage rate of the previous pre-read is calculated from the number of pre-read sectors and the number of used pre-read sectors, and the number of sectors for the current pre-read operation is determined according to that usage rate.

Description

Technical field

[0001] The invention relates to a method for reading from and writing to a disk, and in particular to a pre-reading method based on the file system cache during read and write operations.

Background technique

[0002] Disk drive access is generally much slower than memory access, for both read and write operations. In the plain device-access model, when the user needs to perform a read or write, the data is read from or written to the disk directly. This mode of disk operation is very slow, so it easily blocks the application program; at the same time, because head movement is not optimized, the head has to move frequently, and efficiency is also very low. The file system caching layer is designed to solve this problem. Its basic idea is that a data buffer is added between the upper-layer user's disk operations and the actual device operations, and this buffer is used to optimize disk operation performance. There are three main functions of the file system...
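As a rough, self-contained illustration of this cache layer combined with fixed-size read-ahead (the baseline behaviour that the adaptive method refines), the following C sketch serves reads from an in-memory cache and, on a miss, pulls in a whole fixed window of sectors from a simulated disk. The sector size, window size, and array-backed "disk" are all assumptions made for the example, not details from the patent.

/*
 * Toy simulation of a file-system cache layer with fixed read-ahead.
 * Everything here (sector counts, the array-backed "disk") is an
 * illustrative assumption.
 */
#include <stdio.h>
#include <string.h>

#define N_SECTORS     1024
#define SECTOR_SIZE   16
#define READAHEAD_SZ  8               /* fixed pre-read window, in sectors */

static char disk[N_SECTORS][SECTOR_SIZE];     /* stands in for the slow device */
static char cache[N_SECTORS][SECTOR_SIZE];    /* buffer between user and disk  */
static int  cached[N_SECTORS];                /* 1 if the sector is cached     */
static int  disk_accesses;                    /* how often we hit the device   */

/* Read one sector; on a miss, pre-read a whole fixed window from "disk". */
static void cached_read(unsigned sector, char *out)
{
    if (!cached[sector]) {
        disk_accesses++;                      /* one device request per miss   */
        for (unsigned i = sector; i < sector + READAHEAD_SZ && i < N_SECTORS; i++) {
            memcpy(cache[i], disk[i], SECTOR_SIZE);
            cached[i] = 1;
        }
    }
    memcpy(out, cache[sector], SECTOR_SIZE);
}

int main(void)
{
    char buf[SECTOR_SIZE];
    for (unsigned s = 0; s < 64; s++)         /* sequential read of 64 sectors */
        cached_read(s, buf);
    printf("device requests: %d\n", disk_accesses);   /* prints 8, not 64 */
    return 0;
}

For the sequential read of 64 sectors in main, only 8 device requests are issued instead of 64; conversely, with a highly fragmented access pattern most of each pre-read window would go unused, which is exactly the problem the adaptive pre-read size is meant to address.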


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30; G06F12/08; G06F12/0862
Inventors: 黄文伟, 童小九, 陆小飞, 周立超
Owner: ZTE CORP