
Device and method for asynchronous pre-reading of small files in parallel network file system

A network file system and small-file technology, applied in instruments, computing, and electrical and digital data processing. It addresses the rigidity of existing prefetch triggering, reduces waiting latency, improves the cache hit rate, and has a wide range of applications.

Inactive Publication Date: 2018-10-09
INST OF COMPUTING TECH CHINESE ACAD OF SCI +1

AI Technical Summary

Problems solved by technology

[0007] Asynchronous pre-reading is an intuitive way to solve this problem, but existing asynchronous pre-reading techniques mainly adopt a "single-point drive" method, in which the trigger point is fixed at a single data page. The advantage of this is that it is simple and easy to implement; the disadvantage is that it is too rigid. If the trigger page is never accessed, prefetching is not triggered even when the access pattern has good locality; conversely, an access that happens to touch the trigger page triggers prefetching even when locality is poor.
[0008] The "hit rate-driven read-ahead" of the prior art is easily confused with the "usage-driven read-ahead" of the present invention. As shown in Figure 3, there is a clear difference between the hit rate and the usage rate: the hit rate is the ratio of the number of cache hits to the number of accesses.
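To make the contrast concrete, here is a minimal sketch of the two metrics. The hit-rate formula follows the text (cache hits divided by total accesses); the usage-rate formula (prefetched pages actually consumed divided by pages prefetched) is an assumption, since its exact definition is truncated in this excerpt.

```python
def hit_rate(cache_hits: int, total_accesses: int) -> float:
    """Fraction of accesses served from the page cache (as defined in [0008])."""
    return cache_hits / total_accesses if total_accesses else 0.0


def usage_rate(pages_used: int, pages_prefetched: int) -> float:
    """Fraction of prefetched pages actually consumed (assumed definition)."""
    return pages_used / pages_prefetched if pages_prefetched else 0.0


# A high hit rate can coexist with a low usage rate, which is why the two
# metrics can drive prefetching differently:
print(hit_rate(80, 100))   # 0.8
print(usage_rate(20, 64))  # 0.3125
```

A hit-rate-driven policy would keep prefetching aggressively here, while a usage-driven policy would notice that most prefetched pages went unused.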

Method used




Specific Embodiment

[0064] Specific embodiments of the present invention are as follows:

[0065] The device of the present invention includes: a client module, a server module, and a data storage module;

[0066] The client module sends an operation request to the server and receives the client extended read-only directory authorization granted by the server. After obtaining this authorization, it inserts the file directory at the end of the corresponding directory list and places the pages associated with the file directory into the page cache. The client reads the file directory's data from the page cache; if the data misses in the page cache, it checks whether a corresponding anonymous page exists in the anonymous page linked list. On a hit, the anonymous page is returned to the upper-layer application; on a miss, a synchronous disk read operation is initiated, wherein the remaining anony...
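The read path in [0066] can be sketched as follows. The lookup order (page cache, then anonymous page list, then synchronous disk read, with asynchronous read-ahead triggered by the remaining-page ratio) follows the text; all names, the pages-per-read value, and the threshold are illustrative assumptions.

```python
from collections import deque

PAGES_PER_DISK_READ = 32   # pages returned by one disk read (assumed value)
THRESHOLD = 0.25           # assumed trigger ratio; the patent leaves it configurable

page_cache: dict[str, bytes] = {}
anonymous_pages: deque[tuple[str, bytes]] = deque()


def sync_disk_read(key: str) -> bytes:
    """Stand-in for a blocking read from the storage device."""
    data = f"disk:{key}".encode()
    page_cache[key] = data
    return data


def async_readahead() -> None:
    """Stand-in for a background read that refills the anonymous page list."""
    for i in range(PAGES_PER_DISK_READ):
        anonymous_pages.append((f"prefetch-{i}", b"prefetched"))


def maybe_trigger_readahead() -> None:
    # Trigger when remaining anonymous pages / pages-per-disk-read < threshold.
    if len(anonymous_pages) / PAGES_PER_DISK_READ < THRESHOLD:
        async_readahead()


def read_page(key: str) -> bytes:
    """Lookup order from [0066]: page cache -> anonymous pages -> sync disk read."""
    if key in page_cache:                 # page-cache hit
        return page_cache[key]
    for k, data in anonymous_pages:       # cache miss: search anonymous page list
        if k == key:
            maybe_trigger_readahead()
            return data
    return sync_disk_read(key)            # miss everywhere: synchronous disk read
```

For example, with a single page left in the anonymous list, serving it drops the remaining-page ratio to 1/32, well below the 0.25 threshold, so the next batch is prefetched in the background before the list runs dry.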



Abstract

The invention relates to parallel network file systems, in particular to a device and method for asynchronous pre-reading of small files in a parallel network file system. The device comprises a client module and a server module. The client module receives the client extended read-only directory authorization granted by the server and places the pages associated with a file directory into the page cache. The client reads the file directory's data from the page cache; if the data misses in the page cache, the client checks whether corresponding anonymous pages exist in an anonymous page linked list. If they exist, the anonymous pages are returned to the upper-layer application; otherwise a synchronous disk read operation is initiated. When the ratio of the number of remaining pages in the anonymous page linked list to the number of pages read from disk in one access falls below a set threshold, an asynchronous pre-reading operation on the anonymous page linked list is triggered. The server module obtains operation requests and grants the client extended read-only directory authorization to the client module.
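The abstract's trigger condition can be isolated as a single predicate. The threshold is a configurable parameter in the patent; the values below are only examples.

```python
def should_trigger_readahead(remaining_anon_pages: int,
                             pages_per_disk_read: int,
                             threshold: float) -> bool:
    """True when remaining anonymous pages / pages per disk read < threshold."""
    return remaining_anon_pages / pages_per_disk_read < threshold


print(should_trigger_readahead(4, 32, 0.25))   # True:  4/32 = 0.125 < 0.25
print(should_trigger_readahead(16, 32, 0.25))  # False: 16/32 = 0.5
```

The point of normalizing by the per-read page count is that the trigger adapts to the disk's transfer granularity rather than firing at a fixed absolute page count.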

Description

Technical Field [0001] The invention relates to a data pre-reading mechanism in a parallel network file system, in particular to a device and method for asynchronous pre-reading of small files in a parallel network file system. Background [0002] With the advent of the big-data era, the volume of global data has grown rapidly, and small files are increasingly common in fields such as e-commerce, social networking, and scientific computing. Taobao, the flagship e-commerce platform of Alibaba Group (which staged the largest IPO on the New York Stock Exchange), stores more than 20 billion images with an average size of only 15 KB. In a sense, there is no purely large-file application at present. [0003] Parallel Network File System (pNFS) adopts a separation of metadata and data services; the client accesses the storage devices directly through an out-of-band access mode. It has excellent perform...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F17/30
Inventors: 杨洪章, 张军伟, 曾祥超, 韩晓明
Owner: INST OF COMPUTING TECH CHINESE ACAD OF SCI