System and method of busy-wait after pre-reading small file in parallel network file system

A network file system and small-file technology, applied in the file system field, achieving the effect of improved read access performance

Inactive Publication Date: 2015-11-25
INST OF COMPUTING TECH CHINESE ACAD OF SCI +1


Problems solved by technology

[0006] In view of actual needs and the lack of current related research, it is necessary to propose a system and method of busy-waiting after pre-reading data between files in a parallel network file system, so as to effectively hide the waiting delay of locked pages and fundamentally overcome the "idle-and-wait" drawback of the prior technique, which will have a very significant positive impact on system performance.




Embodiment Construction

[0044] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments, but not as a limitation of the present invention.

[0045] Figure 2 shows a flow chart of the busy-wait read method of the present invention.

[0046] No matter how fast the storage medium is, its access delay still exists. During highly concurrent access to a large number of small files, the situation "the page has been requested but the result has not yet been returned" arises. As shown in Figure 2, the traditional solution is the idle strategy, that is, waiting out the disk read delay. This is clearly the wrong strategy for the scenario where subsequent pages are already in the cache. The present invention adopts a busy-wait strategy: when an application hits a locked page in the cache, it first marks the page as "not accessed", then skips the page and accesses the next one; only after all pages have been accessed does it return to the pages marked "not accessed".
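The skip-and-revisit pass described above can be sketched in Python. This is a minimal, illustrative simulation under assumed names (`ToyCache`, `read_all_pages` are not from the patent): a page "unlocks" after it has been polled once, mimicking a read-ahead request completing in the background while the client works on other pages.

```python
class ToyCache:
    """Toy page cache: each listed page starts locked and unlocks after
    it has been polled once, simulating background read-ahead completing."""

    def __init__(self, locked_pages):
        self._polls = {p: 0 for p in locked_pages}

    def is_locked(self, pid):
        if pid not in self._polls:
            return False
        self._polls[pid] += 1
        return self._polls[pid] <= 1  # locked only on the first poll


def read_all_pages(cache, page_ids, read_page):
    """Busy-wait strategy: on each pass, skip locked pages (mark them
    'not accessed') and read the rest; retry the skipped pages until done."""
    not_accessed = list(page_ids)
    results = {}
    while not_accessed:
        still_locked = []
        for pid in not_accessed:
            if cache.is_locked(pid):
                still_locked.append(pid)  # skip now, revisit later
            else:
                results[pid] = read_page(pid)
        not_accessed = still_locked
    return results


# Pages 2 and 4 are initially locked; they are read only on the second pass.
cache = ToyCache({2, 4})
order = []
results = read_all_pages(cache, [1, 2, 3, 4, 5],
                         lambda p: order.append(p) or f"data{p}")
```

With this toy unlock model the access order comes out as 1, 3, 5, then 2 and 4 on the revisit pass; the point of the sketch is only the control flow, not the locking mechanics.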



Abstract

The present invention discloses a system and a method of busy-wait after pre-reading a small file in a parallel network file system. The system comprises: a client, configured to determine the data page numbers of a currently accessed file according to its file layout and to access the data pages of that file on disk; and a server, configured to obtain the file layout of the currently accessed file and send it to the client. When an upper-layer application finds a locked current data page in the cache, the client marks that page as "not accessed", skips it and accesses the next data page, and accesses the skipped page again after all data pages have been accessed.
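The client/server split in the abstract can be illustrated with a small sketch. All names here (`LayoutServer`, `pages_for`, `PAGE_SIZE`) are assumptions for illustration; a real pNFS layout describes striping across data servers and is far richer than a file size.

```python
PAGE_SIZE = 4096  # assumed page size for the sketch

class LayoutServer:
    """Server side: hands out the file layout for a requested file."""

    def __init__(self, file_sizes):
        self._sizes = file_sizes  # filename -> size in bytes

    def get_layout(self, name):
        # Stand-in for a layout reply; real layouts carry stripe info.
        return {"name": name, "size": self._sizes[name]}


def pages_for(layout):
    """Client side: derive the data page numbers implied by the layout."""
    size = layout["size"]
    return list(range((size + PAGE_SIZE - 1) // PAGE_SIZE))


server = LayoutServer({"small.txt": 17_450})  # ~17KB, a typical small file
layout = server.get_layout("small.txt")
pages = pages_for(layout)  # five 4KB pages cover 17,450 bytes
```

The client can then walk `pages` with the busy-wait pass from paragraph [0046], skipping any page still locked by read-ahead.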

Description

Technical field

[0001] The invention relates to a cache hit mechanism after pre-reading data between files in a parallel network file system, and in particular to a system and method of busy-wait after pre-reading small files in a parallel network file system.

Background technique

[0002] With the advent of the era of big data, the amount of global data is increasing rapidly, and small files are becoming ever more numerous in fields such as e-commerce, social networking, and scientific computing. Taobao, the important e-commerce trading platform of Alibaba Group, which staged the largest IPO on the New York Stock Exchange, had stored about 28.6 billion pictures by 2010, with an average size of only 17.45KB; pictures below 8KB accounted for 61% of the total. In a sense, there is currently no purely large-file application; humanity is gradually entering the "era of massive small files".

[0003] Parallel Network File System (pNFS) adopts the separati...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC IPC(8): G06F17/30
CPCG06F16/172
Inventor 杨洪章, 张军伟, 何文婷, 张建刚
Owner INST OF COMPUTING TECH CHINESE ACAD OF SCI