Persistent cache layer in a distributed file system

Active Publication Date: 2018-12-04
EMC IP HLDG CO LLC
AI Technical Summary

Problems solved by technology

When a file system operation targeted to an inode is being processed, the inode itself can be locked, blocking other file system processes that need access to the same inode.
In addition, the size of an inode can be limited, such that when metadata relating to the file the ino




Embodiment Construction

[0022]The innovation is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of this innovation. It may be evident, however, that the innovation can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the innovation.

[0023]As used herein, the term “node” refers to a physical computing device, including, but not limited to, network devices, servers, processors, cloud architectures, or the like. In at least one of the various embodiments, nodes may be arranged in a cluster interconnected by a high-bandwidth, low latency network backplane. In at least one of the various embodiments, non-resident clients may communicate to the nodes in a cluster through high-latenc...



Abstract

Implementations are provided herein for having at least two data streams associated with each file in a file system. The first, a cache overlay layer, can store additional state information on a per block basis that details whether each individual block of file data within the cache overlay layer is clean, dirty, or indicates that a write back to the storage layer is in progress. The second, a storage layer, can be a use case defined repository that can transform data using data augmentation methods or store unmodified raw data in local storage. File system operations directed to the cache overlay layer can be processed asynchronously from file system operations directed to the storage layer.
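The per-block state machine described in the abstract can be illustrated with a minimal sketch. The names (`CacheOverlay`, `BlockState`, `block_size`) and the dictionary-based layout are illustrative assumptions, not the patent's actual implementation; the sketch only shows the three block states and how a write, a write-back start, and a write-back completion would move a block between them:

```python
from enum import Enum, auto

class BlockState(Enum):
    CLEAN = auto()                  # cache block matches the storage layer
    DIRTY = auto()                  # modified in the overlay, not yet written back
    WRITEBACK_IN_PROGRESS = auto()  # write back to the storage layer underway

class CacheOverlay:
    """Per-file cache overlay tracking the state of each fixed-size block."""

    def __init__(self, block_size=8192):
        self.block_size = block_size
        self.blocks = {}  # block index -> bytes held in the overlay
        self.state = {}   # block index -> BlockState

    def write(self, index, data):
        # Writes land in the overlay and mark the block dirty; the
        # storage layer is updated asynchronously, as the abstract notes.
        self.blocks[index] = data
        self.state[index] = BlockState.DIRTY

    def begin_writeback(self, index):
        # Transition DIRTY -> WRITEBACK_IN_PROGRESS and hand the data
        # to whatever flushes it to the storage layer.
        if self.state.get(index) is BlockState.DIRTY:
            self.state[index] = BlockState.WRITEBACK_IN_PROGRESS
            return self.blocks[index]
        return None

    def complete_writeback(self, index):
        # Only mark the block clean if no new write raced in and
        # re-dirtied it while the write back was in flight.
        if self.state.get(index) is BlockState.WRITEBACK_IN_PROGRESS:
            self.state[index] = BlockState.CLEAN
```

Because the overlay records write-back progress per block, a concurrent write during a flush simply flips that block back to dirty, which is what allows overlay operations to proceed asynchronously from storage-layer operations.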

Description

CROSS REFERENCE TO RELATED APPLICATIONS[0001]This application is related to co-pending U.S. patent application Ser. No. 14/501,881 for DATA AND METADATA STRUCTURES FOR USE IN TIERING DATA TO CLOUD STORAGE, which is incorporated herein by reference for all purposes, and to co-pending U.S. patent application Ser. No. 14/501,806 for GARBAGE COLLECTION OF DATA TIERED TO CLOUD STORAGE, which is incorporated herein by reference for all purposes, and to co-pending U.S. patent application Ser. No. 14/501,958 for TIERING DATA TO CLOUD STORAGE USING CLOUDPOOLS AND POLICIES, which is incorporated herein by reference for all purposes, and to co-pending U.S. patent application Ser. No. 14/501,756 for WRITING BACK DATA TO FILES TIERED IN CLOUD STORAGE, which is incorporated herein by reference for all purposes, and to co-pending U.S. patent application Ser. No. 14/501,928 for USING A LOCAL CACHE TO STORE, ACCESS AND MODIFY FILES TIERED TO CLOUD STORAGE, which is incorporated herein by reference f...

Claims


Application Information

IPC(8): G06F12/08, G06F17/30, G06F12/0893, G06F12/0804
CPC: G06F12/0893, G06F12/0804, G06F17/30091, G06F17/30132, G06F17/30194, G06F2212/1024, G06F2212/163, G06F2212/608, G06F16/13, G06F16/172, G06F16/182
Inventors: LAIER, MAX; POPOVICH, EVGENY; KIM, HWANJU
Owner EMC IP HLDG CO LLC