A transparent edge-of-network data cache

A high-speed caching and edge-data technology, applicable to data processing applications, electrical digital data processing, and other special-purpose data processing. It addresses problems such as complex consistency management, high maintenance cost, and invalidation of cached query responses.

Active Publication Date: 2005-12-14
IBM CORP
Cites: 0 · Cited by: 27

AI Technical Summary

Problems solved by technology

Query response caching eliminates administrator overhead, but suffers from limited availability and high space overhead
Also, consistency management is complicated by a mismatch between the representation of the cached data and the underlying data in the origin server
Consistency control typically requires either invalidating all query responses when any base table changes, or maintaining complex dependency graphs
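To make the first, coarse-grained option concrete, here is a minimal sketch (not from the patent; the `ResponseCache` class and its methods are hypothetical) of table-level invalidation, where any write to a base table discards every cached response that read from it:

```python
# Coarse invalidation: a write to a base table flushes every cached
# response that read from it. Simple to maintain, but one UPDATE to a
# hot table can empty most of the cache.
from collections import defaultdict

class ResponseCache:
    def __init__(self):
        self._responses = {}                  # query text -> result rows
        self._by_table = defaultdict(set)     # base table -> dependent queries

    def put(self, query, tables_read, rows):
        self._responses[query] = rows
        for t in tables_read:
            self._by_table[t].add(query)

    def get(self, query):
        return self._responses.get(query)

    def on_table_update(self, table):
        # Drop every response that touched `table`, hit or not.
        for q in self._by_table.pop(table, set()):
            self._responses.pop(q, None)

cache = ResponseCache()
cache.put("SELECT * FROM items WHERE price < 10", ["items"], [("pen", 2)])
cache.on_table_update("items")
print(cache.get("SELECT * FROM items WHERE price < 10"))  # None
```

The alternative, a dependency graph from each response to the exact rows it read, avoids the over-invalidation shown here at the cost of the maintenance complexity the passage describes.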

Method used




Embodiment Construction

[0030] Embodiments of the present disclosure provide a dynamic database cache maintained by a local machine that issues database queries to remote servers. The cache uses the local database engine to maintain a partial but semantically consistent "materialized view" of previous query results, and is populated dynamically from the application's query stream. Containment checkers operating on query predicates determine whether the result of a new query is contained in the union of cached results. Ordinary local tables are used so that overlapping query results can share physical storage. Data consistency is maintained by propagating inserts, deletes, and updates from the source databases to their cached local copies. A background purge algorithm continuously or periodically trims the local tables, evicting extraneous rows propagated by the consistency protocol as well as rows belonging to queries that have been marked for eviction from the cache.
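The containment check above can be illustrated with a deliberately simplified sketch (not the patent's actual checker, which handles general query predicates): for single-column range predicates, a cached result answers a new query exactly when the cached range covers the new one.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RangePredicate:
    """A conjunctive range predicate on one column: lo <= column <= hi."""
    column: str
    lo: float
    hi: float

def contains(cached: RangePredicate, new: RangePredicate) -> bool:
    """True if every row satisfying `new` also satisfies `cached`,
    i.e. the cached result is a superset of the new query's result."""
    return (cached.column == new.column
            and cached.lo <= new.lo
            and new.hi <= cached.hi)

# Cache hit: the cached range [0, 100] covers the new range [10, 50].
cached_q = RangePredicate("price", 0, 100)
new_q = RangePredicate("price", 10, 50)
print(contains(cached_q, new_q))   # True  -> answer from the local cache
print(contains(new_q, cached_q))   # False -> must contact the origin server
```

A real checker must also decide containment for conjunctions, disjunctions, and joins, where the problem quickly becomes harder; the sketch only conveys the subset relation being tested.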



Abstract

The present invention provides a system (100), apparatus (200), and method (300) for dynamically caching data according to queries issued by a local application. The system includes a remote server (108), an edge server (109), and a local database (104) on the edge server. The apparatus includes an edge data cache (202) comprising a query evaluator (204), a cache index (206), a cache repository (208), a resource manager (209), a containment checker (207), a query parser (205), and a consistency manager (210), all of which are in signal communication with the query evaluator. The method dynamically caches the results of previous database queries to the remote server (412), associates the local database with the local server, stores the multiple cached results in a shared table (503) of the local database, and uses the cached results to satisfy new database queries to the local server (410).
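The shared table (503) lets overlapping query results reuse storage. As a rough illustration (not the patent's implementation), keying the shared table by the rows' primary key means a row fetched by two different queries is stored once:

```python
import sqlite3

# Two overlapping query results land in one shared local table; rows are
# keyed by primary key, so a row common to both results is stored once.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items_cache (id INTEGER PRIMARY KEY, name TEXT, price REAL)")

def cache_result(rows):
    # INSERT OR IGNORE skips rows whose primary key is already cached.
    conn.executemany(
        "INSERT OR IGNORE INTO items_cache (id, name, price) VALUES (?, ?, ?)",
        rows)

cache_result([(1, "pen", 2.0), (2, "ink", 8.0)])   # result of query A
cache_result([(2, "ink", 8.0), (3, "pad", 4.0)])   # result of query B, overlaps A
n = conn.execute("SELECT COUNT(*) FROM items_cache").fetchone()[0]
print(n)  # 3 -- the overlapping row (id 2) is stored only once
```

The table name `items_cache` and the schema are hypothetical; the point is only the deduplication that a shared, keyed table provides compared with caching each query's result set separately.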

Description

[0001] Cross-Reference to Related Applications

[0002] This application claims priority to U.S. Patent Application 10/328,229 (Attorney Docket YOR920020341US1/8728-600), the disclosure of which is hereby incorporated by reference in its entirety.

Technical Field

[0003] This disclosure relates to database caching over a distributed network.

Background

[0004] The proliferation of distributed web applications has increased the frequency of application queries to remote database servers. To improve the performance of such queries and enhance data availability, such applications may use a local database cache. For example, edge servers in a content distribution network can use a nearby database cache to speed up data access and generate dynamic web content more quickly at the edge of the network.

[0005] Typical techniques for caching data on edge servers rely on: (i) explicit replication of the entire database, or explicitly specified portions of it, on the local machine; or (ii)...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F7/00; G06F15/16; G06F17/30
CPC: G06F16/24552; G06F16/24539; G06F15/16
Inventor: Khalil S. Amiri, Sriram Padmanabhan, Sanghyun Park, Renu Tewari
Owner IBM CORP