Age-based strategy for determining database cache hits

A high-speed cache and database technology, applied to relational databases, database models, database updates, etc., that addresses problems such as poor scalability, long delays, and an inability to maintain throughput.

Active Publication Date: 2021-07-13
ORACLE INT CORP

AI Technical Summary

Problems solved by technology

However, the methods described above do not scale very well, such as when the number of customer records increases to tens of thousands, millions, or even billions.
In such cases, where high throughput on the order of hundreds, thousands, tens of thousands, or more messages per second is desired, the methods described above exhibit delays that are too long to maintain the desired throughput.
Furthermore, when dealing with random reads of data associated with database lookups, the latency of each generated message is highly variable.
[0005] The observed latency can be smoothed out by introducing caching facilities; however, if the data is constantly changing on the backend database, the cache will often be "dirty" or out of sync with the backend database.

Method used



Examples


Embodiment 1

[0152] One common way marketers use the system is to upload personalization data into a master database and then perform a batch launch against targeted recipients. In this case, the launch start time is used as the snapshot time when determining data recency for cache lookups. Data in the primary database may be modified just after the launch start time, and the cache may not yet have been synchronized with those changes. However, when personalization data is looked up during the launch, cached data may still be used as long as the data in the cache (eg, as determined from the corresponding personalization data metadata) is current as of the launch start time. Otherwise, the lookup falls back to the primary database.

Embodiment 2

[0154] A message sent as part of a batch launch may contain a link to a web page. When the end user clicks the link, the web page is served together with the recipient's personalization data. Here again, the link stores an "AsOfTime" corresponding to the start time of the batch launch. Thus, when the web page is personalized, the cache serves the lookup request only if the cached data is current as of that "AsOfTime".
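One way the link in this embodiment might carry the snapshot time is as a query parameter that the page-personalization path later recovers. This is a minimal sketch under stated assumptions: the URL shape, the `AsOfTime` parameter name, and the helper functions are illustrative, not the patent's actual implementation.

```python
from urllib.parse import urlencode, urlparse, parse_qs
from datetime import datetime, timezone

def build_launch_link(base_url, recipient_id, launch_start):
    # Embed the batch-launch start time in the link as the "AsOfTime" snapshot.
    query = urlencode({
        "rid": recipient_id,
        "AsOfTime": launch_start.replace(tzinfo=timezone.utc).isoformat(),
    })
    return f"{base_url}?{query}"

def as_of_time_from_click(url):
    # When the end user clicks, recover the snapshot the cache lookup must honor.
    params = parse_qs(urlparse(url).query)
    return datetime.fromisoformat(params["AsOfTime"][0])

start = datetime(2021, 7, 13, 9, 0, 0)
link = build_launch_link("https://example.com/offer", "42", start)
print(as_of_time_from_click(link))  # 2021-07-13 09:00:00+00:00
```

The recovered timestamp is then compared against the cache entry's last-synchronized time, exactly as in the batch-launch case.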

Embodiment 3

[0156] Often, marketers want to preview messages before launching a campaign that may target many recipients (eg, millions of recipients). In such preview situations, one may not want to use cached data at all. In this case, "AsOfTime" is set to "LATEST", and the lookup then always falls back to the primary database.
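The three embodiments above differ only in the "AsOfTime" each lookup carries. A minimal sketch of that decision rule follows; the function name `should_serve_from_cache` and the `LATEST` sentinel are illustrative assumptions, not identifiers from the patent.

```python
from datetime import datetime, timedelta

# Sentinel meaning "always bypass the cache" (Embodiment 3: previews).
LATEST = "LATEST"

def should_serve_from_cache(cache_synced_at, as_of_time):
    """Age-based hit test: serve the cached copy only if it was last
    synchronized with the primary database at or after `as_of_time`.

    `as_of_time` is the snapshot carried by the request:
      - the batch launch start time (Embodiment 1),
      - the "AsOfTime" stored in a clicked link (Embodiment 2),
      - LATEST, forcing a primary-database lookup (Embodiment 3).
    """
    if as_of_time is LATEST:
        return False            # previews always fall back to the database
    return cache_synced_at >= as_of_time

launch_start = datetime(2021, 7, 13, 9, 0, 0)
fresh = launch_start + timedelta(minutes=5)   # synced after launch start
stale = launch_start - timedelta(hours=2)     # synced before launch start

print(should_serve_from_cache(fresh, launch_start))  # True: cache hit
print(should_serve_from_cache(stale, launch_start))  # False: fall back to DB
print(should_serve_from_cache(fresh, LATEST))        # False: preview bypass
```

Note the asymmetry the patent accepts: data modified in the primary database after the snapshot time may be missed, but every recipient in a launch sees a consistent view as of the launch start.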



Abstract

Systems and methods for high-performance data processing are provided. A policy indicating a deadline is received. A request is processed such that a data item is returned from the cache if the data item meets the deadline; if it does not, the data item is returned from the database, or not at all. If a data item is retrieved from the database, metadata associated with the data item is stored to indicate when the data item was last marked as synchronized with the database.
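The abstract's request flow can be sketched as a single lookup path. This is an assumed illustration only: the `AgedCache` class, its `lookup` method, and the dictionary-backed "database" are invented for the sketch and do not come from the patent.

```python
import time

class AgedCache:
    """Cache whose hits are decided by an age-based policy: a cached item
    is returned only if its last-synchronized timestamp meets the deadline."""

    def __init__(self, database):
        self.database = database          # backing store: key -> value
        self.entries = {}                 # key -> (value, last_synced)

    def lookup(self, key, deadline):
        entry = self.entries.get(key)
        if entry is not None and entry[1] >= deadline:
            return entry[0]               # hit: cached item meets the deadline
        # Miss or stale: fall back to the database, and store metadata
        # recording when the item was last synchronized with it.
        if key not in self.database:
            return None                   # "...or not at all"
        value = self.database[key]
        self.entries[key] = (value, time.time())
        return value

db = {"recipient:42": {"name": "Ada"}}
cache = AgedCache(db)
deadline = time.time()                    # eg, a batch launch start time
print(cache.lookup("recipient:42", deadline))   # miss: fetched from db, cached
db["recipient:42"] = {"name": "Grace"}          # backend changes afterwards
print(cache.lookup("recipient:42", deadline))   # hit: cached copy still meets deadline
```

The second lookup deliberately returns the snapshot-consistent cached value even though the backend has changed, which is the behavior Embodiment 1 describes for batch launches.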

Description

Technical Field

[0001] This disclosure relates to the field of dedicated high-performance database caches, and more particularly to techniques for determining age-based policies for database cache hits or misses.

Background

[0002] Marketers are constantly seeking better ways to create, execute, and automate campaigns designed to drive revenue and strengthen customer loyalty. A workflow engine can be used to generate output material (eg, email messages) for use in the conduct of a marketing campaign. Marketers can use the workflow engine to configure a series of connected workflow execution components that make up a campaign. Some engines allow marketers to visually design, manage, and automate multi-stage lifecycle marketing plans through a drag-and-drop user interface and a library of pre-built plan templates.

[0003] Certain aspects of such marketing programs and the marketing campaigns comprising them arise in the process of generating personalized messages (...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F16/2453; G06F16/23; G06F16/28; G06Q30/02
CPC: G06F16/24539; G06Q30/0271; G06F16/2365; G06F16/284
Inventors: J·T·图瓦提尼; B·H·瑟杰恩特; 邹青; A·A·萨鲁瓦
Owner: ORACLE INT CORP