
Cache-conscious concurrency control scheme for database systems

A cache-conscious database technology, classified under electrical digital data processing, special-purpose data processing applications, and digital data information retrieval, which addresses problems such as poor update performance and poor scalability.

Inactive Publication Date: 2004-08-18
SAP AG
Cites: 0 · Cited by: 38
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Update performance scales poorly with the number of multiprocessing units

Method used


Image

  • Cache-conscious concurrency control scheme for database systems

Examples


Embodiment Construction

[0029] I. Concurrency Control

[0030] Coherent cache misses

[0031] Figure 1 shows how coherent cache misses occur in a query processing system with a traditional database management system (DBMS). A DBMS is a collection of programs that manage the structure of a database and control access to it. Main memory 100 holds an index tree comprising nodes n1 to n7 (101 to 107) for accessing a database on disk or in main memory. For simplicity, assume that each node corresponds to one cache block and contains a latch. As mentioned above, a latch guarantees that only one transaction at a time can access a data item. Four processors (108 to 111) access main memory 100 through their caches (112 to 115).

[0032] Consider a situation where, on a cold start of the main-memory query processing system, processor p1 108 traverses the path (n1 101, n2 102, n4 104), so that these nodes are copied into cache c1 112 of p1 108. During this process, the latches on n1 and n2 are he...
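The scenario above can be made concrete with a small sketch (the struct layout, fan-out, and function names are illustrative, not taken from the patent): because the latch lives in the same cache block as the node's keys, even a read-only traversal must write the block to acquire the latch, so every processor that traverses a node invalidates the copies held in the other processors' caches.

```c
#include <stdatomic.h>
#include <stddef.h>

/* Illustrative sketch of a traditional latch-per-node index: the latch
 * shares a cache block with the keys, so acquiring it is a WRITE to the
 * block, triggering the coherent cache misses described in [0031]. */

#define FANOUT 7

typedef struct node {
    atomic_flag latch;               /* latch lives inside the node     */
    int nkeys;
    int keys[FANOUT];
    struct node *child[FANOUT + 1];
} node;

static void latch_acquire(node *n) { /* the write that dirties the line */
    while (atomic_flag_test_and_set(&n->latch))
        ;                            /* spin until the latch is free    */
}

static void latch_release(node *n) {
    atomic_flag_clear(&n->latch);
}

/* Latch-coupled search: keep the parent's latch until the child's is
 * taken.  Every node on the path gets written, read-only or not. */
int tree_contains(node *root, int key) {
    node *n = root;
    latch_acquire(n);
    for (;;) {
        int i = 0;
        while (i < n->nkeys && n->keys[i] < key)
            i++;
        if (i < n->nkeys && n->keys[i] == key) {
            latch_release(n);
            return 1;                /* key found                       */
        }
        node *c = n->child[i];
        if (!c) {
            latch_release(n);
            return 0;                /* leaf reached, key absent        */
        }
        latch_acquire(c);            /* dirties the child's cache block */
        latch_release(n);
        n = c;
    }
}
```

The point of the sketch is the cost model, not the search itself: when p2 later traverses the same path, it must re-fetch every node's cache block even though no key changed, because p1's latch acquisitions invalidated p2's cached copies.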



Abstract

An optimistic, latch-free index traversal ("OLFIT") concurrency control scheme is disclosed for an index structure for managing a database system. In each node of an index tree, the OLFIT scheme maintains a latch, a version number, and a link to the next node at the same level of the index tree. Index traversal involves consistent node read operations starting from the root. To ensure the consistency of node reads without latching, every node update operation first obtains the latch and increments the version number after updating the node contents. Every node read operation begins by reading the version number into a register and ends by verifying that the current version number matches the register-stored version number. If they are the same, the read is consistent; otherwise, the node read is retried until the verification succeeds. The concurrency control scheme of the present invention is applicable to many index structures, such as the B+-tree and the CSB+-tree.
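A minimal sketch of the read/update protocol the abstract describes, assuming a node simplified to a single content field (the struct layout, spin-latch, and function names are illustrative assumptions, not the patent's code):

```c
#include <stdatomic.h>

/* OLFIT-style node: a latch and a version number guard the contents.
 * Readers never latch; they validate a snapshot instead. */
typedef struct {
    atomic_uint latch;     /* 0 = free, 1 = held by an updater      */
    atomic_uint version;   /* incremented after every update        */
    int key;               /* node contents, reduced to one field   */
} olfit_node;

/* Update: acquire the latch, modify, bump the version, release. */
void olfit_update(olfit_node *n, int new_key) {
    unsigned expected = 0;
    while (!atomic_compare_exchange_weak(&n->latch, &expected, 1))
        expected = 0;                        /* spin on the latch    */
    n->key = new_key;
    atomic_fetch_add(&n->version, 1);        /* publish the change   */
    atomic_store(&n->latch, 0);
}

/* Latch-free read: retry until a consistent snapshot is verified. */
int olfit_read(const olfit_node *node_in) {
    olfit_node *n = (olfit_node *)node_in;
    for (;;) {
        unsigned v = atomic_load(&n->version);   /* 1: save version  */
        int k = n->key;                          /* 2: read contents */
        if (atomic_load(&n->latch) == 0 &&       /* 3: latch free?   */
            atomic_load(&n->version) == v)       /* 4: unchanged?    */
            return k;                            /* snapshot is good */
        /* A concurrent update intervened; retry the read. */
    }
}
```

Because a successful read performs no write to the node, traversals by other processors do not invalidate cached copies of the node, which is the cache-consciousness the scheme is named for.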

Description

Technical Field

[0001] The present invention relates generally to database management systems. More specifically, it relates to a cache-conscious concurrency control scheme for memory-resident index structures in database systems.

Background

[0002] Owing to the ever-decreasing price of server DRAM (Dynamic Random Access Memory) components, main-memory database management systems (MM DBMSs) have become an economically viable alternative to disk-resident database management systems (DR DBMSs) in many applications. MM DBMSs can potentially outperform DR DBMSs by orders of magnitude, not only for read transactions but also for update transactions.

[0003] However, the significant performance advantage of an MM DBMS over a DR DBMS is not achieved automatically; it requires optimization techniques specific to MM DBMSs, especially the effective use of the cache. A cache, a specialized storage device with much faster access times than main memory, sto...

Claims


Application Information

Patent Timeline
IPC(8): G06F12/08; G06F17/30
CPC: G06F17/30327; G06F17/30356; G06F17/30353; G06F16/2246; G06F16/2322; G06F16/2329; G06F12/08
Inventors: 车相均, 黄祥镕, 金起弘, 权槿周
Owner SAP AG