Main memory database access optimization method on basis of page coloring technology

A page coloring based optimization method, applied in the field of database management. It addresses problems such as the quota conflict between memory address space and page color range, the small number of available page colors, and the inability to allocate only a few page colors to large data sets with weak locality, achieving the effect of reducing cache conflicts.

Active Publication Date: 2012-09-12
RENMIN UNIVERSITY OF CHINA

AI Technical Summary

Problems solved by technology

The first challenge is that an in-memory database cannot allocate only a few page colors to a large data set with weak locality.
The second challenge is that with dynamic page coloring the page color of a weak-locality data page can be changed by a memory copy before the data is accessed (as sketched below). This does allow a weak-locality data set to be confined to a few page colors, but the latency of the memory copy seriously degrades overall data access performance.
At the same time, a weak-locality data set needs only a small range of page colors, yet it requires a large memory address space to hold its large data volume.
It is therefore difficult to satisfy the conflicting quotas on memory address space and page color range at the same time.
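For illustration only, the following C sketch shows what the prior-art dynamic page coloring step described above amounts to: moving a page into a physical frame of the desired color by a memory copy before the page is accessed. The allocator hook is hypothetical; a real system would obtain a frame of the requested color from the operating system. This is the approach whose copy latency the invention avoids, not the invention's own method.

```c
/*
 * Sketch of prior-art dynamic page re-coloring (the approach whose copy
 * latency the invention avoids), NOT the method of this invention.
 * alloc_frame_with_color() is a hypothetical stand-in for an OS allocator
 * that returns a frame whose physical address has the requested color.
 */
#include <stdlib.h>
#include <string.h>

#define PAGE_SIZE 4096

/* Placeholder allocator: a real system would obtain a physical frame of the
 * requested color from the OS; here the color is simply ignored. */
static void *alloc_frame_with_color(unsigned color)
{
    (void)color;
    return aligned_alloc(PAGE_SIZE, PAGE_SIZE);
}

/* Move a data page into a frame of the target color before it is accessed.
 * The memcpy below is exactly the memory-copy latency criticized above. */
void *recolor_page(void *old_page, unsigned target_color)
{
    void *new_page = alloc_frame_with_color(target_color);
    if (new_page == NULL)
        return old_page;            /* keep the old placement on failure */
    memcpy(new_page, old_page, PAGE_SIZE);
    return new_page;                /* caller would remap to the new frame */
}
```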

Method used




Embodiment Construction

[0033] In the present invention, the cache refers primarily to the CPU cache at the processor hardware level. It differs from the software-level database buffer: cache access control is determined by the hardware, and database software cannot actively manage the cache the way it manages a buffer. The underlying goal of cache access optimization is to adapt the access patterns of data sets with different access characteristics, so process-level cache optimization technology still leaves considerable room for improvement. For this reason, the present invention provides a cache access optimization method for in-memory databases. Based on the cache replacement scheme currently dominant in the shared CPU caches of multi-core processors, n-way set associative mapping and replacement, the method optimizes the data page access sequence according to the page color of memory data, thereby reducing cache access conflicts during OLAP query processing and improving overall performance.
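As background to the page-color concept the embodiment relies on, the following C sketch shows the usual way a page color is derived from a physical page number for an n-way set associative cache. The cache geometry used (8 MiB shared cache, 16-way, 4 KiB pages) is an assumed example, not a figure taken from the patent.

```c
/* Illustrative only: deriving a page color for an n-way set associative
 * cache. Pages of the same color map onto the same group of cache sets
 * and therefore compete for the same cache lines. */
#include <stdint.h>
#include <stdio.h>

#define CACHE_SIZE    (8u * 1024u * 1024u)   /* assumed 8 MiB shared cache */
#define ASSOCIATIVITY 16u                    /* assumed 16-way              */
#define PAGE_SIZE     4096u                  /* assumed 4 KiB pages         */

int main(void)
{
    /* Number of distinct page colors offered by this cache geometry. */
    uint32_t num_colors = CACHE_SIZE / (ASSOCIATIVITY * PAGE_SIZE); /* 128 */

    uint64_t phys_addr  = 0x12345000ull;                  /* example frame */
    uint32_t page_color = (uint32_t)((phys_addr / PAGE_SIZE) % num_colors);

    printf("page colors: %u, color of frame 0x%llx: %u\n",
           num_colors, (unsigned long long)phys_addr, page_color);
    return 0;
}
```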



Abstract

The invention discloses a main memory database access optimization method based on page coloring technology. The method comprises the following steps: first, the access sequence of all data pages of a weak-locality data set is sorted by page color, and the data pages are grouped by page color; all data pages of the weak-locality data set are then scanned in the order of the page color groups; furthermore, several memory pages with the same page color are preset as a page color queue, which serves as a staging buffer in memory before pages are loaded into the CPU cache; the data pages of the weak-locality data set first enter the page color queue asynchronously and are then loaded into the CPU cache for processing. The invention solves the problem that, in main memory database applications, cache address space cannot be optimally allocated to processes, threads, or data sets by page color, and it effectively reduces cache conflicts between data sets of different data locality strengths.
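Below is a minimal C sketch of the flow the abstract describes, under several assumptions: 4 KiB pages, a known page color per data page, a fixed-depth page color queue, and a synchronous memcpy standing in for the asynchronous staging step; process_page is a placeholder for the real query operator, and none of the names or parameters are taken from the patent itself.

```c
/* Minimal sketch of a page-color-ordered scan through a page color queue.
 * Assumptions: 4 KiB pages, page color known per page, synchronous copy
 * in place of the asynchronous staging the patent describes. */
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define PAGE_SIZE   4096
#define NUM_COLORS  64      /* assumed: cache_size / (ways * PAGE_SIZE) */
#define QUEUE_PAGES 4       /* assumed depth of the page color queue    */

typedef struct {
    void    *data;          /* page contents in main memory             */
    unsigned color;         /* page color of the page's physical frame  */
} data_page_t;

/* Page color queue: a few pre-allocated pages sharing one page color,
 * acting as a staging buffer in front of the CPU cache. */
static uint64_t color_queue[QUEUE_PAGES][PAGE_SIZE / sizeof(uint64_t)];

static uint64_t checksum;   /* stand-in result of "query processing" */

/* Placeholder for the actual query operator. */
static void process_page(const void *page)
{
    const uint64_t *w = (const uint64_t *)page;
    for (size_t i = 0; i < PAGE_SIZE / sizeof(uint64_t); i++)
        checksum += w[i];
}

/* Scan a weak-locality data set in page-color order: visit all pages of
 * color 0, then color 1, and so on, staging each page through the queue. */
void color_ordered_scan(data_page_t *pages, size_t n)
{
    for (unsigned color = 0; color < NUM_COLORS; color++) {
        size_t slot = 0;
        for (size_t i = 0; i < n; i++) {
            if (pages[i].color != color)
                continue;
            /* Stage the page in the color queue first; the patent does
             * this asynchronously, a plain copy stands in here. */
            memcpy(color_queue[slot], pages[i].data, PAGE_SIZE);
            process_page(color_queue[slot]);
            slot = (slot + 1) % QUEUE_PAGES;
        }
    }
}
```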

Description

Technical field

[0001] The present invention relates to a main memory database access optimization method, and in particular to an access control and optimization method, based on page coloring (page-coloring) technology, for the CPU cache used by a main memory database when processing data sets with different data locality strengths. The invention belongs to the technical field of database management.

Background technique

[0002] An in-memory database is a database in which all data resides in memory rather than in external storage, as in traditional databases. Its notable feature is that all data access control is performed in memory, so data read and write speeds are several orders of magnitude higher than those of disk-based databases, which can greatly improve the performance of database applications. Compared with disk databases, in-memory databases redesign the architecture and make corresponding improvements in data caching, query optimization, and parallel operation. [0...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F17/30
CPCG06F12/0842
Inventor 王珊张延松
Owner RENMIN UNIVERSITY OF CHINA