
Directory cache management method for big data application

A cache-management technology for big data, applied in digital data processing, memory systems, and memory address/allocation/relocation. It addresses problems such as long access latency and directory thrashing, reducing memory-access delay and the number of directory replacements, and has good practical value.

Active Publication Date: 2015-03-25
LANGCHAO ELECTRONIC INFORMATION IND CO LTD
Cites: 3 · Cited by: 10

AI Technical Summary

Problems solved by technology

However, compared with traditional applications, Scale-Out applications have special characteristics. First, the data sets are large: the data sets Scale-Out applications need to process often exceed the gigabyte scale, far beyond the on-chip cache capacity of current processors. Second, sharing is limited: Scale-Out applications do share data, mainly instruction sharing and task communication/collaboration sharing, but the shared data set is only a few megabytes. Third, locality: Scale-Out applications exhibit locality, with high reuse of small-scale data.
[0003] Traditional cache coherence protocols are designed for traditional applications and are not efficient for Scale-Out applications, for two main reasons: first, directory thrashing; second, high latency.
Directory thrashing occurs because the directory replacement policy lacks information about data accesses in the private caches: a large amount of data competes for the limited directory capacity, causing data that is frequently used in the private caches to be replaced.
The high access latency arises because private data does not itself require coherence maintenance, yet in the traditional design it still competes for the directory cache; directory cache replacement incurs long delays, which seriously degrades system performance.

Method used


Examples


Embodiment 1

[0024] The directory cache management method for big data applications of the present invention adds a shared flag bit and a data block pointer to the last-level shared cache (Last-Level Cache, LLC). The shared flag bit distinguishes whether data is private or shared; the data block pointer tracks the location of private data in the private caches; and the directory cache (Directory Cache, DC) maintains the coherence of shared data. Based on the last-level shared cache and the directory cache, data is divided into private data and shared data. Private data does not occupy directory cache space, and its coherence is maintained through the private cache; shared data occupies directory cache space, and its coherence is maintained through the directory cache.

[0025] The control process of the shared flag bit and the data block pointer is as follows: The shared flag bit is used to mark...

Embodiment 2

[0035] As in Embodiment 1, the method adds a shared flag bit and a data block pointer to the last-level shared cache (LLC): the shared flag bit distinguishes private from shared data, the data block pointer tracks the location of private data in the private caches, and the directory cache (DC) maintains the coherence of shared data. Data is divided into private and shared data; private data does not occupy directory cache space and its coherence is maintained through the private cache, while shared data occupies the directory cache and its coherence is maintained there.

[0036] Each entry of the last-level shared cache (Last-Level Cache, LLC) contains three parts: the tag (Tag), the data (Data), and the status bits ...



Abstract

The invention discloses a directory cache management method for big data applications, belonging to the field of directory cache management. In the method, a shared flag bit and a data block pointer are added to the last-level shared cache: the shared flag bit distinguishes whether data is private or shared, the data block pointer tracks the location of private data in the private caches, and a directory cache maintains the coherence of shared data. Based on the last-level shared cache and the directory cache, data is divided into private data and shared data. Private data does not occupy directory cache space, and the private cache maintains its coherence; shared data occupies directory cache space, and the directory cache maintains its coherence. The method reduces directory cache conflicts and replacements, shortens the memory-access delay of private data, and improves the performance of multi-core processor systems.

Description

Technical Field

[0001] The invention relates to directory cache management methods, and in particular to a directory cache management method for big data applications.

Background

[0002] With the rapid development of online shopping, search, the Internet of Things, data mining and related fields, the amount of data that data centers need to process is growing rapidly. Data centers expand capacity in the Scale-Out manner, which is easy to operate and low in cost, and has gradually become the mainstream direction of future data center development. However, compared with traditional applications, Scale-Out applications have special characteristics. First, the data sets are large: the data sets Scale-Out applications need to process often exceed the gigabyte scale, far beyond the on-chip cache capacity of current processors. Second, sharing is limited: Scale-Out applications do share data, mainly instruction sharing and ta...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08, G06F12/0817, G06F12/084
Inventor 唐士斌
Owner LANGCHAO ELECTRONIC INFORMATION IND CO LTD