
Shared cache management method and system

A shared-cache management technique, applied in the field of information technology, that can solve problems such as increased hardware cost and achieve the effects of simplified hardware implementation and simple logic configuration.

Publication Date: 2010-06-16 (Inactive)
INT BUSINESS MASCH CORP

AI Technical Summary

Problems solved by technology

[0012] For collision detection, the hardware has to attach a color register and corresponding comparison logic to each buffer block, which obviously increases the hardware cost.
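To make this cost concrete, the following sketch (names and sizes are illustrative, not taken from the patent) models a buffer in which every block carries its own color register, so conflict detection conceptually requires one comparison per block.

/* Illustrative sketch (not from the patent text): every shared-buffer block
 * carries its own color register, so conflict detection needs one color
 * register plus one comparator per block -- the hardware cost at issue. */
#include <stdbool.h>
#include <stddef.h>

#define NUM_BLOCKS 256              /* assumed buffer size */

typedef unsigned char color_t;      /* transaction ID encoded as a color */

struct block_tag {
    bool    valid;                  /* block holds transactional data */
    color_t color;                  /* per-block color register       */
};

static struct block_tag tags[NUM_BLOCKS];

/* Hardware would evaluate all comparisons in parallel; this loop stands in
 * for the per-block comparison logic. Returns true if any valid block is
 * owned by a transaction other than the one currently accessing the buffer. */
bool conflicts_with_other_transaction(color_t accessing_color)
{
    for (size_t i = 0; i < NUM_BLOCKS; i++) {
        if (tags[i].valid && tags[i].color != accessing_color)
            return true;
    }
    return false;
}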




Embodiment Construction

[0025] The present invention is applicable to the technical field of transactional memory. The following description takes a transactional-memory operating environment as an example: the application program running on the processor comprises several transactions, each transaction uses a shared buffer to store its intermediate state data, and each transaction is identified by a color mark serving as its identification number (ID). Of course, those skilled in the art should understand that the present invention is not limited thereto; because transactions are parts of application programs, they can be abstracted into more general program concepts.
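As a rough, purely illustrative model of this operating environment (all identifiers below are hypothetical), a transaction can be represented as a color ID plus the set of shared-buffer blocks holding its intermediate state.

/* Hypothetical model of the operating environment described above: an
 * application consists of transactions, each identified by a color ID, and
 * each transaction buffers its intermediate (uncommitted) data in blocks of
 * a shared buffer until it commits or aborts. All names are illustrative. */
#include <stddef.h>

#define MAX_BLOCKS_PER_TX 64        /* assumed per-transaction upper bound */

typedef unsigned char color_t;      /* color mark = transaction ID */

struct transaction {
    color_t color;                           /* identifies the transaction   */
    size_t  block_ids[MAX_BLOCKS_PER_TX];    /* shared-buffer blocks it owns */
    size_t  num_blocks;                      /* intermediate-state footprint */
};

/* Record that the transaction placed intermediate state into block `id`. */
void record_block(struct transaction *tx, size_t id)
{
    if (tx->num_blocks < MAX_BLOCKS_PER_TX)
        tx->block_ids[tx->num_blocks++] = id;
}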

[0026] The present inventors have observed that in shared-buffer schemes for hardware transactional memory systems, a major challenge lies in on-demand resource management, which is critical to system performance. For example, multiple transactions with different data sizes compete with each other to apply fo...
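The contention can be pictured with a minimal free-pool sketch (hypothetical names and sizes): transactions of different sizes draw blocks from one shared pool, and a request that exceeds the remaining free blocks must stall or abort, which is what makes on-demand, size-aware allocation important.

/* Minimal sketch of the contention: transactions of different data sizes
 * request blocks from one shared pool; a request that cannot be satisfied
 * must stall or abort. Capacity and names are illustrative. */
#include <stdbool.h>
#include <stddef.h>

#define POOL_BLOCKS 1024            /* assumed shared-buffer capacity */

static size_t pool_free = POOL_BLOCKS;

/* Try to reserve `want` blocks for a starting transaction. */
bool reserve_blocks(size_t want)
{
    if (want > pool_free)
        return false;               /* insufficient free blocks */
    pool_free -= want;
    return true;
}

/* Return blocks to the pool when the transaction commits or aborts. */
void release_blocks(size_t used)
{
    pool_free += used;
}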



Abstract

The invention provides a shared cache management system comprising a resource management module. The resource management module is configured to respond to the start of a transaction by allocating shared cache resources to the transaction according to its predicted transaction data size, and to respond to the successful commit of the transaction by recording the shared cache size actually occupied by the transaction so as to update the transaction's historical information. The predicted transaction data size is produced at run time by a predictor based on the transaction's historical information. The invention also provides a corresponding shared cache management method. The technical scheme of the invention can dynamically allocate shared cache resources to a plurality of transactions by predicting how much cache each transaction needs.
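One plausible instantiation of the scheme the abstract describes is sketched below; the per-transaction history keyed by a static transaction ID and the running-average predictor are assumptions for illustration, since the abstract does not fix a particular prediction formula.

/* Sketch of the claimed flow (assumptions: per-transaction history keyed by
 * a static transaction ID, and a simple averaging predictor; the patent
 * text does not mandate this particular predictor). */
#include <stddef.h>

#define MAX_TX_TYPES 128            /* assumed number of distinct transactions */
#define DEFAULT_PREDICTION 8        /* blocks to allocate with no history */

struct tx_history {
    size_t committed_runs;          /* successful commits observed        */
    size_t total_blocks;            /* sum of actually occupied sizes     */
};

static struct tx_history history[MAX_TX_TYPES];

/* Predictor: estimate the transaction's data size from its history. */
static size_t predict_size(unsigned tx_id)
{
    struct tx_history *h = &history[tx_id];
    if (h->committed_runs == 0)
        return DEFAULT_PREDICTION;
    return h->total_blocks / h->committed_runs;   /* running average */
}

/* On transaction start: allocate shared-cache blocks per the prediction. */
size_t on_transaction_start(unsigned tx_id)
{
    size_t predicted = predict_size(tx_id);
    /* ... reserve `predicted` blocks from the shared cache here ... */
    return predicted;
}

/* On successful commit: record the size actually occupied, updating the
 * history the predictor reads on the next run of this transaction. */
void on_transaction_commit(unsigned tx_id, size_t blocks_actually_used)
{
    history[tx_id].committed_runs += 1;
    history[tx_id].total_blocks   += blocks_actually_used;
}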

Description

Technical Field

[0001] The present invention relates to the field of information technology, and more specifically to a shared cache management method and system.

Background Technique

[0002] Currently, researchers have proposed hardware-based transactional memory systems to solve the problems existing in the traditional lock-based parallel programming model. A transactional memory system allows programmers to specify regions of code, called transactions, that are executed as indivisible units; that is, while a transaction executes, only its corresponding code runs in that thread. Transactional memory systems thereby allow applications to access shared data in a parallel but atomic fashion, which can improve the performance of parallel programs. You can refer to the article "Transactional Memory: Architectural Support for Lock-Free Data Structures" (Document 1), published by Maurice Herlihy and J. Eliot B. Moss in 1993, to understand th...
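As a purely illustrative picture of this programming model, a transaction is a marked code region whose accesses to shared data take effect atomically; the tx_begin and tx_commit calls below are placeholder names, not an API defined by Document 1.

/* Illustrative transaction region (tx_begin/tx_commit are placeholder names):
 * the enclosed accesses to shared data either all take effect or none do, so
 * threads may run such regions in parallel without explicit locks. */
extern void tx_begin(void);     /* placeholder: start a transaction  */
extern void tx_commit(void);    /* placeholder: attempt to commit it */

extern long shared_counter;     /* shared data accessed by many threads */

void atomic_increment(void)
{
    tx_begin();
    shared_counter += 1;        /* buffered as intermediate state       */
    tx_commit();                /* made visible atomically on success   */
}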


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50, G06F12/08
CPC: G06F9/467, G06F9/30087, G06F9/5016, G06F9/50, G06F12/08
Inventors: 侯锐, 王华勇, 沈晓卫, T·凯恩
Owner: INT BUSINESS MASCH CORP