Method and apparatus for dynamic resizing of cache partitions based on the execution phase of tasks

A cache and execution-phase technology, applied to memory architecture access/allocation, memory address allocation/relocation, and memory systems, that avoids unnecessary cache reservation and achieves effective utilization of the cache.

Status: Inactive | Publication Date: 2009-03-25
Owner: NXP BV
Cites: 1 | Cited by: 10

AI Technical Summary

Problems solved by technology

Current cache partitioning techniques do not adjust partition sizes to the changing working-set requirements of tasks across their execution phases, so cache space remains reserved for a task even when the task no longer needs it.


Example Embodiment

[0023] The above and other features, aspects, and advantages of the present invention are described in detail below with reference to the accompanying drawings, which comprise six figures.

[0024] Figure 1 illustrates an embodiment of a method for dynamically resizing cache partitions for application tasks in a multiprocessor. The execution phase 101 of each application task is identified using a basic block vector (BBV) metric or the working set of the application task. The phase information and working sets 102 of the application tasks are stored in the form of a table. The cache partition is then dynamically configured according to the execution phase of the application task, using the stored phase information.
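As a rough illustration of the phase table just described (this sketch is not from the patent text; the structure, dimensions, distance threshold, and function names are assumptions introduced here), a task's current BBV can be compared against previously recorded phases, with the working set stored alongside each entry:

```c
/* Hedged sketch: phase identification via basic block vectors (BBVs).
 * All identifiers and sizes below are illustrative assumptions. */
#include <stdio.h>

#define BBV_DIM   8    /* basic blocks tracked per vector (toy size) */
#define MAX_PHASE 16   /* capacity of the phase table */

typedef struct {
    unsigned bbv[BBV_DIM];  /* execution counts per basic block */
    unsigned working_set;   /* cache lines this phase touches */
} PhaseEntry;

static PhaseEntry table[MAX_PHASE];
static int n_phases = 0;

/* Manhattan distance between two BBVs; a small distance means the task
 * is executing the same mix of basic blocks, i.e. the same phase. */
static unsigned bbv_distance(const unsigned *a, const unsigned *b) {
    unsigned d = 0;
    for (int i = 0; i < BBV_DIM; i++)
        d += (a[i] > b[i]) ? a[i] - b[i] : b[i] - a[i];
    return d;
}

/* Return the index of the matching phase, recording a new one on a miss. */
static int classify_phase(const unsigned *bbv, unsigned working_set,
                          unsigned threshold) {
    for (int i = 0; i < n_phases; i++)
        if (bbv_distance(bbv, table[i].bbv) < threshold)
            return i;                        /* previously seen phase */
    if (n_phases < MAX_PHASE) {              /* record a new phase */
        for (int i = 0; i < BBV_DIM; i++)
            table[n_phases].bbv[i] = bbv[i];
        table[n_phases].working_set = working_set;
        return n_phases++;
    }
    return -1;  /* table full; caller keeps a default partition */
}

int main(void) {
    unsigned sample[BBV_DIM] = {10, 0, 4, 4, 0, 0, 2, 0};
    int phase = classify_phase(sample, 128, 8);
    if (phase >= 0)
        printf("phase %d, working set %u lines\n",
               phase, table[phase].working_set);
    return 0;
}
```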

[0025] According to the proposed invention, the size of the cache partition is adjusted at particular instances during the execution of an application task, such that at any given point in time the application task is allocated a necessary and sufficient amount of cache space.
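A minimal sketch of the resize step itself, assuming a way-partitioned set-associative cache (the patent excerpt does not fix the partitioning granularity; the sizes and names below are hypothetical): at each phase transition the task is granted just enough ways to cover the working set recorded for the new phase, and the remainder is released for other tasks.

```c
/* Hedged sketch: resize a way-partitioned allocation at a phase change.
 * TOTAL_WAYS, SETS, and the function names are illustrative assumptions. */
#include <stdio.h>

#define TOTAL_WAYS    16          /* associativity of the shared cache */
#define SETS          256         /* number of cache sets */
#define LINES_PER_WAY SETS        /* one line per set in each way */

/* Smallest number of ways whose capacity covers the working set. */
static unsigned ways_for_working_set(unsigned working_set) {
    unsigned ways = (working_set + LINES_PER_WAY - 1) / LINES_PER_WAY;
    if (ways == 0) ways = 1;
    if (ways > TOTAL_WAYS) ways = TOTAL_WAYS;
    return ways;
}

/* Called on a phase transition: grow or shrink the task's partition so the
 * allocation stays "necessary and sufficient", freeing unused ways. */
static unsigned resize_partition(unsigned current_ways, unsigned working_set) {
    unsigned needed = ways_for_working_set(working_set);
    if (needed != current_ways)
        printf("resize: %u -> %u ways (working set %u lines)\n",
               current_ways, needed, working_set);
    return needed;
}

int main(void) {
    unsigned ways = resize_partition(TOTAL_WAYS, 512);  /* cache-heavy phase */
    ways = resize_partition(ways, 64);                  /* streaming phase */
    return 0;
}
```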


Abstract

The present invention proposes a method and a system for dynamic cache partitioning for application tasks in a multiprocessor. An approach is provided for dynamically resizing cache partitions based on the execution phase of the application tasks. The execution phases of the application tasks are identified and recorded in tabular form. Cache partitions are resized at particular instances during the execution of the application tasks such that a necessary and sufficient amount of cache space is allocated to each task at any given point in time. The partition size is determined by the working-set requirement of a task during its execution, which is monitored dynamically or statically. Because partitions are resized according to the execution phase of each task, unnecessary reservation of the entire cache is avoided, and effective utilization of the cache is achieved.
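The dynamic monitoring mentioned in the abstract could, for instance, count the distinct cache lines a task touches within an interval; that count then drives the partition size chosen for the next interval. The following sketch is one possible realization, not the patent's mechanism; the line size, bitmap bound, and function names are assumptions:

```c
/* Hedged sketch: dynamic working-set monitoring by counting distinct
 * cache-line addresses touched in an interval. Sizes are illustrative. */
#include <stdio.h>
#include <string.h>

#define LINE_SHIFT    6     /* assume 64-byte cache lines */
#define TRACKED_LINES 4096  /* lines tracked per interval (toy bound) */

static unsigned char seen[TRACKED_LINES / 8];  /* one bit per tracked line */
static unsigned ws_count = 0;

/* Record one memory access; count the line only on its first touch. */
static void ws_touch(unsigned long addr) {
    unsigned long line = (addr >> LINE_SHIFT) % TRACKED_LINES;
    if (!(seen[line / 8] & (1u << (line % 8)))) {
        seen[line / 8] |= (unsigned char)(1u << (line % 8));
        ws_count++;
    }
}

/* At the end of an interval, report the working set and reset the monitor. */
static unsigned ws_interval_end(void) {
    unsigned ws = ws_count;
    memset(seen, 0, sizeof seen);
    ws_count = 0;
    return ws;
}

int main(void) {
    for (unsigned long a = 0; a < 8192; a += 64) ws_touch(a);  /* 128 lines */
    for (unsigned long a = 0; a < 8192; a += 64) ws_touch(a);  /* all repeats */
    printf("working set this interval: %u lines\n", ws_interval_end());
    return 0;
}
```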

Description

Technical Field

[0001] The present invention relates generally to data processing systems that include cache memory, and more particularly to the dynamic partitioning of cache memory for application tasks in a multiprocessor.

Background

[0002] Cache partitioning is a well-known technique in multitasking systems that achieves more predictable cache performance by reducing resource interference. In a data processing system that includes multiple processors, the cache memory is shared among multiple processes or tasks, and is divided into different sections for different application tasks. It is advantageous to divide the cache into multiple sections, each allocated to a particular type of process, rather than have all processes share the entire cache memory. Once the cache memory is divided into multiple parts, the problem arises of how to determine the size of the cache partition for each application task and when to adjust the size of these partitions.
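For contrast with the dynamic scheme above, conventional static partitioning can be pictured as a fixed, disjoint per-task way mask that never changes during execution. The masks and task names below are purely illustrative:

```c
/* Toy sketch of conventional static cache partitioning: each task keeps a
 * fixed set of ways for its whole lifetime. Names and masks are invented. */
#include <stdio.h>

#define TOTAL_WAYS 16

typedef struct {
    const char *name;
    unsigned    way_mask;  /* bit i set => way i belongs to this task */
} TaskPartition;

int main(void) {
    /* Split decided once at configuration time, never resized. */
    TaskPartition parts[] = {
        { "video_decode", 0x00FF },  /* ways 0-7   */
        { "audio",        0x0F00 },  /* ways 8-11  */
        { "control",      0xF000 },  /* ways 12-15 */
    };
    for (unsigned i = 0; i < sizeof parts / sizeof parts[0]; i++)
        printf("%-12s mask=0x%04X\n", parts[i].name, parts[i].way_mask);
    return 0;
}
```

Such a static split keeps ways reserved even during phases in which a task's working set shrinks; that standing reservation is exactly the inefficiency that phase-based resizing is meant to remove.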


Application Information

IPC(8): G06F12/08, G06F12/12
CPC: G06F12/0808, G06F12/0842, G06F2212/601, G06F12/12, G06F12/084, G06F12/0802
Inventor: Bijo Thomas, Sriram Krishnan, Milind Manohar Kulkarni, Sainath Karlapalem
Owner: NXP BV