
Resource allocation method and Cache

A resource allocation and high-speed cache (Cache) technology, applied in the multiprocessor field, that can solve problems such as wasted Cache resources, cores' differing requirements for Cache resources, and the failure to consider changes in a processor's dynamic access requirements.

Inactive Publication Date: 2018-05-11
SANECHIPS TECH CO LTD


Problems solved by technology

[0003] In modern multi-core systems, a high-speed cache memory (Cache) shared by multiple cores is the most basic means of improving processor access performance. However, in a multi-core architecture each processor core handles different tasks, and each core's task-processing time differs, so the cores have different requirements for Cache resources. Moreover, in current multi-core architectures the Cache's lock-down mode allocates Cache capacity to each processor statically, without considering changes in a processor's dynamic demand for Cache access. As a result, the access performance of cores with relatively large Cache access demands cannot be improved, while Cache resources allocated to cores with relatively small demands are wasted, which is unfavorable to balancing the area cost and access performance of a multi-core system.

Method used




Embodiment Construction

[0020] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings of those embodiments.

[0021] Embodiments of the present invention provide a resource allocation method applied to a Cache shared by multiple processors, wherein each processor corresponds to an identification code (ID, Identification) used to identify that processor. Moreover, before the Cache is used, a Cache capacity has been fixedly configured for each processor. Here, the Cache may be a set-associative structure, and the Cache capacity may include multiple ways, each way including a fixed number of lines. For example, when there are four processors, the Cache capacity configured for each processor may be 4 ways, with each way including 10 lines.
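As an illustrative sketch of the fixed configuration described above (the class and field names are hypothetical, not from the patent), the static lock-down setup can be modeled as a set-associative Cache in which each processor ID is assigned a fixed group of ways:

```python
# Hypothetical model of the statically configured, set-associative shared
# Cache described above: each processor (identified by an ID) is assigned a
# fixed number of ways, and each way holds a fixed number of lines.

class SharedCache:
    def __init__(self, num_processors, ways_per_processor, lines_per_way):
        self.lines_per_way = lines_per_way
        # Static lock-down configuration: processor ID -> list of way indices.
        self.way_map = {
            pid: list(range(pid * ways_per_processor,
                            (pid + 1) * ways_per_processor))
            for pid in range(num_processors)
        }

    def capacity_lines(self, pid):
        """Total number of Cache lines allocated to processor `pid`."""
        return len(self.way_map[pid]) * self.lines_per_way


# Example from the text: four processors, 4 ways each, 10 lines per way.
cache = SharedCache(num_processors=4, ways_per_processor=4, lines_per_way=10)
```

Because the `way_map` is built once and never revised, this model also makes the problem in paragraph [0003] visible: a processor's allocation cannot grow or shrink with its actual access demand.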

[0022] Wherein, the above-mentioned Cache includes: a Cache controller and a Cache r...



Abstract

An embodiment of the invention discloses a resource allocation method applied to a Cache shared by multiple processors. The Cache comprises a Cache controller and Cache registers, and the Cache registers include a statistical register and a lock register corresponding to each processor. The method comprises the following steps: each statistical register counts, within a preset time, the Cache capacity accessed by its corresponding processor, obtains that processor's Cache access capacity, and sends it to the Cache controller; the Cache controller determines the Cache allocation capacity of each processor according to the Cache access capacity of each processor; and the Cache controller writes the Cache allocation capacity of each processor into the lock register corresponding to that processor. An embodiment of the invention furthermore discloses the Cache.
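The three steps of the abstract can be sketched as follows. Note that the patent states only that the controller determines allocation capacity "according to" the measured access capacity, without specifying a policy; the demand-proportional split below is purely an illustrative assumption, and all function and variable names are hypothetical.

```python
# Hypothetical sketch of the allocation flow in the abstract: statistical
# registers report per-processor access counts for a preset time window, the
# Cache controller computes new allocations, and writes them into the lock
# registers. The proportional policy is an assumption, not the patent's.

def determine_allocation(access_capacity, total_ways):
    """Split `total_ways` among processors in proportion to their measured
    Cache access capacity (dict: processor ID -> lines accessed)."""
    total_access = sum(access_capacity.values())
    if total_access == 0:
        # No measured demand: fall back to an even static split.
        even = total_ways // len(access_capacity)
        return {pid: even for pid in access_capacity}
    # Give each processor at least one way, distribute the rest by demand.
    alloc = {pid: 1 for pid in access_capacity}
    remaining = total_ways - len(access_capacity)
    for pid, acc in access_capacity.items():
        alloc[pid] += remaining * acc // total_access
    # Hand any integer-rounding leftovers to the most demanding processor.
    leftover = total_ways - sum(alloc.values())
    busiest = max(access_capacity, key=access_capacity.get)
    alloc[busiest] += leftover
    return alloc


def allocation_cycle(statistical_registers, lock_registers, total_ways):
    """One cycle: read access counts gathered in the preset time window from
    the statistical registers, compute allocations, write the lock registers."""
    lock_registers.update(determine_allocation(statistical_registers, total_ways))
    return lock_registers


# Example: processor 0 is Cache-hungry, processor 3 is nearly idle.
stats = {0: 900, 1: 300, 2: 300, 3: 100}
locks = {}
allocation_cycle(stats, locks, total_ways=16)  # locks -> {0: 9, 1: 3, 2: 3, 3: 1}
```

Repeating this cycle every preset time window is what turns the static lock-down configuration of the background section into a dynamic one: a core whose measured demand grows receives more ways in the next window.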

Description

Technical field

[0001] The invention relates to the field of multiprocessors, and in particular to a resource allocation method and a high-speed cache memory (Cache).

Background technique

[0002] At present, in chip systems, area cost is a crucial factor, and how to guarantee processor performance while reducing area cost has become a key problem that people have long been eager to solve.

[0003] In modern multi-core systems, a high-speed cache memory (Cache) shared by multiple cores is the most basic means of improving processor access performance. However, in a multi-core architecture each processor core handles different tasks, and each core's task-processing time differs, so the cores have different requirements for Cache resources. Moreover, in current multi-core architectures the Cache's lock-down mode allocates Cache capacity to each processor statically, without considering changes in the processor's demand for dynamic ...

Claims


Application Information

IPC(8): G06F9/50
CPC: G06F9/5016; G06F9/50
Inventors: 薛长花, 孙志文
Owner: SANECHIPS TECH CO LTD