Management method for instruction cache and processor

An instruction-cache and processor technology in the computer field, addressing problems such as insufficient shared I-Cache resources for each hardware thread, frequently recurring miss requests, and prior scheduling adjustments that bring no improvement.

Active Publication Date: 2014-12-31
HUAWEI TECH CO LTD +1


Problems solved by technology

However, when the shared I-Cache resources allocated to each hardware thread are insufficient, the I-Cache miss rate rises, and miss requests from the I-Cache to the next-level Cache occur frequently. As the number of threads grows, a Cache line backfilled from the next-level Cache is installed into the missing I-Cache immediately but may not be used immediately, while the Cache line it replaces may still be needed again.
[0004] In addition, when the Thread scheduling strategy is adjusted according to Cache hit conditions, threads whose memory-access instructions frequently hit in the Cache are prioritized for a period of time; however, this does not solve the problem that the shared I-Cache resources allocated to each hardware thread are insufficient.




Embodiment Construction

[0039] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0040] In modern multi-threaded processor design, as the number of hardware threads increases, the shared resources available to each hardware thread become insufficient. This is especially true for an important shared resource such as the L1 (Level 1) Cache. When the instruction-cache capacity of the L1 Cache allocated to each hardware thread is too small, misses occur in the L1 and the L1 miss rate increases, resulting in increased ...



Abstract

The embodiment of the invention provides a management method for an instruction cache and a processor, relates to the field of computers, and aims to expand the instruction-cache capacity available to a hardware thread, lower the instruction-cache miss rate, and enhance system performance. A hardware thread identifier in the shared instruction cache of the processor identifies the hardware thread to which each cache line in the shared instruction cache corresponds; a private instruction cache stores instruction cache lines evicted from the shared instruction cache; a miss cache is further provided. The method comprises the following steps: when a hardware thread of the processor fetches an instruction from the instruction cache, the shared instruction cache and the private instruction cache corresponding to that hardware thread are accessed simultaneously; whether the instruction exists in the shared instruction cache or in the private instruction cache corresponding to the hardware thread is determined, and the instruction is obtained from whichever of the two holds it according to the judgment result. The embodiment of the invention is used for managing the instruction cache of a processor.
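The lookup and eviction flow described in the abstract can be illustrated with a minimal behavioral sketch (not RTL). All class and method names here are invented for illustration and do not come from the patent; it assumes direct-mapped structures for simplicity.

```python
# Behavioral sketch of the shared/private I-Cache scheme described above.
# Assumptions: direct-mapped caches, one instruction per line; names are illustrative.

class CacheLine:
    def __init__(self, tag, thread_id, instr):
        self.tag = tag
        self.thread_id = thread_id  # hardware-thread identifier stored with the line
        self.instr = instr

class DirectMappedCache:
    def __init__(self, num_sets):
        self.num_sets = num_sets
        self.lines = [None] * num_sets

    def lookup(self, addr):
        idx, tag = addr % self.num_sets, addr // self.num_sets
        line = self.lines[idx]
        return line.instr if line and line.tag == tag else None

    def fill(self, addr, thread_id, instr):
        """Install a line; return the evicted victim line, if any."""
        idx, tag = addr % self.num_sets, addr // self.num_sets
        victim = self.lines[idx]
        self.lines[idx] = CacheLine(tag, thread_id, instr)
        return victim

class ICacheHierarchy:
    def __init__(self, num_threads, shared_sets=8, private_sets=4):
        self.shared = DirectMappedCache(shared_sets)
        self.private = [DirectMappedCache(private_sets) for _ in range(num_threads)]

    def fetch(self, thread_id, addr):
        # The shared I-Cache and the thread's private I-Cache are probed
        # "simultaneously"; here modeled as two lookups whose results combine.
        hit = self.shared.lookup(addr)
        return hit if hit is not None else self.private[thread_id].lookup(addr)

    def backfill(self, thread_id, addr, instr):
        # On a miss, the next-level cache backfills the shared I-Cache; the
        # hardware thread identifier on the victim line routes it into the
        # owning thread's private I-Cache instead of discarding it.
        victim = self.shared.fill(addr, thread_id, instr)
        if victim is not None:
            v_addr = victim.tag * self.shared.num_sets + (addr % self.shared.num_sets)
            self.private[victim.thread_id].fill(v_addr, victim.thread_id, victim.instr)
```

In this sketch, a line pushed out of the shared cache by another thread's backfill remains reachable through its owner's private cache, which is the capacity-extension effect the abstract claims.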

Description

Technical field

[0001] The invention relates to the field of computers, in particular to an instruction-cache management method and a processor.

Background technique

[0002] The CPU (Central Processing Unit) cache (Cache Memory) is temporary storage located between the CPU and main memory; its capacity is much smaller than that of main memory, but it is much faster, and it is used to speed up CPU reads.

[0003] In a multi-threaded processor, multiple hardware threads fetch instructions from the same I-Cache (instruction cache). When the required instruction is not in the I-Cache, a miss request is sent to the next-level Cache while the processor switches to another hardware thread to continue fetching from the I-Cache, reducing the pipeline stalls caused by I-Cache misses and improving pipeline efficiency. However, when the shared I-Cache resources assigned to each hardware thread are insufficient, the I-Cache miss rate increases, and the miss requests...
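The switch-on-miss behavior in paragraph [0003] can be sketched as a toy fetch loop. This is a simplification under invented names, not the patent's mechanism: one instruction per cycle, round-robin thread selection, and an I-Cache modeled as a set of present addresses.

```python
from collections import deque

class MultithreadedFetchUnit:
    """Toy model of the fetch loop in [0003]: on an I-Cache miss the pipeline
    does not stall; it records a miss request for the next-level cache and
    switches to another hardware thread. Names are illustrative."""

    def __init__(self, icache_contents, num_threads):
        self.icache = set(icache_contents)                # addresses currently cached
        self.pcs = [t * 100 for t in range(num_threads)]  # per-thread fetch PC
        self.ready = deque(range(num_threads))            # round-robin thread queue
        self.miss_requests = []                           # sent to the next-level cache

    def fetch_cycle(self):
        tid = self.ready[0]
        pc = self.pcs[tid]
        self.ready.rotate(-1)        # next cycle services the next hardware thread
        if pc in self.icache:
            self.pcs[tid] += 1       # hit: this thread advances
            return ("hit", tid, pc)
        # miss: request the line from the next-level cache; the thread switch
        # above keeps the fetch pipeline busy instead of stalling
        self.miss_requests.append((tid, pc))
        return ("miss", tid, pc)
```

The point of the model is that a miss costs only the missing thread progress; other hardware threads keep the fetch stage occupied while the miss request is outstanding.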


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F12/08; G06F12/0842; G06F12/0875
CPC: G06F9/3802; G06F9/3851; G06F12/0842; G06F12/0875
Inventors: 郭旭斌, 侯锐, 冯煜晶, 苏东锋
Owner: HUAWEI TECH CO LTD