
Data prefetching method based on location awareness in on-chip cache network

A data prefetching and network technology, applied in the field of memory access for many-core processors. It addresses problems such as increased programming difficulty, dependence on prefetch accuracy, and increased compiler complexity, with the effects of reducing conflicts, improving accuracy, and prefetching data efficiently.

Active Publication Date: 2015-04-01
NAT UNIV OF DEFENSE TECH

AI Technical Summary

Problems solved by technology

Although the software prefetching method improves the flexibility and adaptability of prefetching, it requires programmers or compilers to insert prefetch operations, which increases the difficulty of programming or the complexity of the compiler implementation; in addition, the effectiveness of software prefetching also depends greatly on its accuracy.
[0006] Relative to a specific processor core, the data read times of the many Cache blocks in the on-chip Cache network differ greatly, so prefetched data should be placed in a Cache block as close as possible to that processor core. Therefore, for the on-chip Cache network of a many-core processor, software prefetching is very complicated: not only the timing of the prefetch but also the location where the prefetched data is stored in the on-chip Cache network must be considered.
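To illustrate why placement matters, the sketch below estimates per-bank read latency for one core in a hypothetical 2D-mesh on-chip Cache network. The mesh layout, hop latency, and bank access cost are illustrative assumptions, not figures from the patent:

```python
# Illustrative sketch: in a 2D-mesh on-chip Cache network, the time for a
# core to read a Cache bank grows with the network distance to that bank,
# so where prefetched data is placed matters. All parameters are assumed.

MESH_DIM = 4          # assumed 4x4 mesh of tiles (core + local Cache bank)
HOP_CYCLES = 3        # assumed per-hop network latency
BANK_CYCLES = 10      # assumed Cache bank access latency

def access_latency(core, bank):
    """Round-trip read latency (cycles) from `core` to `bank`.

    `core` and `bank` are (x, y) tile coordinates; routing distance is
    the Manhattan distance in the mesh.
    """
    hops = abs(core[0] - bank[0]) + abs(core[1] - bank[1])
    return BANK_CYCLES + 2 * hops * HOP_CYCLES   # request + reply traversal

core = (0, 0)
latencies = {(x, y): access_latency(core, (x, y))
             for x in range(MESH_DIM) for y in range(MESH_DIM)}

print(latencies[(0, 0)])  # local bank: 10 cycles
print(latencies[(3, 3)])  # farthest bank: 10 + 2*6*3 = 46 cycles
```

Under these assumed numbers, the farthest bank is several times slower to read than the local one, which is the motivation for making prefetch placement location-aware.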

Method used



Embodiment Construction

[0029] The following further describes the present invention with reference to the accompanying drawings of the specification and specific preferred embodiments, but the protection scope of the present invention is not limited thereby.

[0030] As shown in Figure 2, the location-aware data prefetching method in the on-chip Cache network in this embodiment includes the following steps:

[0031] 1) After the processor starts, a data location vector table records the Cache block information, within the on-chip Cache network, of data fetched into the processor's on-chip Cache network;

[0032] 2) When the processor core executes the prefetch instruction embedded in the program, jump to step 3);

[0033] 3) Look up, in the data location vector table, the Cache block information of the prefetch instruction's target data in the on-chip Cache network. When the target data is prefetched from memory, it is stored in the destination Cache block specified b...
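A minimal software model of steps 1)–3) might look as follows. The table layout and the names `DataLocationVectorTable` and `prefetch` are illustrative assumptions; the patent describes a hardware mechanism, not this code:

```python
# Minimal sketch of the location-aware prefetch flow: a data location
# vector table maps each data block's address to the Cache block (bank)
# that currently holds it in the on-chip Cache network. All names and
# structures are illustrative, not taken from the patent text.

class DataLocationVectorTable:
    def __init__(self):
        self._loc = {}            # block address -> Cache bank id

    def record(self, addr, bank):
        """Step 1): record where data fetched on-chip was placed."""
        self._loc[addr] = bank

    def lookup(self, addr):
        """Step 3): find the data's current on-chip location, if any."""
        return self._loc.get(addr)

def prefetch(table, addr, dest_bank, memory):
    """Steps 2)-3): handle a prefetch instruction for `addr`.

    If the data is already on-chip, report the bank holding it; otherwise
    fetch it from `memory` into the destination bank named by the
    instruction and update the location table.
    """
    bank = table.lookup(addr)
    if bank is not None:
        return bank               # already on-chip: no memory access
    _ = memory[addr]              # fetch from off-chip memory (simulated)
    table.record(addr, dest_bank) # update location info (step 3)
    return dest_bank

table = DataLocationVectorTable()
memory = {0x100: b"payload"}
print(prefetch(table, 0x100, dest_bank=5, memory=memory))  # 5 (fetched)
print(prefetch(table, 0x100, dest_bank=9, memory=memory))  # 5 (already on-chip)
```

The second call returns the recorded bank rather than refetching, which is how the lookup step avoids redundant memory accesses and conflicting placements.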



Abstract

The invention discloses a data prefetching method based on location awareness in an on-chip cache network. The method comprises the following steps: 1) after the processor is started, using a data location vector table to record the cache block information, within the on-chip cache network, of data fetched into the processor's on-chip cache network; 2) when a processor core executes a prefetching instruction in a program, jumping to step 3); 3) looking up, in the data location vector table, the cache block information of the prefetching instruction's target data in the on-chip cache network, prefetching the target data and storing it in the cache block designated by the prefetching instruction, and updating the cache block information of the target data in the on-chip cache network. The method realizes prefetching of data in the on-chip cache network of a many-core processor, and has the advantages of a simple implementation principle, high prefetching accuracy, and high flexibility.

Description

Technical Field

[0001] The invention relates to the technical field of memory access for many-core processors, and in particular to a data prefetching method based on location awareness in an on-chip Cache network.

Background

[0002] The "memory wall" refers to the fact that the time for a processor to access memory for instructions and data is much higher than the time to execute instructions within the processor; it is one of the main factors limiting processor performance. At present, with the rapid development of on-chip multi-core and even many-core architectures, the memory wall problem of many-core processors has become more prominent. To alleviate it, multi-level Caches are usually designed on the processor chip. Each processor core of a many-core processor is usually paired with a local Cache, and the on-chip Caches of the many processor cores together constitute the on-chip Cache network. During the working pr...

Claims


Application Information

IPC(8): G06F12/08; G06F12/0862
Inventor 杨灿群李春江王锋黄春杜云飞彭林左克李宽姜浩
Owner NAT UNIV OF DEFENSE TECH