Artificial neural network-based LRU Cache prefetching mechanism performance gain assessment method

A technology based on artificial neural networks and neural network models, applied in the fields of neural learning methods, biological neural network models, neural architectures, etc. It addresses problems such as long simulation cycles and the difficulty of modeling how prefetching affects the stack distance, shortens the evaluation cycle, and enables quick estimation of performance.

Active Publication Date: 2018-09-07
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

The drawback is that the stack distance distribution obtained this way reflects the characteristics of the memory-access instruction stream only at the software-logic level. It therefore cannot be applied directly to modeling Cache behavior once a prefetching mechanism is in use, although it can still be used to predict the Cache miss count before the prefetching mechanism is introduced.
To put it simply, the prefetching mechanism...

Method used



Examples


Embodiment

[0029] As shown in figure 1, the artificial-neural-network-based LRU Cache prefetching mechanism performance gain assessment method of the present invention can be implemented in the following steps:

[0030] (1) Extraction of artificial neural network training set:

[0031] Artificial neural network training requires multiple sets of training data as input in order to train the neuron weight coefficients. The present invention cuts a complete application program into several program fragments and extracts two kinds of information from each fragment: the stack distance distribution before the prefetching mechanism is added, and the number of Cache access misses after the prefetching mechanism is added.
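The first kind of training input, the stack distance distribution, can be computed from a memory-access trace with the classic LRU-stack algorithm. A minimal sketch, assuming a simple list-based stack and an address-per-access trace (the fragment-splitting policy and all names here are illustrative, not taken from the patent):

```python
# Hypothetical sketch: stack distance distribution of an access trace,
# as used to build the training inputs in step (1).
from collections import Counter

def stack_distances(trace):
    """LRU stack distance of each access in `trace`.

    The stack distance of an access is the number of distinct addresses
    touched since the previous access to the same address; first-time
    (cold) accesses are encoded as -1.
    """
    stack = []   # most-recently-used address kept at the end
    dists = []
    for addr in trace:
        if addr in stack:
            pos = stack.index(addr)
            dists.append(len(stack) - 1 - pos)
            stack.pop(pos)           # move to MRU position
        else:
            dists.append(-1)         # cold (compulsory) access
        stack.append(addr)
    return dists

def distance_histogram(trace):
    """Stack distance distribution: distance -> access count."""
    return Counter(stack_distances(trace))

# Example: repeated accesses to a small working set
trace = ["A", "B", "C", "A", "B", "A"]
print(distance_histogram(trace))  # Counter({-1: 3, 2: 2, 1: 1})
```

The second kind of input, the post-prefetch miss count, would come from simulating each fragment with the prefetcher enabled; that part is not sketched here.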

[0032] (2) Selection of artificial neural network topology and neuron weight training method:

[0033] The artificial neural network topology and the neuron weight training method are selected by traversing all th...
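Although the text is truncated, the traversal it describes amounts to sweeping candidate network configurations and keeping the one with the lowest validation error. A hedged sketch of such a sweep, using a one-hidden-layer network trained by batch gradient descent on synthetic stand-in data (the candidate sizes, hyperparameters, and data are all assumptions for illustration):

```python
# Minimal sketch of the topology sweep in step (2): train a network per
# candidate hidden size, keep the one with the lowest validation MSE.
import numpy as np

def train_mlp(X, y, hidden, lr=0.05, epochs=2000, seed=0):
    """Train a 1-hidden-layer MLP (tanh / linear output) by batch GD."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)      # hidden activations
        pred = h @ W2 + b2            # linear output
        err = pred - y
        # backprop for mean-squared-error loss
        gW2 = h.T @ err / len(X)
        gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h ** 2)
        gW1 = X.T @ dh / len(X)
        gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return (W1, b1, W2, b2)

def mse(params, X, y):
    W1, b1, W2, b2 = params
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(((pred - y) ** 2).mean())

# Synthetic stand-in: inputs ~ stack-distance histogram bins,
# target ~ post-prefetch miss count (mildly non-linear).
rng = np.random.default_rng(1)
X = rng.random((80, 4))
y = (X.sum(1, keepdims=True) ** 2) / 4

best = min(
    ((h, train_mlp(X[:60], y[:60], h)) for h in (2, 4, 8)),
    key=lambda hw: mse(hw[1], X[60:], y[60:]),
)
print("best hidden size:", best[0])
```

In practice the patent's traversal presumably also varies the training method itself, not only the layer width; that axis is omitted here for brevity.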



Abstract

The invention discloses an artificial neural network-based LRU Cache prefetching mechanism performance gain assessment method. The method comprises the following steps: selecting neural network training parameters to fit the memory-access stack distance distribution before the prefetching mechanism is introduced against the Cache miss count after it is introduced, so as to construct a neural network model; calculating the stack distance distributions of target programs; feeding the calculated stack distance distributions into the constructed neural network model so as to predict the Cache miss counts of different target programs under the current prefetching mechanism; and calculating the Cache miss count before the prefetching mechanism is introduced from the stack distance distribution, then comparing the predicted miss counts under the current prefetching mechanism against this baseline so as to assess the performance gain of the prefetching mechanism. The method greatly increases the speed at which the performance gains of Cache prefetching mechanisms can be predicted.
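The baseline in the last step can be read directly off the stack distance distribution: for a fully associative LRU cache of capacity C, an access misses exactly when its stack distance is at least C, or when the address was never seen before. A minimal sketch, assuming the histogram maps distance to access count with cold accesses stored under the key -1 (the cache model and function names are illustrative):

```python
# Baseline (no-prefetch) miss count from a stack distance distribution,
# and the gain comparison described in the abstract.
def lru_misses(hist, capacity):
    """Miss count of a fully associative LRU cache of size `capacity`."""
    return sum(n for d, n in hist.items() if d == -1 or d >= capacity)

def prefetch_gain(hist, capacity, predicted_misses_with_prefetch):
    """Relative miss reduction attributed to the prefetcher."""
    baseline = lru_misses(hist, capacity)
    return (baseline - predicted_misses_with_prefetch) / baseline

hist = {-1: 3, 1: 40, 2: 30, 4: 20, 8: 7}    # illustrative distribution
print(lru_misses(hist, capacity=4))           # 30 = 3 cold + 20 + 7
print(round(prefetch_gain(hist, 4, 18), 2))   # 0.4
```

Here the prefetch-enabled miss count (18 in the example) would be the neural network's prediction; only the baseline comes from the closed-form LRU property.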

Description

Technical field

[0001] The invention belongs to the technical field of computer architecture and modeling, and in particular relates to a method for evaluating the performance benefit of an LRU Cache prefetching mechanism based on an artificial neural network.

Background technique

[0002] Pre-silicon architecture evaluation and design-space exploration based on hardware behavior modeling can guide chip design and shorten chip design iteration cycles. In modern processor architectures, the introduction of an on-chip cache (Cache) speeds up memory access and improves CPU efficiency. However, Cache misses introduce bubbles into the processor pipeline and can even stall it, reducing the processor's computing performance. To improve the Cache hit rate, Cache designs introduce instruction or data prefetching, which moves content likely to be accessed in the future into the Cache ahead of time....

Claims


Application Information

IPC(8): G06F17/50, G06N3/04, G06N3/08
CPC: G06N3/08, G06F30/367, G06N3/044
Inventors: 凌明, 季柯丞, 张凌峰, 李宽, 时龙兴
Owner SOUTHEAST UNIV