Constant Cache capable of supporting pipeline operation

A pipelining technique applied to a constant Cache, in the field of computer hardware.

Inactive Publication Date: 2018-06-08
XIAN AVIATION COMPUTING TECH RES INST OF AVIATION IND CORP OF CHINA

AI Technical Summary

Problems solved by technology

If an access hits in the cache, retrieving the data takes only one or two processor cycles; if the required data is not in the cache, retrieval often takes an order of magnitude more processor cycles.
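The impact of this gap on overall performance can be made concrete with the standard average-memory-access-time (AMAT) formula. The cycle counts below are illustrative assumptions consistent with the text, not figures from the patent:

```python
def amat(hit_time, miss_rate, miss_penalty):
    """Average memory access time in cycles: the hit time plus
    the miss rate weighted by the extra miss penalty."""
    return hit_time + miss_rate * miss_penalty

# Illustrative: a 2-cycle hit and a 20-cycle (order-of-magnitude larger)
# miss penalty; even a 5% miss rate adds a full cycle on average.
print(amat(hit_time=2, miss_rate=0.05, miss_penalty=20))  # 3.0
```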




Embodiment

[0027] As shown in Figure 1, when the constant Cache receives a parameter request, the tag control module compares the request address with the mapping addresses saved in the tag registers. On a hit, the data is read directly from the memory module and returned to the requester. On a miss, the request is saved to the FIFO. Once the FIFO module is detected to be non-empty, the tag register module reads the FIFO and performs a second comparison. If the request still misses, a read request to the parameter space is generated; after the data is read back, the update address generated by the PLRU algorithm is used to update the corresponding memory and tag registers. Throughout this process the Cache can continue to receive parameter requests, and any further misses are stored in the FIFO to await processing.
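The flow in [0027] can be sketched as a small behavioral model. This is not the patent's hardware design: the fully associative lookup, the dict standing in for the parameter space, and the 1-bit MRU replacement scheme are assumptions made for illustration.

```python
from collections import deque

class ConstantCache:
    """Behavioral sketch of the constant Cache: first tag compare,
    miss FIFO, second compare on FIFO drain, PLRU-style fill."""

    def __init__(self, num_blocks, param_space):
        self.tags = [None] * num_blocks   # tag registers (mapped addresses)
        self.data = [None] * num_blocks   # memory module contents
        self.mru = [False] * num_blocks   # MRU bits driving the victim choice
        self.miss_fifo = deque()          # FIFO of missed parameter addresses
        self.param_space = param_space    # simulated parameter space

    def _touch(self, way):
        self.mru[way] = True
        if all(self.mru):                 # keep at least one victim candidate
            self.mru = [i == way for i in range(len(self.mru))]

    def request(self, addr):
        """First tag compare. On a hit the data returns immediately; on a
        miss the address is queued so the Cache keeps accepting requests."""
        if addr in self.tags:
            way = self.tags.index(addr)
            self._touch(way)
            return self.data[way]
        self.miss_fifo.append(addr)
        return None

    def process_misses(self):
        """Drain the FIFO: second tag compare, then fetch from the
        parameter space and update tag and memory on a true miss."""
        results = {}
        while self.miss_fifo:
            addr = self.miss_fifo.popleft()
            if addr in self.tags:         # filled by an earlier queued miss
                way = self.tags.index(addr)
            else:
                way = self.mru.index(False)              # PLRU update address
                self.tags[way] = addr                    # update tag register
                self.data[way] = self.param_space[addr]  # update memory module
            self._touch(way)
            results[addr] = self.data[way]
        return results
```

A first `request` for an address misses and returns nothing, `process_misses` performs the second compare and fill, and a repeated `request` then hits directly, mirroring the two-stage handling described above.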

[0028] For the PLRU module, an MRU (Most Recently Used) bit is used to mark the historical access status of each Cache block ...
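The MRU-bit scheme in [0028] can be sketched as a generic 1-bit pseudo-LRU policy. Since the paragraph is truncated, everything beyond "one MRU bit per block" is an assumption based on the common form of this technique:

```python
class MRUBitPLRU:
    """1-bit pseudo-LRU: one MRU bit per Cache block records whether the
    block was recently used. The victim is a block whose bit is clear;
    when setting a bit would make all bits 1, the other bits are reset
    so at least one candidate victim always remains."""

    def __init__(self, num_blocks):
        self.mru = [False] * num_blocks

    def touch(self, way):
        self.mru[way] = True
        if all(self.mru):
            self.mru = [i == way for i in range(len(self.mru))]

    def victim(self):
        return self.mru.index(False)

# Touch blocks 0..2 of a 4-block set; block 3 is the replacement candidate.
plru = MRUBitPLRU(4)
for way in (0, 1, 2):
    plru.touch(way)
print(plru.victim())  # 3
```

Unlike true LRU, this tracks only a single recency bit per block, which keeps the update logic cheap enough to sit in the Cache's fill path.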



Abstract

The invention relates to the technical field of computer hardware, in particular to a constant Cache capable of supporting pipeline operation. The constant Cache comprises a PLRU algorithm module 2, a label control module 1, a FIFO module 3 and a memory module 4. The PLRU algorithm module 2 generates the address of the Cache block to be updated according to the historical access information of each Cache block and sends it to the label control module 1. The label control module 1 determines whether the current cache request hits and sends the judgment result to the PLRU algorithm module 2; if the request hits, the data mapped from memory to the constant Cache is read directly from the memory module 4 and returned to the request initiator; if the request misses, the parameter address is buffered in the FIFO module 3, and after the data is retrieved from memory it is written into the memory module 4 at the Cache block address to be updated, whereupon the parameter address is read out of the FIFO module 3 and processed. The FIFO module 3 stores parameter addresses that miss; the memory module 4 stores the data mapped to the constant Cache, to be read out when the constant Cache hits.

Description

Technical Field

[0001] The invention relates to the technical field of computer hardware, in particular to a constant Cache supporting pipelining.

Background Technique

[0002] With the rapid development of computer systems, modern computers are increasingly limited by the performance of main memory. Processor performance improves at roughly 60% per year, while the bandwidth of main-memory chips improves at only about 10% per year; in speed, main memory has consistently lagged the processor by about an order of magnitude. As the speed gap between the CPU and memory grows, the Cache appears between the CPU and main memory: its access speed is close to the CPU's, but its capacity is small and its cost is high.

[0003] The emergence of the Cache is a necessary compromise, and it has become one of the key factors affecting system performance. Embedded systems are now becoming an important part of the compute...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F12/0871; G06F9/38; G06F5/06
CPC: G06F5/06; G06F9/3867; G06F12/0871
Inventors: 牛少平, 魏艳艳, 韩一鹏, 郝冲, 邓艺
Owner XIAN AVIATION COMPUTING TECH RES INST OF AVIATION IND CORP OF CHINA