
Cache processing method and protocol processor cache control unit

A protocol processor and cache technology applied in the computer field. It solves the problems of large system delay and low processing efficiency, and achieves the effects of improved efficiency, improved throughput, and elimination of the cache access delay problem.

Active Publication Date: 2013-05-01
INSPUR BEIJING ELECTRONICS INFORMATION IND

AI Technical Summary

Problems solved by technology

[0005] The technical problem to be solved by the present invention is to provide a cache processing method and a protocol processor cache control unit that solve the problems of low processing efficiency and large system delay caused by the asynchronous design of the cache control unit and the protocol processing pipeline in the prior art.



Examples


Embodiment 1

[0053] As shown in Figure 1, the protocol processor cache control unit includes a parsing and scheduling module 101, a tag array module 102, a data array module 103, and an interface communication module 104. The tag array module 102 and the data array module 103 have the same number of ways, namely N ways, where N is a positive integer, for example N = 8.
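To make this organization concrete, the following C sketch models an N-way tag array and data array of matching geometry. The constants (WAYS, SETS, LINE_BYTES) and all type names are illustrative assumptions for the example, not values taken from the patent.

```c
#include <stdbool.h>
#include <stdint.h>

#define WAYS       8    /* N ways, e.g. N = 8 as in this embodiment   */
#define SETS       128  /* hypothetical number of sets in each way    */
#define LINE_BYTES 64   /* hypothetical cache line size               */

/* One entry of the tag array module (102): a valid bit plus the address
 * tag that is compared against the decoded address. */
typedef struct {
    bool     valid;
    uint32_t tag;
} tag_entry_t;

/* One entry of the data array module (103): the cached line itself. */
typedef struct {
    uint8_t bytes[LINE_BYTES];
} data_entry_t;

/* The tag and data arrays have the same number of ways, so a lookup can
 * probe all WAYS tags of one set in parallel and gate the matching way
 * out of the data array. */
typedef struct {
    tag_entry_t  tags[SETS][WAYS];
    data_entry_t data[SETS][WAYS];
} cache_arrays_t;
```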

[0054] The parsing and scheduling module 101 is used to receive instructions from different sources and, after dispatching and arbitrating them, to grant processing authority to one or more of the instructions. When the instruction granted processing authority is a pipeline cache access instruction, it is parsed; the decoded data is sent to the tag array module 102, and the decoded data and the decoded address are sent to the tag array module 102 and the data array module 103. It is also used to send the data to be written when the pipeline cache access instruction is a wr...
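As a rough, non-authoritative illustration of this dispatch step, the C sketch below arbitrates among a few assumed instruction sources, grants processing authority to one pending request, and decodes its address into the set index and tag that would be driven to the tag and data arrays. The source names, the fixed-priority policy, and the decode parameters are assumptions made for the example only.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical instruction sources competing for processing authority. */
typedef enum { SRC_PIPELINE, SRC_BACKFILL, SRC_INVALIDATE, SRC_COUNT } source_t;

typedef struct {
    bool     valid;     /* a request from this source is pending      */
    bool     is_write;  /* write vs. read pipeline cache access       */
    uint64_t address;   /* address carried by the instruction         */
} request_t;

/* Fixed-priority arbiter: grant processing authority to the first pending
 * request. The real unit may use a different arbitration policy; this is
 * only a placeholder. Returns the winning source, or -1 if none. */
static int grant_authority(const request_t req[SRC_COUNT])
{
    for (int s = 0; s < SRC_COUNT; s++)
        if (req[s].valid)
            return s;
    return -1;
}

/* Decode the address of the granted instruction into the pieces driven to
 * the arrays: the set index selects a row in both the tag array and the
 * data array, while the tag is compared inside the tag array. */
static void decode_address(uint64_t addr, size_t line_bytes, size_t sets,
                           uint32_t *set, uint32_t *tag)
{
    *set = (uint32_t)((addr / line_bytes) % sets);
    *tag = (uint32_t)(addr / (line_bytes * sets));
}
```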

Embodiment 2

[0064] As shown in Figure 2, the protocol processor cache control unit includes, in addition to the parsing and scheduling module 101, tag array module 102, data array module 103, and interface communication module 104 described in Embodiment 1, a backfill module 105 and an invalidation cache module 106.

[0065] During backfill processing:

[0066] The interface communication module 104 is also configured to notify the backfill module 105 after receiving the data response from the lower-level storage.

[0067] The backfill module 105 is used for initiating a cache backfill instruction to the parsing and scheduling module 101.

[0068] The parsing and scheduling module 101 is also used to execute the backfill instruction and perform the cache backfill when processing authority is granted to the backfill instruction, and to place the backfill instruction into the backfill instruction suspension queue when processing authority is not granted. After assig...
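The following C sketch illustrates, under assumed names and a hypothetical queue depth, one way the backfill instruction suspension queue described in [0068] could behave: a backfill instruction that wins processing authority is executed immediately, otherwise it is parked in a small FIFO and retried in a later cycle. Nothing here is taken verbatim from the patent; do_backfill() merely stands in for the actual tag/data array update.

```c
#include <stdbool.h>
#include <stdint.h>

#define SUSPEND_DEPTH 4   /* hypothetical depth of the suspension queue */

typedef struct {
    uint64_t address;     /* line address returned by lower-level storage */
} backfill_instr_t;

/* Backfill instruction suspension queue: a small FIFO holding backfill
 * instructions that did not win processing authority this cycle. */
typedef struct {
    backfill_instr_t slot[SUSPEND_DEPTH];
    int head, count;
} suspend_queue_t;

static bool suspend_push(suspend_queue_t *q, backfill_instr_t in)
{
    if (q->count == SUSPEND_DEPTH)
        return false;                 /* queue full, caller must stall */
    q->slot[(q->head + q->count) % SUSPEND_DEPTH] = in;
    q->count++;
    return true;
}

static bool suspend_pop(suspend_queue_t *q, backfill_instr_t *out)
{
    if (q->count == 0)
        return false;
    *out = q->slot[q->head];
    q->head = (q->head + 1) % SUSPEND_DEPTH;
    q->count--;
    return true;
}

/* Per-cycle handling of one backfill instruction: execute it if it was
 * granted processing authority, otherwise park it in the suspension queue. */
static void handle_backfill(suspend_queue_t *q, backfill_instr_t in,
                            bool granted, void (*do_backfill)(backfill_instr_t))
{
    if (granted)
        do_backfill(in);
    else
        (void)suspend_push(q, in);
}
```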



Abstract

The invention discloses a cache processing method and a protocol processor cache control unit. The cache control unit comprises a parsing and scheduling module, a tag array module, a data array module, and an interface communication module. The parsing and scheduling module is used for transmitting decoded data to the tag array module and transmitting the decoded data and a decoded address to the tag array module and the data array module; the tag array module is used for determining, from the decoded data, which way is hit and outputting this as hit information; the data array module is used for gating the data of the way indicated by the hit information; and the interface communication module is used for forwarding the hit-success information, either alone or together with the data received from the data array module. With the cache processing method and the protocol processor cache control unit, the conventional asynchronous operation mode is replaced by a synchronous operation mode of the cache control unit and the protocol processing pipeline, eliminating the cache access delay caused by the asynchronous operation mode.
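To illustrate the hit path sketched in the abstract, the following C fragment shows a tag comparison across all ways of one set (parallel in hardware, modelled here as a loop) that yields the hit-way information, and a data-gating step that selects the data of the hit way for the interface communication module to forward. All sizes and names are assumptions made for this example.

```c
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

#define WAYS       8
#define LINE_BYTES 64

typedef struct { bool valid; uint32_t tag; } tag_entry_t;
typedef struct { uint8_t bytes[LINE_BYTES]; } data_entry_t;

/* Hit information produced by the tag array module: whether any way hit
 * and which way it was. */
typedef struct {
    bool hit;
    int  way;
} hit_info_t;

/* Tag array side: compare the decoded tag against every way of the
 * selected set and report the hit way. */
static hit_info_t tag_lookup(const tag_entry_t set_tags[WAYS], uint32_t tag)
{
    hit_info_t h = { false, -1 };
    for (int w = 0; w < WAYS; w++) {
        if (set_tags[w].valid && set_tags[w].tag == tag) {
            h.hit = true;
            h.way = w;
            break;
        }
    }
    return h;
}

/* Data array side: gate out the data of the hit way so the interface
 * communication module can forward it together with the hit information.
 * Returns true when there was data to forward. */
static bool data_gate(const data_entry_t set_data[WAYS], hit_info_t h,
                      uint8_t out[LINE_BYTES])
{
    if (!h.hit)
        return false;
    memcpy(out, set_data[h.way].bytes, LINE_BYTES);
    return true;
}
```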

Description

Technical field

[0001] The invention relates to the fields of computer and integrated circuit design, and in particular to a cache processing method and a protocol processor cache control unit.

Background technique

[0002] A cache usually refers to a high-speed, small-capacity memory placed between the processor and the main memory; its access speed is much faster than that of the main memory and matches the access speed of the processor. A cache is usually implemented with Static Random Access Memory (SRAM). Compared with Dynamic Random Access Memory (DRAM), SRAM has the advantage of speed, but the disadvantages of high cost and large area. The cache, whose capacity is only a few percent of that of the main memory, maps the contents of a portion of the main memory address space. When the address accessed by the processor falls within this mapping, the processor can operate directly on the cache, eliminating the step of accessing the main memory, so the processing speed of ...

Claims


Application Information

IPC(8): G06F12/08, G06F15/76, G06F12/0877
Inventor: 周恒钊, 陈继承
Owner: INSPUR BEIJING ELECTRONICS INFORMATION IND