Design method for Cache control unit of protocol processor

A protocol processor Cache control unit technology, applied in the computer field, which addresses problems such as reduced protocol processing efficiency, inability to operate on data synchronously, and increased system delay, achieving a fully pipelined processing flow, improved throughput, and reduced blocking.

Inactive Publication Date: 2014-02-19
LANGCHAO ELECTRONIC INFORMATION IND CO LTD

Problems solved by technology

The disadvantage of this implementation is that, regardless of whether the Cache hits, the protocol processing pipeline cannot operate synchronously on the data in the Cache.




Embodiment Construction

[0031] The method of the present invention is described in detail below with reference to the accompanying drawings.

[0032] Figure 1 depicts the functional module division of the protocol processing pipeline Cache control unit. The scheduling module (Command Scheduler) receives Cache access instructions from five different sources (pipeline access commands, the pipeline command pending queue, backfill commands, the backfill command pending queue, and invalidation buffer commands) and arbitrates and schedules them. The invalidation buffer module (Miss Buffer) buffers missed (Miss) instructions until a refill reactivates them. The backfill queue (Fill-back Queue) stores backfill commands that have not yet obtained processing authority, waiting for the conflict to clear so they can be reprocessed. The tag array (Tag Array) indexes the multi-way set-associative Cache and computes the hit information and the way-selection signal of the ...
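The arbitration among the five command sources can be sketched as follows. This is a minimal illustrative model only: the patent does not disclose the arbitration policy, so the fixed priority order, the class name `CommandScheduler`, and the source labels are all assumptions for the sake of the example.

```python
from collections import deque

# Hypothetical fixed-priority order among the five command sources
# named in paragraph [0032]; the actual policy is not disclosed.
SOURCES = [
    "backfill_pending_queue",   # backfill commands awaiting access rights
    "backfill",                 # new backfill (refill) commands
    "miss_buffer",              # missed instructions reactivated by a refill
    "pipeline_pending_queue",   # suspended pipeline commands
    "pipeline",                 # new pipeline access commands
]

class CommandScheduler:
    """Arbitrates Cache access commands from five sources, one per cycle."""

    def __init__(self):
        self.queues = {src: deque() for src in SOURCES}

    def submit(self, source, command):
        self.queues[source].append(command)

    def schedule(self):
        """Grant the Cache to the highest-priority non-empty source."""
        for src in SOURCES:
            if self.queues[src]:
                return src, self.queues[src].popleft()
        return None, None

sched = CommandScheduler()
sched.submit("pipeline", "READ 0x1000")
sched.submit("backfill", "FILL 0x2000")
src, cmd = sched.schedule()  # backfill outranks pipeline in this sketch
```

Giving backfill traffic priority over new pipeline accesses is one plausible choice (it frees the Miss Buffer sooner), but any priority scheme fits the module division described above.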



Abstract

The invention provides a design method for a Cache control unit of a protocol processor. The Cache control unit controls access by the protocol processor in a CC-NUMA system to the cache. By dispatching and suspending different Cache access instructions, the Cache read/write timing is fully synchronized with the protocol processor pipeline, so that seamless cache access and fully pipelined protocol message processing are achieved when the Cache is idle or hits. The system is divided into several sub-modules by function: a pipeline instruction pending queue, a scheduling module, a tag array, a data array, a backfill module, an interface communication module, and a miss buffer. Synchronous Cache operation replaces asynchronous operation, removing the Cache access delay that asynchronous operation causes. When the cache hits, the protocol processing pipeline can operate on data in the Cache synchronously, improving protocol processing efficiency.

Description

Technical Field

[0001] The invention relates to the fields of computer and integrated circuit design, and in particular to a design method for a protocol processor Cache control unit.

Background Technique

[0002] A Cache, i.e. a high-speed cache, usually refers to a small-capacity high-speed memory placed between the processor and main memory; its access speed is much faster than main memory's and matches the access speed of the processor. Caches are usually implemented with static random access memory (SRAM). Compared with dynamic random access memory (DRAM), SRAM is faster, but its disadvantages are higher cost and larger area. With only a few percent of main memory's capacity, the Cache maps the contents of part of the main memory address space. When the address accessed by the processor falls within this mapping, the processor can operate on the Cache directly, eliminating the step of accessing main memory, so the processing speed of the compute...
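The address-mapping and hit check described above, together with the tag-array lookup from paragraph [0032], can be sketched for a multi-way set-associative Cache. The geometry (4 ways, 64 sets, 64-byte lines) is an illustrative assumption, not taken from the patent.

```python
# Illustrative set-associative tag lookup. Sizes are assumptions:
# 4 ways x 64 sets x 64-byte lines.
NUM_WAYS, NUM_SETS, LINE_BYTES = 4, 64, 64

# tag_array[set_index][way] holds a (valid, tag) pair.
tag_array = [[(False, 0)] * NUM_WAYS for _ in range(NUM_SETS)]

def split_address(addr):
    """Split a physical address into (tag, set index, line offset)."""
    offset = addr % LINE_BYTES
    set_index = (addr // LINE_BYTES) % NUM_SETS
    tag = addr // (LINE_BYTES * NUM_SETS)
    return tag, set_index, offset

def lookup(addr):
    """Return (hit, way): the hit flag and the way-selection signal."""
    tag, set_index, _ = split_address(addr)
    for way in range(NUM_WAYS):
        valid, stored_tag = tag_array[set_index][way]
        if valid and stored_tag == tag:
            return True, way
    return False, None

# Install the line for address 0x12345 in way 2, then look it up.
tag, set_index, _ = split_address(0x12345)
tag_array[set_index][2] = (True, tag)
hit, way = lookup(0x12345)      # hit in way 2
miss, _ = lookup(0x54321)       # no matching valid tag -> miss
```

On a hit the pipeline can operate on the Cache data in the same flow; on a miss the command would be parked in the Miss Buffer until a refill installs the line, as the description above explains.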


Application Information

IPC(8): G06F12/08; G06F12/0815; G06F12/0844
Inventors: 周恒钊, 陈继承
Owner LANGCHAO ELECTRONIC INFORMATION IND CO LTD