
Queue buffer management method, system, storage medium, computer equipment and application

A cache-management and queueing technology in the field of data exchange. It addresses problems such as increased design difficulty, complex cache management, and added overhead, and achieves effective caching with high-speed forwarding, simple management, and improved overall system speed.

Active Publication Date: 2022-06-21
XIDIAN UNIV
Cites: 7 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0002] At present, on-chip storage resources are usually used to store data frames in order to increase the switching rate, but on-chip resources are scarce. Because the queue management module of a switching system must manage a large number of queues, and data frames of different queues must not be stored together out of order, frames of different queues (and even different priorities) must be stored in separate RAM areas. On-chip storage is limited, however, and on-chip Block RAM comes in fixed specifications (36 Kb, 18 Kb), so instantiating many separate RAMs produces severe internal storage fragmentation.
[0003] A better method is to divide the Block RAM resources into relatively small fixed-size areas, for example 64 bytes (the minimum Ethernet frame length). The idea is that when a data frame applies for cache allocation, the queue management module splits it into several 64-byte fragments for storage; a final fragment shorter than 64 bytes still occupies a complete fragment, so in the worst case a stored frame wastes 63 bytes of internal fragmentation. Dividing the storage into even smaller areas to reduce this fragmentation, however, would both increase the linked-list overhead for managing the storage areas and lengthen the cache-application steps on enqueue, reducing the overall speed of the system.
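The segmentation arithmetic described above can be sketched as follows. This is an illustrative calculation, not code from the patent; the 64-byte segment size follows the minimum Ethernet frame length mentioned in the text:

```python
import math

SEGMENT_BYTES = 64  # segment size = minimum Ethernet frame length

def segments_needed(frame_len: int) -> int:
    """Number of fixed-size segments a frame of frame_len bytes occupies."""
    return math.ceil(frame_len / SEGMENT_BYTES)

def internal_fragmentation(frame_len: int) -> int:
    """Bytes wasted in the last, partially filled segment."""
    return segments_needed(frame_len) * SEGMENT_BYTES - frame_len

# Worst case: a frame one byte past a segment boundary wastes 63 bytes.
```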
[0004] The difficulty in solving the above problems and defects is this: if a queue management scheme based on fixed-length fragments with random frame lengths is adopted, the fragments of the entire storage area must be connected as linked lists, which incurs additional overhead for managing the storage-area lists. Moreover, the number of fixed-length fragments a random frame occupies is unpredictable, so a frame about to enter the queue may need multiple cache applications, which limits the switching rate of the entire system. Correct forwarding of data frames then requires not only a complex queue management mechanism but also complex cache management and cache lookup, greatly increasing the difficulty of the overall design.
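To make the linked-list overhead concrete, here is a hypothetical free-list allocator for fixed-size segments; the per-segment `next` table and the segment-by-segment allocation loop correspond to the overhead and repeated cache applications the paragraph describes. This is a sketch for illustration, not the patent's mechanism:

```python
from collections import deque

class SegmentPool:
    """Hypothetical free-list allocator for fixed-size cache segments.

    A frame is stored as a chain of segment indices; the per-segment
    'next' pointer table is the linked-list overhead discussed above.
    """

    def __init__(self, n_segments: int):
        self.free = deque(range(n_segments))   # free-segment list
        self.next = [-1] * n_segments          # next-segment pointer per segment

    def alloc_frame(self, n_needed: int):
        """Allocate a chain of n_needed segments; one 'cache application'
        per segment. Returns the head index, or None if the pool is short."""
        if len(self.free) < n_needed:
            return None
        head = prev = self.free.popleft()
        for _ in range(n_needed - 1):
            seg = self.free.popleft()
            self.next[prev] = seg
            prev = seg
        self.next[prev] = -1
        return head

    def free_frame(self, head: int):
        """Walk the chain and return every segment to the free list."""
        while head != -1:
            nxt = self.next[head]
            self.next[head] = -1
            self.free.append(head)
            head = nxt
```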

Method used




Embodiment Construction

[0074] In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention and are not intended to limit it.

[0075] In view of the problems existing in the prior art, the present invention provides a queue cache management method, system, storage medium, computer device and application. The present invention is described in detail below with reference to the accompanying drawings.

[0076] As shown in figure 1, the queue cache management method provided by the present invention includes the following steps:

[0077] S101: Frame variable-length data frames into fixed-length frames of a fixed number of bytes, and initiate a request to the queue cache management module to apply for entry into the queue. Since the ...
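The framing step S101 can be sketched as splitting a variable-length frame into fixed-length cells and zero-padding the last one. The 64-byte cell size is an assumption for illustration; the patent text does not state the exact fixed length, and real hardware would also record the original frame length so the padding can be stripped on dequeue:

```python
def split_into_cells(frame: bytes, cell_len: int = 64) -> list[bytes]:
    """Hypothetical sketch of step S101: split a variable-length frame
    into fixed-length cells, zero-padding the final partial cell."""
    cells = []
    for i in range(0, len(frame), cell_len):
        chunk = frame[i:i + cell_len]
        cells.append(chunk + bytes(cell_len - len(chunk)))  # pad to cell_len
    return cells
```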



Abstract

The invention belongs to the technical field of data exchange and discloses a queue buffer management method, system, storage medium, computer equipment and application. Data frames that have undergone flow classification and packet processing are framed before entering the queue buffer management module: variable-length Ethernet data frames are framed into fixed-length frames of a fixed number of bytes, and peripheral control logic is added to the on-chip Block RAM to realize a configurable multi-channel FIFO queue that stores the fixed-length frames. The invention uses a whole block of Block RAM to store the fixed-length frames of different queues, configurably presenting the storage area to the outside either as a single block of RAM or as multiple FIFO queues, and selects a suitable storage scheme for each area. This improves the utilization of storage resources and the processing and forwarding efficiency of data frames. The invention avoids the generation of internal fragments as far as possible, improves the overall speed of the system, and greatly improves the utilization rate of storage resources.
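The "one shared RAM presented as multiple configurable FIFO queues" idea from the abstract can be sketched as below. This is a behavioral model under assumptions, not the patent's hardware design: region sizes are fixed at construction, and each queue gets a circular-buffer region of the shared storage array:

```python
class MultiChannelFIFO:
    """Hypothetical model of a shared RAM partitioned into per-queue
    FIFO regions, each operated as a circular buffer."""

    def __init__(self, region_sizes: list[int]):
        # Per-queue region descriptor: base address, size, head, tail, count.
        self.regions = []
        base = 0
        for size in region_sizes:
            self.regions.append(
                {"base": base, "size": size, "head": 0, "tail": 0, "count": 0}
            )
            base += size
        self.ram = [None] * base  # one shared storage array for all queues

    def push(self, q: int, word) -> bool:
        """Enqueue a word into queue q; False if that region is full."""
        r = self.regions[q]
        if r["count"] == r["size"]:
            return False
        self.ram[r["base"] + r["tail"]] = word
        r["tail"] = (r["tail"] + 1) % r["size"]
        r["count"] += 1
        return True

    def pop(self, q: int):
        """Dequeue the oldest word from queue q; None if empty."""
        r = self.regions[q]
        if r["count"] == 0:
            return None
        word = self.ram[r["base"] + r["head"]]
        r["head"] = (r["head"] + 1) % r["size"]
        r["count"] -= 1
        return word
```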

Description

Technical field

[0001] The invention belongs to the technical field of data exchange, and in particular relates to a queue cache management method, system, storage medium, computer equipment and application.

Background technique

[0002] At present, on-chip storage resources are usually used to store data frames in order to improve the switching rate, but on-chip resources are scarce. Because the queue management module of the switching system must manage a large number of queues, and data frames from different queues must not be stored together out of order, frames of different queues (or even different priorities) must be stored in separate RAM areas. On-chip storage is limited, however, and on-chip Block RAM has fixed specifications (36 Kb, 18 Kb), so instantiating multiple RAMs produces severe internal memory fragmentation. A better method is to divide the Block RAM resource into a ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F15/78; G06F16/901
CPC: G06F15/781; G06F15/7846; G06F16/9024
Inventors: 潘伟涛, 韩冰, 邱智亮, 高志凯, 熊子豪
Owner: XIDIAN UNIV