
Flow-aware switch shared cache scheduling method and device

A shared-cache, traffic-aware technology applied in data exchange networks, digital transmission systems, electrical components, and similar fields. It addresses the problems that existing schemes lack port-level traffic awareness and that burst traffic cannot obtain sufficient cache resources, with the effect of improving cache usage efficiency.

Pending Publication Date: 2021-10-15
TSINGHUA UNIV
Cites: 0 | Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Because existing dynamic threshold schemes lack port traffic awareness, long-lived traffic that does not need cache resources can occupy the cache for extended periods, leaving insufficient cache for the burst traffic that actually requires it.
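
For context, the dynamic threshold scheme criticized here is conventionally the Choudhury–Hahne policy, under which every output port is governed by one shared control value. A minimal statement in our own notation (not taken from the patent text):

    T(t) = α · (B − Q(t))

where B is the total shared buffer size, Q(t) is the amount of buffer occupied at time t, and α is a fixed tuning constant; a port may keep enqueuing only while its queue length stays below T(t). Since T(t) is identical for all ports regardless of their traffic patterns, a long-lived flow and a short burst are treated the same, which is precisely the missing traffic awareness noted above.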

Method used


Image

  • Flow-aware switch shared cache scheduling method and device (three drawings accompany the application)

Examples


Embodiment Construction

[0027] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0028] The flow-aware switch shared cache scheduling method and device proposed according to embodiments of the present invention are described below with reference to the accompanying drawings.

[0029] Firstly, a flow-aware switch shared cache scheduling method proposed according to an embodiment of the present invention will be described with reference to the accompanying drawings.

[0030] Figure 1 is a flowchart of a flow-aware switch shared cache scheduling method according to an embodiment of the present invention...



Abstract

The invention discloses a flow-aware switch shared cache scheduling method and device. The method comprises the following steps: while the switch is running, monitoring in real time whether packet enqueue, packet dequeue, packet loss, cache overflow, or queue state change events occur at each output port of the switch; judging from the port events whether the port's traffic state is light load, heavy load, or overload; determining from the port traffic state whether the port's current control state is the common state, the absorption state, or the emptying state; and, based on the port control state, adjusting the cache threshold of the switch port, thereby changing the upper limit of cache the port may use. By monitoring port traffic in real time, different ports can be managed differentially according to their needs, which effectively improves the usage efficiency of the switch's shared cache.
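
The abstract describes a four-step control loop per output port. To make that structure concrete, the following is a minimal Python sketch; the event-to-state classification rules, the state transitions, and the per-state multipliers are our illustrative assumptions, as the excerpt does not disclose the patent's actual rules:

    from enum import Enum, auto

    class TrafficState(Enum):
        LIGHT_LOAD = auto()
        HEAVY_LOAD = auto()
        OVERLOAD = auto()

    class ControlState(Enum):
        NORMAL = auto()      # the abstract's "common state"
        ABSORBING = auto()   # "absorption state"
        EVACUATING = auto()  # "emptying state"

    def classify_traffic(enqueues: int, dequeues: int, drops: int) -> TrafficState:
        """Step 2: judge light load / heavy load / overload from the port
        events of step 1. These rules are placeholders, not the patent's."""
        if drops > 0:                    # packet loss or cache overflow observed
            return TrafficState.OVERLOAD
        if enqueues > dequeues:          # queue is building up
            return TrafficState.HEAVY_LOAD
        return TrafficState.LIGHT_LOAD

    def update_control(traffic: TrafficState) -> ControlState:
        """Step 3: map the port traffic state to a control state."""
        if traffic is TrafficState.OVERLOAD:
            return ControlState.EVACUATING
        if traffic is TrafficState.HEAVY_LOAD:
            return ControlState.ABSORBING
        return ControlState.NORMAL

    def port_threshold(control: ControlState, total_buffer: int,
                       occupied: int, alpha: float = 1.0) -> float:
        """Step 4: adjust the upper limit of cache the port may use.
        Built on a dynamic-threshold baseline T = alpha * (B - Q); the
        per-state multipliers are assumptions for illustration."""
        base = alpha * (total_buffer - occupied)
        if control is ControlState.ABSORBING:
            return 2.0 * base            # let a burst use more shared cache
        if control is ControlState.EVACUATING:
            return 0.5 * base            # reclaim cache from a congested port
        return base

    # One control iteration for a port that saw 120 enqueues, 80 dequeues and
    # no drops, with a 12 MB shared buffer of which 4 MB is occupied:
    state = update_control(classify_traffic(enqueues=120, dequeues=80, drops=0))
    limit = port_threshold(state, total_buffer=12_000_000, occupied=4_000_000)

In this sketch the absorption state temporarily raises a port's share of the cache so a burst can be absorbed, while the emptying state reclaims cache from a persistently congested port; this per-port differential treatment is what the abstract credits for the improved usage efficiency of the shared cache.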

Description

Technical Field

[0001] The present invention relates to the technical field of switch shared caches, and in particular to a flow-aware switch shared cache scheduling method and device.

Background Technique

[0002] In computer networking, switch buffers absorb uneven traffic arriving at switch ports. To improve cache usage efficiency, commonly used commercial switches employ on-chip shared memory: all output ports of the same switch share a unified buffer area, and different ports statistically multiplex this shared area. Because all ports share the cache, when some ports of the switch are heavily loaded, they may occupy the entire cache while other ports cannot use it at all, resulting in unfairness among ports. To avoid this unfairness, the shared cache must be managed through a shared cache scheduling policy.

[0003] The current mainstream shared cache ...
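
To illustrate what a shared cache scheduling policy does at the packet level, here is a hedged sketch of threshold-based admission control; the function name and parameters are ours for illustration and are not taken from the patent:

    def admit(packet_len: int, queue_len: int, occupied: int,
              total_buffer: int, alpha: float = 1.0) -> bool:
        """Accept a packet into a port's queue only if (a) the shared buffer
        still has room and (b) the port's queue stays below its dynamic
        threshold. Check (a) alone is complete sharing, under which one
        heavily loaded port can occupy the entire buffer."""
        if occupied + packet_len > total_buffer:
            return False                               # hard overflow: drop
        threshold = alpha * (total_buffer - occupied)  # dynamic threshold
        return queue_len + packet_len <= threshold

Compared with complete sharing, the threshold check keeps some buffer in reserve for other ports; the patent's contribution, per the abstract, is to make this threshold react to each port's observed traffic rather than being uniform across ports.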

Claims


Application Information

IPC(8): H04L12/861; H04L12/935; H04L49/111
CPC: H04L49/90; H04L49/30; H04L47/122; H04L47/125; H04L43/16; H04L43/0817; H04L43/0835; H04L43/0841; H04L43/0847; H04L43/0894
Inventors: 崔勇 (Cui Yong); 黄思江 (Huang Sijiang); 王莫为 (Wang Mowei)
Owner: TSINGHUA UNIV