Method and device for allocating and controlling buffer space of multiple queues

A cache-space and multi-queue technology, applied in the field of allocating and controlling the cache space of multiple queues. It addresses the problems of low cache utilization, complex algorithms, and lack of adaptability, with the effects of improving utilization, saving hardware resources, and being well suited to hardware implementation.

Active Publication Date: 2019-02-26
SANECHIPS TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] However, the methods above have unavoidable defects:
[0005] Defect 1: Dividing the cache with method 1 is simple, but because each queue's cache is pre-allocated and fixed once allocated, it cannot be adjusted automatically according to each queue's real-time network traffic. Cache utilization is low, the scheme lacks adaptability, and a truly shared cache is never achieved, since after allocation the most cache a queue can ever use is the portion allocated to it (see the sketch after this list);
[0006] Defect 2: Sharing the cache with method 2 gives low cache utilization when few queues are active, and the algorithm is relatively complicated, which is not conducive to hardware implementation.
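
As an illustration of defect 1, the following minimal C sketch (all names and sizes are illustrative, not taken from the patent) shows that under fixed pre-allocation an active queue can never use more than its own static share, however idle the other queues are.

```c
/* Minimal sketch of the fixed pre-allocation scheme (method 1) criticized
 * above. The totals, queue count, and names are illustrative assumptions. */
#include <stdio.h>

#define TOTAL_CACHE (100u * 1024 * 1024)  /* total cache, e.g. 100M */
#define NUM_QUEUES  8u

int main(void) {
    /* Each queue receives a fixed share up front... */
    unsigned per_queue_limit = TOTAL_CACHE / NUM_QUEUES;

    /* ...so even if only one queue carries traffic, it can never use more
     * than its pre-allocated slice; the rest of the cache sits idle. */
    printf("an active queue is limited to %u of %u bytes\n",
           per_queue_limit, TOTAL_CACHE);
    return 0;
}
```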

Method used



Examples


Embodiment 1

[0056] Based on the first embodiment above, step S13 includes:

[0057] Step S14, reserving a cache space of a preset size within the cache space to be shared, and taking the cache space remaining after the reservation as the cache space to be allocated;

[0058] Step S15, according to the determined number of queues that need to be allocated the cache space to be shared, allocating the cache space to be allocated to each of those queues.

[0059] Specifically, obtain the size of the cache space to be shared, that is, the size of the shared cache space allocated for the multiple queues; it may be, for example, 100M, or any other preset shared cache space. Within it, reserve a cache space of a preset size; the preset value may be 10M, 15M, 30M, or another size set in advance, and the cache space remaining after the reservation is used as the cache space to be allocated. Then, according to the determined number of queues that need to be allocated the cache space to be shared, allocate the cache s...
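
The following is a hedged C sketch of steps S14 and S15 as described in paragraphs [0057] to [0059]. The function name, the even split among queues, and the concrete figures (100M shared, 10M reserved, 6 queues) are illustrative assumptions, not an implementation mandated by the patent.

```c
/* Sketch of S14 (reserve a preset slice of the shared cache) and
 * S15 (divide the remainder among the queues that currently need it).
 * Names and the even-split policy are assumptions for illustration. */
#include <assert.h>
#include <stdio.h>

/* Returns the per-queue share in bytes; 0 if no queue needs shared cache. */
static unsigned allocate_shared_cache(unsigned shared_total,
                                      unsigned reserved,
                                      unsigned active_queues) {
    assert(reserved <= shared_total);
    unsigned to_allocate = shared_total - reserved;  /* S14: reserve first */
    if (active_queues == 0)
        return 0;
    return to_allocate / active_queues;              /* S15: split evenly  */
}

int main(void) {
    /* Example figures from paragraph [0059]: 100M shared, 10M reserved. */
    unsigned per_queue = allocate_shared_cache(100u << 20, 10u << 20, 6);
    printf("each of 6 active queues gets %u bytes\n", per_queue);
    return 0;
}
```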


PUM

No PUM

Abstract

The invention discloses a method and a device for allocating and controlling the cache space of multiple queues. By determining the number of queues that need to be allocated cache space to be shared, the invention allocates a share of the cache space to each such queue, and re-allocates the shared cache space as the number of queues needing it changes. This realizes automatic adjustment and allocation of cache space according to network traffic, saves hardware resources, is conducive to hardware implementation, and at the same time improves the utilization of the cache space.
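
To illustrate the adaptive behaviour summarized in the abstract, the sketch below recomputes each queue's share whenever the count of queues needing shared cache changes. The even per-queue split and the fixed shared total are assumptions for illustration; the abstract does not prescribe them.

```c
/* Illustrative sketch: the per-queue share tracks the number of queues
 * that currently need shared cache, so allocation follows real traffic.
 * All names and figures are assumptions, not taken from the patent text. */
#include <stdio.h>

static unsigned share_per_queue(unsigned shared_total, unsigned queue_count) {
    return queue_count ? shared_total / queue_count : 0;
}

int main(void) {
    unsigned shared_total = 90u << 20;   /* shared cache after any reserve */
    unsigned counts[] = {2, 5, 3};       /* queue count changing over time */

    for (unsigned i = 0; i < 3; ++i)
        printf("%u active queues -> %u bytes each\n",
               counts[i], share_per_queue(shared_total, counts[i]));
    return 0;
}
```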

Description

Technical field
[0001] The present invention relates to the field of caching, and in particular to a method and device for allocating and controlling the cache space of multiple queues.
Background technique
[0002] With the popularity of the Internet, information exchange and information sharing have become an indispensable part of people's daily life. As the interactive information (data packets) in the network keeps growing, network congestion inevitably arises, so how to avoid congestion is particularly important. The congestion-avoidance mechanism widely used on the Internet is the RED (Random Early Discard) mechanism. The key to RED is how to use the limited cache resources effectively and discard packets reasonably, so as to avoid congestion and keep the network running smoothly.
[0003] In the existing multi-queue cache of a device using the RED mechanism, the usual ways of allocating the cache include: method 1, which...
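
Since the background refers to the RED (Random Early Discard) mechanism, the following is a generic, textbook-style sketch of RED's drop decision, given only for context; the thresholds and the linear drop-probability curve are common RED conventions, not something claimed or required by this patent.

```c
/* Generic RED drop decision: never drop below MIN_TH average depth,
 * always drop above MAX_TH, and drop probabilistically in between.
 * Threshold values below are illustrative assumptions. */
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

#define MIN_TH 20.0   /* below this average depth, never drop       */
#define MAX_TH 80.0   /* above this average depth, always drop      */
#define MAX_P   0.1   /* drop probability reached at MAX_TH         */

static bool red_should_drop(double avg_queue_depth) {
    if (avg_queue_depth < MIN_TH) return false;
    if (avg_queue_depth >= MAX_TH) return true;
    double p = MAX_P * (avg_queue_depth - MIN_TH) / (MAX_TH - MIN_TH);
    return ((double)rand() / RAND_MAX) < p;  /* drop early, probabilistically */
}

int main(void) {
    printf("avg depth 50 -> drop? %d\n", red_should_drop(50.0));
    return 0;
}
```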

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): H04L12/801H04L29/06
CPCH04L67/568
Inventor 陈杭洲
Owner SANECHIPS TECH CO LTD