
Shared cache distribution method and device

A shared-cache allocation method and device, applied in the field of quality of service, that solve the problem of low cache utilization and achieve the effects of increased cache utilization, improved quality of service, and reduced complexity.

Inactive Publication Date: 2017-01-11
SANECHIPS TECH CO LTD

Problems solved by technology

[0005] In order to solve the existing technical problems, the embodiments of the present invention provide a shared cache allocation method and device, aiming to solve the problem of low cache utilization in the prior art.



Examples


Embodiment 1

[0040] An embodiment of the present invention provides a shared cache allocation method. Figure 1 is a schematic flow chart of the shared cache allocation method in Embodiment 1 of the present invention; as shown in Figure 1, the method includes:

[0041] Step 101: Pre-configure the shared cache space as static cache space and dynamic cache space.

[0042] The shared cache allocation method provided in this embodiment is applied to various network communication devices. In this step, pre-configuring the shared cache space as a static cache space and a dynamic cache space means that the network communication device partitions a shared cache space into a static cache space and a dynamic cache space in advance.

[0043] Specifically, Figure 2 is a schematic diagram of the application of the shared cache space in an embodiment of the present invention; as shown in Figure 2, the network communication device divides the shared cache space into static cache space and dynamic cache s...
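The partitioning step above can be sketched as follows. This is a minimal illustrative model, not the patented implementation; the class name, method names, and the 64/32 capacity split (borrowed from the figures quoted in Embodiment 2) are all assumptions.

```python
# Illustrative sketch of Step 101: pre-configuring a shared cache as a
# static region plus a dynamic region. Capacities are assumed values
# taken from the example in Embodiment 2 (total 64, static 32).

class SharedCache:
    def __init__(self, total, static_size):
        assert 0 <= static_size <= total
        self.total = total
        self.static_size = static_size            # reserved static region
        self.dynamic_size = total - static_size   # region competed for on demand
        self.static_used = 0
        self.dynamic_used = 0

    def static_free(self):
        return self.static_size - self.static_used

    def dynamic_free(self):
        return self.dynamic_size - self.dynamic_used

cache = SharedCache(total=64, static_size=32)
print(cache.static_free(), cache.dynamic_free())  # 32 32
```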

Embodiment 2

[0070] An embodiment of the present invention also provides a shared buffer allocation method. Figure 4 is a schematic flow chart of the shared cache allocation method in Embodiment 2 of the present invention; as shown in Figure 4, the method includes:

[0071] Step 201: Configure static cache space and dynamic cache space.

[0072] In this embodiment, it is assumed that a group of two queues (queue 0 and queue 1, where the priority of queue 0 is higher than that of queue 1) is used to allocate the shared buffer space.

[0073] Here, it is assumed that the total capacity of the shared cache space is 64, the static cache space is configured with a capacity of 32, and the dynamic cache space with a capacity of 32. The priority interval of the dynamic cache space is configured as 16; that is, a high-priority queue can occupy up to 32 cache units in the dynamic cache space, while a low-priority queue can occupy up to 16 cache units in the dynamic cache space.
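The per-priority caps in the numeric example above can be expressed as a small formula. How the "priority interval" is applied is an assumption here (each step down in priority lowering the ceiling by one interval), chosen so that it reproduces the stated 32/16 limits:

```python
# Sketch of the per-priority dynamic-space caps from the example:
# dynamic capacity 32, priority interval 16. The linear-step rule is
# an assumption that matches the two stated limits (32 and 16).

DYNAMIC_CAPACITY = 32   # total dynamic cache space
PRIORITY_INTERVAL = 16  # cap reduction per priority level

def dynamic_limit(priority_level):
    """priority_level 0 = highest. Returns the max dynamic cache units
    a queue of this priority may occupy (never below zero)."""
    return max(DYNAMIC_CAPACITY - priority_level * PRIORITY_INTERVAL, 0)

print(dynamic_limit(0))  # queue 0 (high priority) -> 32
print(dynamic_limit(1))  # queue 1 (low priority)  -> 16
```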

[0074...

Embodiment 3

[0092] An embodiment of the present invention also provides a shared cache allocation device, which can be applied to various network communication devices. Figure 5 is a schematic diagram of the composition and structure of the shared cache allocation device in Embodiment 3 of the present invention; as shown in Figure 5, the device includes a configuration unit 31, a first processing unit 32, and a second processing unit 33; wherein,

[0093] The configuration unit 31 is configured to pre-configure the shared cache space as a static cache space and a dynamic cache space;

[0094] The first processing unit 32 is configured to control the queue to initiate a dynamic cache space application when a queue joins and the storage space of the static cache space satisfies a first preset condition;

[0095] The second processing unit 33 is configured to determine that when the dynamic cache space application for the queue initiated by the first processing unit 32 satisf...
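The three units above can be sketched as cooperating components. This is a hypothetical illustration only: the concrete "first preset condition" (static region exhausted) and "second preset condition" (dynamic space still free), and the way the adjustment parameter scales a grant, are assumptions, since the source text truncates before defining them.

```python
# Hypothetical sketch of the Embodiment 3 device: a configuration unit,
# a first processing unit (triggers a dynamic-space application), and a
# second processing unit (grants space scaled by a per-queue adjustment
# parameter). All conditions and names here are assumptions.

class ConfigurationUnit:
    def __init__(self, static_size, dynamic_size):
        self.static_size = static_size
        self.dynamic_size = dynamic_size

class FirstProcessingUnit:
    def needs_dynamic_space(self, static_used, static_size):
        # Assumed "first preset condition": static region is exhausted.
        return static_used >= static_size

class SecondProcessingUnit:
    def grant(self, request, dynamic_free, adjustment):
        # Assumed "second preset condition": dynamic space is available;
        # the grant is scaled by the queue's adjustment parameter.
        if dynamic_free <= 0:
            return 0
        return min(int(request * adjustment), dynamic_free)

cfg = ConfigurationUnit(static_size=32, dynamic_size=32)
fpu = FirstProcessingUnit()
spu = SecondProcessingUnit()
if fpu.needs_dynamic_space(static_used=32, static_size=cfg.static_size):
    granted = spu.grant(request=8, dynamic_free=32, adjustment=1.0)
    print(granted)  # 8
```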



Abstract

The embodiments of the present invention disclose a shared cache distribution method and device. The method comprises the steps of: pre-configuring the shared cache space as a static cache space and a dynamic cache space; when a queue joins and the storage space of the static cache space satisfies a first preset condition, controlling the queue to initiate a dynamic cache space application; and, when the dynamic cache space application of the queue is determined to satisfy a second preset condition, allocating cache space in the dynamic cache space to the queue according to an adjustment parameter pre-configured for the queue.

Description

Technical field

[0001] The present invention relates to the field of quality of service (QoS), and in particular to a shared buffer allocation method and device.

Background technique

[0002] In existing high-traffic, multi-user data networks, network congestion control techniques must be used. Random early discard is one such congestion control method; its purpose is to discard packets early, before data overflows the buffer space, so as to avoid the long runs of continuous packet loss caused by buffer overflow.

[0003] The principle of random early discard is to predict congestion of the cache space in advance by calculating the cache occupancy of each queue. At present, shared cache management uses a multiplication algorithm to dynamically estimate the shared space (the number of active queues multiplied by the current queue's cache occupancy) to obtain an estimated value, and then compares and judges the est...
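The prior-art multiplication algorithm described in [0003] can be sketched directly from that sentence. Since the text truncates before giving the comparison target, the drop rule below (estimate exceeding the shared capacity) is an assumption:

```python
# Sketch of the prior-art "multiplication algorithm": the shared-space
# estimate is (number of active queues) x (current queue's occupancy),
# and a packet is discarded early when the estimate exceeds the shared
# capacity. The comparison threshold is an assumption.

def shared_space_estimate(active_queue_count, queue_occupancy):
    return active_queue_count * queue_occupancy

def early_discard(active_queue_count, queue_occupancy, shared_capacity):
    return shared_space_estimate(active_queue_count, queue_occupancy) > shared_capacity

print(early_discard(4, 10, 64))  # 4 * 10 = 40 <= 64 -> False
print(early_discard(8, 10, 64))  # 8 * 10 = 80 >  64 -> True
```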


Application Information

IPC(8): H04L12/873; H04L12/851; H04L12/823; H04L47/52; H04L47/32
CPC: H04L47/23; H04L47/2433; H04L47/522
Inventor: 王莉 (Wang Li)
Owner: SANECHIPS TECH CO LTD