
An edge collaborative caching method for load balancing of differentiated services in Internet scenarios

A technology of differentiated services and cooperative caching, applied in transmission systems, electrical components, etc. It addresses problems such as the large average access delay of user requests and the resulting poor user experience, and achieves the effects of reducing queuing delay, improving stability, and improving the user experience.

Active Publication Date: 2021-11-26
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

[0005] In the prior-art scenario of Internet-based online request services with multi-node cooperative edge caching, users' differentiated service requirements and edge-server load imbalance cause request queuing delays, which lead to a large average access delay of user requests and a poor user experience. To solve this problem, the present invention provides an edge collaborative caching method for load balancing of differentiated services in Internet scenarios.

Method used



Examples


Embodiment 1

[0080] As shown in Figure 1, the present invention adopts an edge caching system based on edge-cloud collaboration and edge-edge collaboration modes, and studies a collaborative caching strategy for Internet service applications on edge nodes. It is assumed that the cloud data center owns and configures all Internet service applications. Because the storage capacity of an edge node is limited, the node can only download the source file (or application installation package) from the cloud data center and then install and configure the application locally. Usually, due to this limited capacity, the edge node discards the source file (or application installation package) after installing a new service application.
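
For illustration only, the following is a minimal Python sketch of the node behaviour described above, not the patented algorithm itself; the names (EdgeNode, CloudDataCenter, install_service) and the oldest-first eviction rule are assumptions used to show the download-install-discard cycle under limited capacity.

```python
# Minimal sketch (assumed names and eviction rule, not the patented method):
# an edge node downloads a service's installation package from the cloud data
# center, installs the service, discards the package, and evicts the oldest
# installed services when its limited capacity would be exceeded.
from collections import OrderedDict


class CloudDataCenter:
    """Assumed stub: the cloud owns/configures every Internet service application."""

    def download_package(self, service_id: str) -> bytes:
        return b"<installation package for " + service_id.encode() + b">"


class EdgeNode:
    def __init__(self, capacity: int):
        self.capacity = capacity          # limited storage of this edge node
        self.installed = OrderedDict()    # service_id -> size, kept in install order

    def has_service(self, service_id: str) -> bool:
        return service_id in self.installed

    def install_service(self, service_id: str, size: int, cloud: CloudDataCenter) -> None:
        package = cloud.download_package(service_id)   # edge-cloud collaboration
        # Evict the oldest installed services until the new application fits.
        while self.installed and sum(self.installed.values()) + size > self.capacity:
            self.installed.popitem(last=False)
        self.installed[service_id] = size
        del package                                    # discard the source file after installation
```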

[0081] As shown in Figure 2, an edge collaborative caching method for load balancing of differentiated services in Internet scenarios includes the following steps:

[0082] S1: Define the response actions and cache parameters made...
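
Because the step list above is truncated, the concrete response actions and caching parameters of S1 are not reproduced here. The skeleton below is only an illustrative arrangement of the two steps; the ResponseAction values and the strategy hooks are assumptions.

```python
# Illustrative skeleton only: shows how S1 (response actions / parameters) and
# S2 (initialisation plus the collaborative caching loop that invokes the load
# balancing and differentiated service strategies) could fit together.
from enum import Enum, auto


class ResponseAction(Enum):      # S1: possible responses of an edge node to a request
    LOCAL_HIT = auto()           # the requested application is cached on this node
    NEIGHBOR_FETCH = auto()      # served via edge-edge collaboration
    CLOUD_FETCH = auto()         # served via edge-cloud collaboration


def run_collaborative_caching(requests, nodes, load_balancer, diff_service):
    """S2: initialise parameters, then serve requests with both strategies."""
    for node in nodes:                                    # parameter initialisation
        node.queue_length = 0
    for request in requests:
        request = diff_service.prioritise(request)        # differentiated service strategy
        node = load_balancer.select_node(nodes, request)  # load balancing strategy
        node.queue_length += 1                            # request joins the node's queue
        yield node, request
```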


Abstract

The invention discloses an edge collaborative caching method for load balancing of differentiated services in Internet scenarios, comprising the following steps. S1: define the response actions and caching parameters of the edge nodes in the edge collaborative caching system after a user sends an application service request. S2: initialize the parameters, execute the edge collaborative caching process, and invoke the load balancing strategy and the differentiated service strategy. By adopting a differentiated service strategy in the edge collaborative caching system, the present invention satisfies the different service-level requirements of different users in Internet scenarios; by means of the load balancing strategy, it reduces the queuing delay of user requests and improves the stability of the nodes' response delays, thereby improving the user experience.
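
The excerpt does not disclose the exact strategies, so the sketch below only illustrates the two policies named in the abstract under stated assumptions: a shortest-queue rule as the load-balancing strategy and a priority queue as the differentiated-service strategy.

```python
# A minimal sketch, assuming a shortest-queue load-balancing rule and a
# priority-queue differentiated-service rule; not the patent's actual strategies.
import heapq


def select_node(candidate_nodes):
    """Load balancing: dispatch to the candidate edge node with the shortest queue."""
    return min(candidate_nodes, key=lambda node: node.queue_length)


class DifferentiatedServiceQueue:
    """Differentiated services: requests with a lower service-level number are served first."""

    def __init__(self):
        self._heap = []
        self._counter = 0   # keeps FIFO order among requests of the same level

    def push(self, service_level: int, request) -> None:
        heapq.heappush(self._heap, (service_level, self._counter, request))
        self._counter += 1

    def pop(self):
        return heapq.heappop(self._heap)[2]
```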

Description

Technical Field

[0001] The present invention relates to the technical field of edge collaborative caching, and more specifically, to an edge collaborative caching method for load balancing of differentiated services in Internet scenarios.

Background Technique

[0002] In the era of big data, Internet data is growing explosively. IDC predicts that more than 50 billion devices will be connected to the Internet by 2020, that Internet data will reach 44 ZB, and that 70% of it will need to be processed on edge devices. In addition, a large number of Internet users frequently request and acquire content from the cloud, which puts huge load pressure on the servers of the network service provider (SP). During peak periods of network data transmission in the big-data context, it is difficult for traditional cloud computing technology to meet QoS and QoE requirements because of the huge load pressure on cloud servers.

[0003] A large number of studies have shown that edge caching sinks files or servic...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04L29/08
CPC: H04L67/1097; H04L67/1008; H04L67/1001
Inventors: 刘芳, 张振源, 蔡振华, 苏屹宏, 黄志杰
Owner: SUN YAT SEN UNIV