Load balancing edge cooperative caching method for Internet scene differentiated service

A technology for differentiated services and cooperative caching, applied to electrical components, transmission systems, etc., addressing problems such as poor user experience and large average access delay of user requests.

Active Publication Date: 2020-12-04
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

[0005] For the Internet-based online request service and multi-node cooperative edge caching scenarios of the above-mentioned prior art, the present invention addresses the request queuing delay caused by differentiated user service requirements and ...

Method used



Examples


Embodiment 1

[0080] As shown in Figure 1, the present invention adopts an edge cache system based on edge-cloud collaboration and edge-edge collaboration modes. Within this edge cache system, a cooperative caching strategy for Internet service applications on edge nodes is studied. It is assumed that the cloud data center owns and configures all Internet service applications. Because the storage capacity of edge nodes is limited, an edge node can only download the source files (or application installation packages) from the cloud data center and then install and configure them on the node. Usually, owing to this limited capacity, the edge node discards the source file (or application installation package) after installing a new service application.
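As a reading aid, the node behaviour described in this paragraph can be sketched in Python as follows. The names ServiceApp, EdgeNode, install_from_cloud and locate_service are hypothetical, and the lookup order (local node, then neighbouring edge nodes, then cloud) is an assumption drawn from the edge-edge and edge-cloud collaboration modes above, not the patent's actual implementation.

# Minimal sketch (hypothetical names, not the patent's implementation) of a
# capacity-limited edge node that installs services fetched from the cloud and
# keeps only the installed application, discarding the installation package.
from dataclasses import dataclass, field


@dataclass
class ServiceApp:
    name: str
    size: int  # storage the installed application occupies on a node


@dataclass
class EdgeNode:
    node_id: str
    capacity: int                              # limited edge storage
    installed: dict = field(default_factory=dict)

    def free_space(self) -> int:
        return self.capacity - sum(app.size for app in self.installed.values())

    def has_service(self, name: str) -> bool:
        return name in self.installed

    def install_from_cloud(self, app: ServiceApp) -> bool:
        """Download the source file / installation package from the cloud data
        center, install and configure the service, then discard the package."""
        if app.size > self.free_space():
            return False                       # no room: a replacement policy would run here
        self.installed[app.name] = app         # only the installed application is kept
        return True


def locate_service(name: str, local: EdgeNode, neighbors: list) -> str:
    """Edge-edge collaboration first, with the cloud data center as fallback."""
    if local.has_service(name):
        return local.node_id
    for peer in neighbors:                     # ask neighbouring edge nodes
        if peer.has_service(name):
            return peer.node_id
    return "cloud"                             # edge-cloud collaboration

A real node would also run a cache-replacement step when install_from_cloud fails for lack of space; that step is outside this sketch.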

[0081] As shown in Figure 2, an edge cooperative caching method for load balancing of differentiated services in Internet scenarios includes the following steps:

[0082] S1: Define the response actions and caching parameters adopted by the edge nodes in the edge cooperative caching system after a user sends an application service request ...
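The excerpt breaks off here. Drawing only on steps S1 and S2 as restated in the Abstract below, the following skeleton sketches how the two strategies could be combined. The CooperativeCache class, the fewest-pending-requests load-balancing rule and the higher-service-level-first priority rule are illustrative stand-ins rather than the strategies actually claimed; the EdgeNode type comes from the sketch above.

# Hypothetical skeleton of steps S1-S2; reuses EdgeNode from the sketch above.
# The fewest-pending-requests rule (load balancing) and the higher-level-first
# rule (differentiated service) are stand-ins, not the claimed strategies.
import heapq
from itertools import count


class CooperativeCache:
    def __init__(self, nodes):
        # S2: initialize parameters of the edge cooperative caching process
        self.nodes = nodes
        self.pending = {n.node_id: 0 for n in nodes}   # per-node queued load
        self.queue = []                                # entries: (-level, seq, service)
        self._seq = count()

    def submit(self, service: str, level: int) -> None:
        # S1: a user sends an application service request with a service level
        heapq.heappush(self.queue, (-level, next(self._seq), service))

    def dispatch(self):
        # S2: serve queued requests, applying both strategies
        while self.queue:
            _, _, service = heapq.heappop(self.queue)  # differentiated service: higher level first
            holders = [n for n in self.nodes if n.has_service(service)]
            if not holders:
                yield service, "cloud"                 # edge-cloud collaboration fallback
                continue
            node = min(holders, key=lambda n: self.pending[n.node_id])  # load balancing
            self.pending[node.node_id] += 1
            yield service, node.node_id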



Abstract

The invention discloses a load-balancing edge cooperative caching method for differentiated services in Internet scenarios. The method comprises the following steps: S1, defining the response actions and caching parameters adopted by edge nodes in an edge cooperative caching system after a user sends an application service request; and S2, initializing parameters, executing the edge cooperative caching process, and calling a load balancing strategy and a differentiated service strategy. The differentiated service strategy is adopted in the edge cooperative caching system to meet the different service-level requirements of different users in an Internet scenario; the load balancing strategy reduces the queuing delay of user requests, improves the stability of node response delay, and improves user experience.
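As a self-contained back-of-the-envelope illustration of the abstract's load-balancing claim (not taken from the patent), the toy calculation below compares the FIFO queuing delay of eight identical one-unit requests when they all queue at one edge node versus when they are spread evenly over two nodes; both the average and the maximum wait drop, which is the delay reduction and stabilization the abstract refers to.

# Toy illustration (not from the patent): spreading requests over edge nodes
# lowers and stabilizes queuing delay. Each request takes one time unit.
def queuing_delays(assignment_per_node):
    """FIFO queue on each node: the i-th request queued on a node waits i units."""
    delays = []
    for backlog in assignment_per_node:
        delays.extend(range(backlog))           # waits of 0, 1, ..., backlog-1 units
    return delays

requests = 8
single_node = queuing_delays([requests])        # everything sent to one node
balanced = queuing_delays([requests // 2] * 2)  # spread over two edge nodes

for name, delays in [("single node", single_node), ("load balanced", balanced)]:
    print(f"{name:13s} avg wait = {sum(delays) / len(delays):.2f}, max wait = {max(delays)}")
# single node   avg wait = 3.50, max wait = 7
# load balanced avg wait = 1.50, max wait = 3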

Description

Technical field

[0001] The present invention relates to the technical field of edge collaborative caching, and more specifically, to an edge cooperative caching method for load balancing of differentiated services in Internet scenarios.

Background technique

[0002] In the era of big data, Internet data is growing explosively: IDC predicts that more than 50 billion devices will be connected to the Internet by 2020 and that Internet data will reach 44 ZB, 70% of which will need to be processed on edge devices. In addition, a large number of Internet users frequently request or acquire content from the cloud, which puts huge load pressure on the servers of the network service provider (SP). During peak periods of network data transmission, this load pressure makes it difficult for traditional cloud computing technology to meet QoS and QoE requirements.

[0003] A large number of studies have shown that edge caching sinks files or servic...

Claims


Application Information

IPC (8): H04L29/08
CPC: H04L67/1097; H04L67/1008; H04L67/1001
Inventor: 刘芳, 张振源, 蔡振华, 苏屹宏, 黄志杰
Owner: SUN YAT SEN UNIV