
An optimized cache system based on edge computing framework and its application

An edge computing and caching system technology, applied in transmission systems, wireless communications, electrical components, etc. It addresses problems such as the lack of cache life-cycle consideration and the waste of storage resources, with the effect of shortening network response time and reducing latency.

Active Publication Date: 2021-06-01
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

However, that patent has the following defects: it only considers the placement of cached content, not the entire life cycle of the cache.
Moreover, for cache placement, a single plan based on individual user paths tends to overload the servers on main traffic routes or in areas of concentrated activity, forcing repeated expansion and upgrades, while large amounts of storage capacity elsewhere sit idle, wasting resources.

Method used



Examples


Embodiment 1

[0065] An optimized caching system based on an edge computing framework, including multiple local area networks, all of which are connected to the cloud center through the Internet;

[0066] Each local area network is an independent edge node cluster. The edge node cluster includes routers, first-level switches, second-level switches, and regional edge servers connected in sequence from top to bottom; the second-level switches connect downward to the local edge nodes and user terminal equipment;

[0067] The router connects upward to the Internet, and the first-level switch connects downward to the second-level switch and the regional edge server;

[0068] The router connects the local area network to the Internet; the first-level and second-level switches handle connection and networking within the local area network; the regional edge server stores the hot content of the entire local area network and is responsible for the data label ...
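The hierarchy described above (router, then first-level switch, then second-level switches, with a regional edge server beside the first-level switch and local edge nodes below) can be sketched as a small data model. This is an illustrative sketch only; all class and field names are assumptions, not from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class LocalEdgeNode:
    """A local edge node hanging off a second-level switch."""
    node_id: int
    user_count: int                                   # users in this node's sub-network
    cache: Dict[str, bytes] = field(default_factory=dict)  # local hot content


@dataclass
class EdgeNodeCluster:
    """One LAN: router -> level-1 switch -> level-2 switches -> edge nodes."""
    name: str
    regional_cache: Dict[str, bytes] = field(default_factory=dict)  # LAN-wide hot content
    local_nodes: List[LocalEdgeNode] = field(default_factory=list)

    @property
    def total_users(self) -> int:
        # Total user count across all local edge nodes in this LAN.
        return sum(n.user_count for n in self.local_nodes)


cluster = EdgeNodeCluster("campus-lan", local_nodes=[
    LocalEdgeNode(0, user_count=40),
    LocalEdgeNode(1, user_count=60),
])
print(cluster.total_users)  # 100
```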

Embodiment 2

[0072] According to an optimized caching system based on an edge computing framework described in Embodiment 1, the difference is that:

[0073] The regional edge server runs Redis as its database, storing the data tags of cached resources and a reachability table of the other nearby edge node clusters;

[0074] The data label records the relevant information of a cached resource, including its URI, file size, number of visits, storage path, whether it is a temporary cache, and the generation time of the label. The visit counts comprise the total number of visits across the entire LAN and the visits from each local edge node. The total-visit threshold p0 for the whole LAN is 125% of the number of users in the LAN; the visit-count threshold pi for each local edge node is the number of users in the network where the i-th local edge node is located; through the whole local area network the ...
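The data label and the two thresholds defined above (p0 = 125% of the LAN's user count; pi = the user count of node i's sub-network) can be sketched as follows. The `DataTag` fields mirror the list in the text, while the function and variable names are illustrative assumptions.

```python
import time
from dataclasses import dataclass
from typing import Dict


@dataclass
class DataTag:
    """Record describing one cached resource, as listed in paragraph [0074]."""
    uri: str
    file_size: int            # bytes
    total_visits: int         # visits across the whole LAN
    node_visits: Dict[int, int]  # node_id -> visits from that local edge node
    storage_path: str
    is_temporary: bool
    created_at: float         # generation time of the label


def lan_threshold(lan_user_count: int) -> float:
    """p0: total-visit threshold, 125% of the LAN's user count."""
    return 1.25 * lan_user_count


def node_threshold(node_user_count: int) -> int:
    """pi: per-node threshold, the user count of node i's sub-network."""
    return node_user_count


tag = DataTag(
    uri="/video/lecture-01.mp4", file_size=700_000_000,
    total_visits=140, node_visits={0: 90, 1: 50},
    storage_path="/cache/lecture-01.mp4", is_temporary=False,
    created_at=time.time(),
)
# With 100 LAN users, p0 = 125.0, so this tag qualifies for LAN-wide caching.
print(tag.total_visits >= lan_threshold(100))  # True
```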

Embodiment 3

[0077] A cache placement and usage method for the optimized cache system based on an edge computing framework described in Embodiment 2. As shown in Figure 2, operation is divided into two periods, a peak working period and a non-peak working period, and includes the following steps:

[0078] (1) Determine whether the current edge node cluster is in its peak working period. If it is, the system uses the cache to provide services and proceeds to step (4); otherwise, during the non-peak period, the system handles cache placement and proceeds to step (2);

[0079] (2) The optimized caching system automatically polls the database and selects the data tags whose access counts reach the thresholds, including data tags whose total access count across the entire local area network reaches the threshold p0 and ...
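Steps (1)-(2) above, branching on peak versus non-peak hours and, during non-peak hours, polling for tags whose counts reach p0 or some pi, can be sketched as a selection routine. The peak-hour window and all helper names here are assumptions for illustration; the patent does not fix concrete hours.

```python
from typing import Dict, List, Tuple


def is_peak(hour: int) -> bool:
    # Assumed peak window 8:00-22:00 purely for illustration.
    return 8 <= hour < 22


def select_tags_for_placement(
    tags: List[dict],
    lan_user_count: int,
    node_user_counts: Dict[int, int],
) -> List[Tuple[str, str]]:
    """Non-peak step (2): pick tags whose visit counts reach p0 or some pi."""
    p0 = 1.25 * lan_user_count  # LAN-wide threshold from paragraph [0074]
    selected = []
    for tag in tags:
        if tag["total_visits"] >= p0:
            # Hot across the whole LAN: place on the regional edge server.
            selected.append((tag["uri"], "regional"))
        else:
            # Otherwise check each node's own threshold pi.
            for node_id, visits in tag["node_visits"].items():
                if visits >= node_user_counts[node_id]:
                    selected.append((tag["uri"], f"node-{node_id}"))
                    break
    return selected


tags = [
    {"uri": "/a", "total_visits": 130, "node_visits": {0: 80, 1: 50}},
    {"uri": "/b", "total_visits": 70,  "node_visits": {0: 65, 1: 5}},
    {"uri": "/c", "total_visits": 20,  "node_visits": {0: 10, 1: 10}},
]
print(select_tags_for_placement(tags, 100, {0: 60, 1: 60}))
# [('/a', 'regional'), ('/b', 'node-0')]
```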



Abstract

The present invention relates to an optimized caching system based on an edge computing framework and its application. The system includes multiple local area networks, all connected to the cloud center through the Internet. Each local area network is an independent edge node cluster comprising, from top to bottom, a router, multiple switches, multiple regional edge servers, multiple local edge nodes, and hosts or other storage devices connected in turn; the edge node cluster connects to the Internet through the router. The regional edge server controls multiple local edge nodes and stores the hot content of the entire local area network, while each local edge node stores local hot content. The optimized cache system builds on the traditional cloud / edge-node / user-terminal three-tier architecture and optimizes the distribution structure at the edge-node layer.

Description

Technical Field

[0001] The invention relates to an optimized cache system based on an edge computing framework and an application thereof, and belongs to the technical field of mobile communication.

Background

[0002] By 2021, global mobile data traffic will reach 587 EB, about 122 times the 2011 level. This surge in mobile network traffic puts huge pressure on mobile backhaul links and already tight bandwidth resources.

[0003] To cope with the explosive growth of mobile network traffic, academia and industry have made many efforts, among which mobile edge computing and mobile edge caching are two extremely important directions. The many terminal devices at the edge of the mobile network have a certain amount of storage and computing capability. Once content is cached at the edge of the mobile network, users can obtain it nearby, avoiding repeated transmission of the same content and relieving pressure on the backhaul net...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04W24/02; H04L29/08
CPC: H04W24/02; H04L67/568; H04L67/63
Inventors: 张海霞, 顿凯, 袁东风
Owner SHANDONG UNIV