
Multi-edge base station joint cache replacement method based on agent deep reinforcement learning

A technology combining cache replacement and reinforcement learning, applied in the field of communications

Active Publication Date: 2021-09-14
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

During this process, every content transmission generates a certain amount of system overhead, and if the edge base station wants to cache the content downloaded from the remote content server, additional caching overhead is incurred.
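As a rough illustration (not taken from the patent), this trade-off can be sketched as a simple cost model; the cost constants and the function name request_cost below are hypothetical placeholders, assuming only that a remote fetch is the most expensive case and that caching a downloaded item adds a fixed write cost.

```python
# Illustrative cost model (not from the patent): each content delivery incurs a
# transmission overhead that grows with the distance of the fetch, and writing
# newly downloaded content into an edge cache adds an extra caching overhead.
# All constants are hypothetical placeholders.

LOCAL_HIT_COST = 1.0       # served directly by the receiving base station
NEIGHBOR_FETCH_COST = 3.0  # fetched from a neighboring base station
REMOTE_FETCH_COST = 10.0   # downloaded from the remote content server
CACHE_WRITE_COST = 0.5     # extra overhead if the downloaded content is cached


def request_cost(served_from: str, cache_after_download: bool = False) -> float:
    """Return the total system overhead for serving a single user request."""
    transmission = {
        "local": LOCAL_HIT_COST,
        "neighbor": NEIGHBOR_FETCH_COST,
        "remote": REMOTE_FETCH_COST,
    }[served_from]
    caching = CACHE_WRITE_COST if cache_after_download else 0.0
    return transmission + caching
```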

Method used



Examples


Embodiment

[0047] In this example, as shown in Figure 1, when the user equipment sends a request to an edge base station in the edge caching system, three cases arise (see the sketch below). If the content requested by the user is cached at the base station that receives the request, that base station delivers the content directly to the user equipment. If the receiving base station does not cache the requested content, it forwards the request to the other base stations in its neighborhood; if any neighboring base station caches the requested content, it transmits the content to the receiving base station, which then delivers it to the user equipment. If neither the receiving base station nor any base station in its neighborhood caches the requested content, the receiving base station sends a request to the remote content server (which contains all content) and downloads the requested content from it...
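The request flow of paragraph [0047] can be summarized in the following minimal sketch; the interface (a BaseStation with a cache set, a neighbors list, and deliver/transfer/maybe_cache methods, plus a remote_server object) is assumed for illustration and is not specified by the patent.

```python
# Sketch of the edge-caching request flow described in paragraph [0047].
# Class and method names are hypothetical.

def handle_request(station, content_id, remote_server):
    """Serve a user request at the edge base station that received it."""
    # 1. Local hit: the receiving base station already caches the content.
    if content_id in station.cache:
        return station.deliver(content_id)

    # 2. Neighborhood hit: a neighboring base station caches the content and
    #    forwards it to the receiving station, which delivers it to the user.
    for neighbor in station.neighbors:
        if content_id in neighbor.cache:
            content = neighbor.transfer(content_id, to=station)
            return station.deliver(content)

    # 3. Miss everywhere: download from the remote content server (which holds
    #    all content); the receiving station may then cache it, paying the
    #    additional caching overhead and possibly triggering a replacement.
    content = remote_server.download(content_id)
    station.maybe_cache(content)  # cache-replacement policy decides
    return station.deliver(content)
```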



Abstract

The invention discloses a multi-edge base station joint cache replacement method based on multi-agent deep reinforcement learning, which comprises the following steps: while the edge caching system runs, a network controller counts the user content requests received by all edge base stations; when the user content requests reach a certain number, the network controller extracts the popularity characteristics of the user content requests and detects in real time whether the popularity characteristics change; when the popularity characteristics do not change, each edge base station uses its current agent to perform cache replacement; when the popularity characteristics change, the historically arrived user content requests are cleared, the current agent is temporarily replaced with LFU, the agent decision network is retrained, and cache replacement decisions are made by the new agent.
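A rough sketch of this control loop is given below; the helper names (collect_requests, extract_popularity, popularity_changed, retrain_agent, LFUAgent) are assumptions for illustration, since the abstract does not define these interfaces.

```python
# Hypothetical control loop for popularity-change-aware cache replacement,
# following the steps listed in the abstract.

def controller_loop(network_controller, base_stations, batch_size):
    history = []
    reference_popularity = None
    while True:
        # The network controller counts user content requests from all edge base stations.
        history.extend(network_controller.collect_requests(base_stations, batch_size))

        # Once enough requests have arrived, extract popularity characteristics.
        popularity = extract_popularity(history)

        if reference_popularity is not None and popularity_changed(reference_popularity, popularity):
            # Popularity changed: clear the historical requests, fall back to LFU
            # while the agents' decision networks are retrained, then let the
            # newly trained agents make cache replacement decisions.
            history.clear()
            for bs in base_stations:
                bs.agent = LFUAgent()
            for bs in base_stations:
                bs.agent = retrain_agent(bs)
            reference_popularity = None
        else:
            # Popularity unchanged: keep using the current agents for replacement.
            reference_popularity = popularity
            for bs in base_stations:
                bs.replace_cache_with(bs.agent)
```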

Description

Technical Field

[0001] The invention belongs to the field of communication technology and, more specifically, relates to a multi-edge base station joint cache replacement method based on agent deep reinforcement learning.

Background Technology

[0002] Due to the proliferation of mobile devices and data-intensive applications, 5G and beyond-5G mobile communication networks need to deliver content at ultra-high speed and low latency. Driven by this goal, edge caching, as a technology that can effectively reduce content transmission delay and network congestion, has attracted extensive attention from academia and industry.

[0003] In a traditional Content Delivery Network (CDN) cache system, common cache replacement algorithms include the Least Frequently Used (LFU) replacement algorithm, the Least Recently Used (LRU) replacement algorithm and the First In First Out (FIFO) replacement algorithm...
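For context, a minimal LFU cache of the kind mentioned in paragraph [0003] could look like the sketch below; this is a generic textbook-style policy, not the patent's method, and LRU or FIFO would differ only in the eviction criterion.

```python
# Minimal LFU (Least Frequently Used) replacement policy for illustration.

from collections import Counter


class LFUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = {}        # content_id -> content
        self.freq = Counter()  # content_id -> request count

    def get(self, content_id):
        """Return cached content and update its request frequency, or None on a miss."""
        if content_id in self.store:
            self.freq[content_id] += 1
            return self.store[content_id]
        return None

    def put(self, content_id, content):
        """Insert content, evicting the least frequently used item if the cache is full."""
        if content_id not in self.store and len(self.store) >= self.capacity:
            victim = min(self.store, key=lambda c: self.freq[c])
            del self.store[victim]
            del self.freq[victim]
        self.store[content_id] = content
        self.freq[content_id] += 1
```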

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC (IPC8): H04L29/08, H04W16/18, H04W24/02, G06N20/00
CPC: H04W16/18, H04W24/02, G06N20/00, H04L67/5682
Inventors: 宋彤雨谈雪彬胡文昱董刘杨任婧王雄徐世中王晟
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA