
Attention mechanism reinforcement learning-based edge network device caching method

An edge network and reinforcement learning technology, applied to network traffic/resource management, advanced technology, and climate sustainability. It addresses problems such as cache replacement strategies that do not consider dynamic interaction, with the effects of improving network service quality and user experience quality, reducing the number of repeated file downloads, and optimizing transmission.

Active Publication Date: 2021-09-17
TIANJIN UNIV


Problems solved by technology

[0004] To address the technical problem that traditional cache replacement strategies do not consider dynamic interaction, this invention proposes an edge network device caching method based on attention mechanism reinforcement learning, which solves the multi-agent edge caching problem by introducing the attention mechanism into the actor-critic algorithm.
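The core idea above can be sketched in a few lines: each edge device's critic attends over the encoded observations and actions of the other devices, weighting neighbours by relevance rather than averaging them uniformly. This is a minimal NumPy sketch under stated assumptions, not the patented implementation; the vector dimensions, the `attention_pool` helper, and the random inputs are all illustrative.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(query, keys, values):
    """Scaled dot-product attention: score each neighbour's key
    against this agent's query, normalize with softmax, and return
    the relevance-weighted sum of the neighbours' value vectors."""
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)   # (n_neighbours,)
    weights = softmax(scores)            # attention weights sum to 1
    return weights @ values, weights     # pooled context, weights

# Hypothetical setup: one edge device attends over 3 neighbours,
# each represented by an 8-dimensional encoded (obs, action) vector.
rng = np.random.default_rng(0)
query = rng.standard_normal(8)           # this agent's encoded state
keys = rng.standard_normal((3, 8))       # neighbours' key encodings
values = rng.standard_normal((3, 8))     # neighbours' value encodings
context, w = attention_pool(query, keys, values)
```

In a full actor-critic setup, `context` would be concatenated with the agent's own encoding and fed into its evaluation (critic) network, so that each device's value estimate reflects the neighbours most relevant to its caching decision.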



Embodiment Construction

[0074] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0075] The attention mechanism is a data processing method in machine learning that is widely used in tasks such as natural language processing, image recognition, and speech recognition. In essence, the mechanism applies human perception and attention behavior to machines, so that a machine learns to distinguish the important parts of its input data from the unimportant ones. In application, the attention direction and weighting model are adjusted according to ...
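The "important vs. unimportant" weighting described above can be illustrated with a toy example (not taken from the patent; the two-dimensional vectors and the `attention_weights` helper are illustrative assumptions): a key that aligns with the query receives a larger softmax weight than one that does not.

```python
import numpy as np

def attention_weights(query, keys):
    """Softmax over query-key similarity: a larger weight means the
    model 'attends' more to that part of the input."""
    scores = keys @ query / np.sqrt(len(query))
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Toy input: key 0 points the same way as the query, key 1 is
# orthogonal to it, so key 0 should receive the larger weight.
query = np.array([1.0, 0.0])
keys = np.array([[1.0, 0.0],
                 [0.0, 1.0]])
w = attention_weights(query, keys)
```

Because the weights are produced by a softmax, they always sum to 1, so the mechanism acts as a learned, input-dependent averaging scheme.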



Abstract

The invention discloses an attention mechanism reinforcement learning-based edge network device caching method comprising the following steps: a cellular network model is established, comprising user equipment, edge network devices, and a core network, with each edge network device containing an action network module and an evaluation network module; each edge network device receives requests sent by the user equipment in its area; each edge network device obtains the observation values of the other edge network devices; each edge network device selects an action according to its cache replacement strategy and observation value; each edge network device sends its action and updated state to the adjacent edge network devices; the parameters of the action network module and the evaluation network module are updated according to the action, the updated observation values, and the action-value function; and the cache replacement strategy is optimized according to the objective function. The invention reduces the number of repeated file downloads from the cloud data center, reduces delay, and improves network service quality and user experience quality.
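The claimed effect, fewer repeated downloads from the cloud data center, can be made concrete with a minimal sketch: a toy edge device with a fixed-capacity cache serving a skewed request stream. The `EdgeDevice` class and its least-recently-used eviction rule are illustrative stand-ins for the learned cache replacement strategy, not the patented method.

```python
class EdgeDevice:
    """Toy edge device with a fixed-capacity cache. The cache
    replacement step here uses a simple least-recently-used rule
    in place of the learned policy."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = []          # most recently used at the end
        self.core_fetches = 0    # downloads from the cloud data center

    def request(self, item):
        if item in self.cache:                 # cache hit: low delay
            self.cache.remove(item)
            self.cache.append(item)
            return "hit"
        self.core_fetches += 1                 # miss: fetch from core
        if len(self.cache) >= self.capacity:   # cache replacement step
            self.cache.pop(0)                  # evict the LRU item
        self.cache.append(item)
        return "miss"

dev = EdgeDevice(capacity=3)
# Skewed, repeating request stream over three popular files.
stream = list("AABBC" * 40)
results = [dev.request(f) for f in stream]
hit_rate = results.count("hit") / len(results)
```

With three popular files and capacity for three items, each file is fetched from the core network only once and every later request is served locally, which is the delay-reduction effect the method aims to achieve with a learned, cooperative policy under more realistic workloads.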

Description

Technical Field

[0001] The invention relates to the technical field of edge caching and deep reinforcement learning, and in particular to an edge network device caching method based on attention mechanism reinforcement learning.

Background

[0002] With the development of network technology and surging demand, the growing speed and throughput requirements of data and applications are driving rapid traffic growth. This challenge has in turn spurred an urgent evolution of network architectures and advanced communication technologies. Mobile Edge Computing (MEC) can effectively relieve the traffic pressure on mobile network operators: by storing content on base stations or local devices close to users, it reduces redundant data transmission delays in application services and improves service quality.

[0003] In real life, users move through a variety of scenarios. Because different scenarios provide different services, the content cached by each base station...


Application Information

IPC(8): H04W24/06; H04W28/14
CPC: H04W28/14; H04W24/06; Y02D30/70
Inventors: 王晓飞, 贾博森, 赵益尉, 李瑞斌, 王晨阳
Owner: TIANJIN UNIV