
Reinforcement Learning-Based Collaborative Caching Method for Ultra-dense Network Small Station Coding

An ultra-dense network and reinforcement learning technology, applied in the field of coded cooperative caching at ultra-dense network small stations, which addresses problems such as cache decisions that cannot be applied well to real networks, file-request change patterns that cannot be mined, and time-varying file popularity that cannot be tracked.

Active Publication Date: 2021-09-10
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

[0004] From the perspective of how the cache decision is obtained, traditional caching techniques typically model caching as an optimization problem and solve it for a cache decision. First, the modeling usually assumes that file popularity follows a specific distribution, whereas file popularity in a real network changes constantly; solving an optimization problem built on a specific distribution cannot track these changes, so the resulting cache decision cannot be applied well to the real network. Second, even if the assumed popularity distribution is replaced by instantaneous file popularity, the optimization problem must be solved again whenever popularity changes, which imposes a huge network overhead; moreover, the modeled optimization problem is often NP-hard (non-deterministic polynomial-time hard) and very difficult to solve. Finally, because caching is inherently based on file requests that have already occurred in the network while the cache decision prepares for requests that have yet to occur, the traditional optimization-based approach cannot mine the change patterns of file requests in the network, so the cache decisions it produces are not optimal for incoming file requests.

Method used




Embodiment Construction

[0065] In the present invention, a reinforcement-learning-based coded cooperative caching method for ultra-dense network small stations is described, taking the LTE-A system as an example:

[0066] As shown in Figure 1, the method includes the following steps:

[0067] Step 1: Collect network information and set parameters. Collect the macro station set M = {1, 2, …, M}, the small station set P = {1, 2, …, P}, the file request set F = {1, 2, …, F}, and the number of small stations p_m, m ∈ M, within the coverage of the m-th macro station; obtain the small station cache space M, where M is determined by the operator according to network operation and hardware cost. According to the file requests in the ultra-dense network, the operator divides a day into T time slots and sets the starting time of each slot; each time slot is divided into three phases: the file transfer phase, the information exchange phase, and the cache decision phase.
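As a reading aid, the bookkeeping described in Step 1 can be summarized in a small configuration object. The sketch below is a minimal illustration in Python; the names (Phase, NetworkConfig) and the example values are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Dict, List


class Phase(Enum):
    """The three phases that make up each time slot (Step 1)."""
    FILE_TRANSFER = 1
    INFORMATION_EXCHANGE = 2
    CACHE_DECISION = 3


@dataclass
class NetworkConfig:
    """Network information collected in Step 1 (hypothetical container)."""
    num_macro_stations: int          # |M|
    num_small_stations: int          # |P|
    num_files: int                   # |F|
    small_per_macro: Dict[int, int]  # p_m for each macro station m
    cache_space: int                 # per-small-station cache space, set by the operator
    num_slots_per_day: int           # T time slots per day

    def slot_phases(self) -> List[Phase]:
        """Each slot runs file transfer, information exchange, then cache decision."""
        return [Phase.FILE_TRANSFER, Phase.INFORMATION_EXCHANGE, Phase.CACHE_DECISION]


# Example with placeholder values (not from the patent):
cfg = NetworkConfig(
    num_macro_stations=2,
    num_small_stations=10,
    num_files=100,
    small_per_macro={1: 6, 2: 4},
    cache_space=20,
    num_slots_per_day=24,
)
```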

[0068] Step 2: Formulate a base station cooperative caching scheme based on MDS coding ...
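The step is truncated in the available text, but the abstract states that the cooperative caching scheme is based on MDS coding, whose defining property is that any k distinct coded fragments of an (n, k) MDS code suffice to reconstruct the file. The Python sketch below only illustrates that property with hypothetical helper names (can_decode, fragments_from_stations) and toy numbers; it is not the patent's actual caching scheme.

```python
from typing import Dict, Set


def can_decode(file_k: int, collected_fragments: Set[int]) -> bool:
    """MDS property: any k distinct coded fragments of an (n, k) MDS code
    suffice to reconstruct the original file."""
    return len(collected_fragments) >= file_k


def fragments_from_stations(cache: Dict[int, Set[int]], reachable: Set[int]) -> Set[int]:
    """Union of the coded fragments cached at the small stations a user can reach."""
    frags: Set[int] = set()
    for station in reachable:
        frags |= cache.get(station, set())
    return frags


# Toy example (placeholder numbers, not from the patent):
# a file is MDS-encoded with k = 3; stations 1-3 each cache one distinct fragment.
cache = {1: {0}, 2: {1}, 3: {2}}
collected = fragments_from_stations(cache, reachable={1, 2, 3})
assert can_decode(file_k=3, collected_fragments=collected)   # user decodes locally
missing = 3 - len(fragments_from_stations(cache, reachable={1, 2}))
print(f"fragments still needed from the macro station: {missing}")  # -> 1
```

The general idea of such coded cooperative caching is that different small stations store distinct coded fragments, so a user covered by several small stations can decode most of a file locally and fetch only the shortfall over the backhaul.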



Abstract

The present invention provides a reinforcement-learning-based coded cooperative caching method for small stations in an ultra-dense network. The method includes the following steps: Step 1: collect network information and set parameters; Step 2: formulate a base station cooperative caching scheme based on MDS coding; Step 3: formulate the base station cooperative transmission scheme; Step 4: describe the reinforcement learning task as an MDP; Step 5: define the goal of reinforcement learning; Step 6: update the Q table for decision-making; Step 7: randomly set the initial state; and so on. The method uses coded cooperative caching and cooperative transmission among small stations to serve users, and applies reinforcement learning to mine the transition pattern of the file requests collected in the real network and to formulate the optimal caching strategy. As a data-driven machine learning method, it requires no assumptions about the prior distribution of the data and is therefore better suited to practical systems. Moreover, through real-time interaction with the environment, it can track time-varying file popularity and implement the optimal caching strategy without solving an NP-hard problem.
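Steps 4 through 7 describe a standard tabular reinforcement-learning loop: the caching task is cast as an MDP, a Q table is updated for decision-making, and learning starts from a random initial state. The sketch below shows a generic epsilon-greedy tabular Q-learning update of that kind; the state and action encodings, the reward, and the hyperparameters are placeholders, not the patent's specific definitions.

```python
import random
from collections import defaultdict

# Placeholder hyperparameters (not taken from the patent).
ALPHA = 0.1      # learning rate
GAMMA = 0.9      # discount factor
EPSILON = 0.1    # exploration probability

# Q table: Q[state][action], defaulting to 0.0 for unseen state-action pairs.
Q = defaultdict(lambda: defaultdict(float))


def choose_action(state, actions):
    """Epsilon-greedy choice over the candidate caching actions."""
    if random.random() < EPSILON:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[state][a])


def q_update(state, action, reward, next_state, next_actions):
    """One-step Q-learning update:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max((Q[next_state][a] for a in next_actions), default=0.0)
    Q[state][action] += ALPHA * (reward + GAMMA * best_next - Q[state][action])
```

In the patent's setting, the state and reward would be derived from the file requests gathered in each slot's information exchange phase and from the caching objective defined in Step 5; those specifics are not reproduced here.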

Description

Technical Field

[0001] The invention belongs to the technical field of wireless network deployment in mobile communication, and in particular relates to a reinforcement-learning-based coded cooperative caching method for ultra-dense network small stations in a wireless communication system.

Background Technology

[0002] In an ultra-dense network, small stations can improve the communication quality of users at the network edge, effectively increasing spectrum efficiency and system throughput. With the rapid growth of network terminal data volume and increasingly stringent user quality-of-service requirements, pushing content to the edge of the mobile network has become a natural trend. An effective approach is edge storage, i.e., caching files at small stations to reduce massive data transmission during peak hours, which can effectively reduce the load on the system's wireless backhaul links and improve user experience. How to make full use of the limited storage space to cach...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L29/08; H04W28/14
CPC: H04L67/1097; H04W28/14; H04L67/108; H04L67/06; H04L67/568; H04L67/5682
Inventors: 潘志文, 高深, 刘楠, 尤肖虎
Owner: SOUTHEAST UNIV