A base station caching method under time-varying content popularity

A base station caching technology for time-varying content popularity, applied in network traffic/resource management, wireless communication, electrical components, etc. It addresses the problem that existing caching schemes assume a known, static popularity and degrade when popularity changes over time, and it has the effects of improving user satisfaction, increasing the cache hit rate, and easing the backhaul link load.

Active Publication Date: 2021-06-11
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

However, most existing caching research studies caching performance when the content popularity is known and does not change over time; when the content popularity varies with time, the performance of such schemes still needs to be improved.



Embodiment Construction

[0022] The technical solution of the invention is further described below with reference to the accompanying drawings and specific embodiments.

[0023] Consider a macro cellular network in which N helpers are deployed, the helpers being indexed by a set. As shown in Figure 1, each helper is connected to the macro base station through a reliable backhaul link while providing high-speed data services to the users it serves. Assuming each helper has a fixed cache capacity M, the content controller in the macro base station determines the cached content of each helper according to the cache policy. Time is divided into time slots, and each time slot contains a user request phase and a cache placement phase. In the user request phase, a user served by a helper requests content; if the requested content is stored in the helper, the helper handles the request and transmits the content to the user quickly, without placing any load on the macro cellular network. ...
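
As an illustration of this system model, the following is a minimal, self-contained Python sketch of the time-slotted operation described above: helpers with a fixed cache capacity M, a user request phase, and a controller-driven cache placement phase. The class and function names, the random request model, and the placeholder placement policy are assumptions for illustration only, not part of the patent.

import random

CACHE_CAPACITY_M = 5                 # fixed cache capacity M of each helper
NUM_HELPERS_N = 3                    # number of deployed helpers N
CONTENT_LIBRARY = list(range(100))   # hypothetical content identifiers


class Helper:
    """A helper node with a fixed-size cache."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = set()

    def serve(self, content_id):
        """Return True on a cache hit (served locally, no backhaul load)."""
        return content_id in self.cache

    def place(self, contents):
        """Cache placement phase: the content controller overwrites the cache."""
        self.cache = set(contents[: self.capacity])


def run_time_slot(helpers, placement_policy):
    """One time slot: user request phase followed by cache placement phase."""
    hits = misses = 0
    for helper in helpers:
        requested = random.choice(CONTENT_LIBRARY)   # one user request per helper
        if helper.serve(requested):
            hits += 1                                # served directly by the helper
        else:
            misses += 1                              # fetched from the macro base station
    for helper in helpers:                           # placement phase
        helper.place(placement_policy(helper))
    return hits, misses


if __name__ == "__main__":
    helpers = [Helper(CACHE_CAPACITY_M) for _ in range(NUM_HELPERS_N)]
    # Placeholder policy: always cache the first M contents; the invention replaces
    # this with popularity estimates (multi-armed bandit / latent semantic model).
    naive_policy = lambda h: CONTENT_LIBRARY
    for _ in range(5):
        print(run_time_slot(helpers, naive_policy))

A smarter placement_policy, driven by the popularity estimates described in the Abstract below, is what the invention substitutes for the naive policy used here.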



Abstract

The invention discloses a base station caching method for the case where content popularity varies with time. In each time period, the popularity of already-cached content is estimated with a multi-armed bandit algorithm according to the instantaneous request frequency of that content, while newly generated content is stored in the helpers in sequence. Because the request frequency differs from helper to helper, a latent semantic model is used to estimate the frequency with which users under each helper request the new content, and the content controller updates the cached content according to the estimated content popularity. The invention can estimate content popularity online in real time and cache the most popular content in real time, thereby increasing the cache hit rate, improving user satisfaction, and easing the backhaul link load.
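
Below is a minimal sketch of the two estimation steps mentioned in the abstract: a UCB-style multi-armed bandit estimate of popularity for already-cached content from its observed request frequency, and a small latent semantic (matrix-factorization) model that predicts how often users under each helper will request newly generated content. The abstract does not disclose the exact formulas, so the UCB index, the factorization form, and all names below are assumptions.

import math
import numpy as np


class BanditPopularity:
    """UCB-style popularity index for content that has been cached before."""

    def __init__(self):
        self.requests = {}   # content_id -> total observed requests while cached
        self.slots = {}      # content_id -> number of slots the content was cached
        self.t = 0           # total number of observed time slots

    def observe(self, cached_contents, requests_this_slot):
        """Record the instantaneous request frequency of the cached content."""
        self.t += 1
        for c in cached_contents:
            self.slots[c] = self.slots.get(c, 0) + 1
            self.requests[c] = self.requests.get(c, 0) + requests_this_slot.get(c, 0)

    def score(self, c):
        """Estimated request rate plus an exploration bonus (UCB1-style)."""
        n = self.slots.get(c, 0)
        if n == 0:
            return float("inf")                       # force exploration of unseen content
        return self.requests[c] / n + math.sqrt(2.0 * math.log(self.t) / n)


class LatentSemanticModel:
    """Low-rank model: request rate of helper i for content j ~ U[i] . v_j."""

    def __init__(self, num_helpers, rank=4, lr=0.01, seed=0):
        self.rng = np.random.default_rng(seed)
        self.U = self.rng.normal(scale=0.1, size=(num_helpers, rank))  # helper factors
        self.rank, self.lr = rank, lr

    def fit_new_content(self, observed, epochs=100):
        """Fit a factor v for one new content from {helper index: observed rate}."""
        v = self.rng.normal(scale=0.1, size=self.rank)
        for _ in range(epochs):
            for i, rate in observed.items():
                err = rate - self.U[i] @ v            # prediction error
                self.U[i] += self.lr * err * v        # update helper factor
                v += self.lr * err * self.U[i]        # update content factor
        return v

    def predict(self, v):
        """Predicted request rate of this content under every helper."""
        return self.U @ v

In each time slot, the content controller could rank already-cached content by score and new content by predict, then keep the top M per helper; a sketch of that placement step follows the Description section below.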

Description

Technical Field

[0001] The invention relates to the technical field of mobile communication systems, and in particular to a base station caching method for the case where content popularity changes with time.

Background

[0002] To cope with the challenge to system capacity posed by massive data growth, an effective solution is to deploy helpers around macro base stations; the helpers have cache capacity and can cache content. When the content requested by a user is stored in a helper's cache, the content is transmitted directly from the helper to the user, which does not occupy backhaul link resources, reduces transmission delay, and improves the user experience. If the content requested by the user is not in the helper, the request is sent to the macro base station and the requested content is downloaded from there. Proactive storage at the base station places content in the helpers before requests arrive, which can reduce the...
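
To make the proactive storage step concrete, here is a short sketch: before requests arrive, the content controller fills each helper's cache with the M contents it currently estimates as most popular for that helper. The function name and the shape of the rate estimates are assumptions for illustration; the estimates could come from the bandit / latent semantic sketches above.

def proactive_placement(helper_ids, estimated_rate, capacity_m):
    """Fill each helper's cache with its top-M contents before requests arrive.

    estimated_rate maps (helper_id, content_id) -> predicted request rate.
    """
    placements = {}
    for h in helper_ids:
        candidates = {c for (hid, c) in estimated_rate if hid == h}
        ranked = sorted(candidates, key=lambda c: estimated_rate[(h, c)], reverse=True)
        placements[h] = ranked[:capacity_m]        # cached before the request phase
    return placements


# Toy example: two helpers, cache capacity 2.
rates = {(0, "a"): 5.0, (0, "b"): 1.0, (0, "c"): 3.0,
         (1, "a"): 0.5, (1, "b"): 4.0, (1, "c"): 2.0}
print(proactive_placement([0, 1], rates, 2))       # {0: ['a', 'c'], 1: ['b', 'c']}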

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04W28/14
CPC: H04W28/14
Inventor: 刘楠 (Liu Nan), 牛岩 (Niu Yan), 潘志文 (Pan Zhiwen), 尤肖虎 (You Xiaohu)
Owner: SOUTHEAST UNIV