
A Data Selective Caching Method Based on Cooperative Caching

A data-selective caching technology applied in wireless communication and transmission systems. It addresses the problems of users not considering each other's cached data and of multiple users caching the same hotspot data, achieving efficient use of memory capacity and maximized cellular traffic offloading.

Active Publication Date: 2021-02-05
SHANGHAI INST OF MICROSYSTEM & INFORMATION TECH CHINESE ACAD OF SCI
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, in this prior technology, users make caching choices independently, without regard to the data other users have already cached. This easily leads to multiple users caching the same hot data, resulting in redundant local caching.




Embodiment Construction

[0039] Preferred embodiments of the present invention are given and described in detail below in conjunction with the drawings.

[0040] As shown in Figure 1, the present invention provides a data selective caching method based on cooperative caching, comprising the following steps:

[0041] Step S1: when the current user receives a request for a data item from an adjacent user, or receives a data item from an adjacent user or a base station, the number of requests for that data item is recorded and updated;

[0042] Step S2: based on the request counts from step S1, the current user predicts the probability that each data item will be requested in the future, obtaining a predicted probability for each item;
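Steps S1 and S2 can be sketched as follows. The patent text does not specify the prediction model, so this sketch assumes the simplest choice: a relative-frequency estimate over the recorded request counts. The class name `RequestTracker` and its methods are hypothetical, not from the patent.

```python
from collections import Counter


class RequestTracker:
    """Tracks per-item request counts (step S1) and derives a
    popularity estimate from them (step S2, assumed form)."""

    def __init__(self):
        self.counts = Counter()

    def record_request(self, data_id):
        # Step S1: update the count whenever the item is requested by
        # a neighbor, or received from a neighbor or the base station.
        self.counts[data_id] += 1

    def predicted_probability(self, data_id):
        # Step S2 (assumption): normalize counts into a relative
        # frequency, used as the predicted request probability.
        total = sum(self.counts.values())
        return self.counts[data_id] / total if total else 0.0
```

Any estimator that maps observed demand to a per-item probability (e.g. an exponentially weighted count) could be substituted here without changing the later steps.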

[0043] Step S3: before caching a data item, the current user queries and collects the cache status of adjacent users and, combining each item's size with its predicted probability from step S2, defines a value for each data item; and

[0044] Step S4: if the current user's memory is not full, the received data item is cached; otherwise, based on the size and value of each data item, a greedy algorithm determines whether to cache the received item by replacing data already in memory.
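Steps S3 and S4 can be sketched as a value-driven greedy replacement. The patent does not give the exact value formula, so this sketch assumes one plausible reading: an item already cached by a neighbor contributes no additional offloading value, and otherwise an item's value is its predicted probability times its size (expected traffic offloaded). The function names and the tuple layout `(id, size, value)` are illustrative, not from the patent.

```python
def data_value(size, predicted_prob, cached_by_neighbor):
    # Step S3 (assumption): a neighbor already holding the item makes
    # local caching redundant; otherwise value = expected offloaded bytes.
    if cached_by_neighbor:
        return 0.0
    return predicted_prob * size


def greedy_cache_decision(memory, capacity, new_item):
    """Step S4 sketch: if the item fits, cache it; otherwise greedily
    pick the lowest value-density residents as eviction candidates and
    replace them only if the newcomer is worth more than they are.
    `memory` is a list of (id, size, value) tuples; returns the
    (possibly updated) memory list."""
    used = sum(size for _, size, _ in memory)
    if used + new_item[1] <= capacity:
        return memory + [new_item]
    # Eviction candidates, cheapest value-per-unit-size first.
    candidates = sorted(memory, key=lambda it: it[2] / it[1])
    freed, evicted = 0, []
    for item in candidates:
        if used - freed + new_item[1] <= capacity:
            break
        evicted.append(item)
        freed += item[1]
    if used - freed + new_item[1] > capacity:
        return memory  # newcomer cannot fit even after evictions
    # Replace only if the newcomer's value exceeds what it displaces.
    if new_item[2] > sum(v for _, _, v in evicted):
        kept = [it for it in memory if it not in evicted]
        return kept + [new_item]
    return memory
```

Sorting by value density is the standard greedy heuristic for this knapsack-like replacement problem; it keeps the decision O(n log n) per arriving item.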



Abstract

The present invention relates to a data selective caching method based on cooperative caching, comprising the following steps. Step S1: when the current user receives a request for a data item from an adjacent user, or receives a data item from an adjacent user or a base station, the number of requests for that item is recorded and updated. Step S2: based on these request counts, the current user predicts the probability that each data item will be requested in the future, obtaining a predicted probability for each item. Step S3: before caching a data item, the current user queries and collects the cache status of adjacent users and, combining each item's size with its predicted probability, defines a value for each item. Step S4: if the current user's memory is not full, the received data item is cached; otherwise, based on the sizes and values of the data items, a greedy algorithm determines whether to cache the received item in place of data already in memory. The invention efficiently utilizes the limited memory capacity of the terminal and maximizes cellular traffic offloading.

Description

Technical Field

[0001] The invention relates to wireless communication technology, and in particular to a data selective caching method based on cooperative caching.

Background Technique

[0002] In recent years, with the increasing popularity of media services such as high-definition video, their high-traffic characteristics have led to a growing shortage of spectrum resources and have placed enormous pressure on operators' core networks. With the rapid spread of ultra-high-definition streaming video and mobile smart terminal devices, the vast majority of data traffic will shift from fixed networks to wireless networks. According to data released by the market research organization Juniper Research, by 2021 it was estimated that more than 90% of people would mainly obtain digital media information through mobile devices, with smartphones and tablets gradually replacing personal computers as the most important compu...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L29/08, H04W28/02
Inventor 谭冲虞新颖刘洪郑敏卜智勇
Owner SHANGHAI INST OF MICROSYSTEM & INFORMATION TECH CHINESE ACAD OF SCI