Active content caching method based on federated learning

A content caching and federated learning technology, applied in ensemble learning, instrumentation, digital data processing, etc., which addresses the problems of data being attacked or intercepted during transmission, the difficulty of collecting users' historical demand data, and the difficulty of realizing large-scale learning.

Active Publication Date: 2020-10-30
DALIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

Although these two types of centralized learning methods improve cache efficiency, they suffer from two problems. First, in wireless communication networks, data is generated by billions of devices. Maintaining algorithm efficiency on data of that scale requires a powerful central machine-learning processor and incurs huge communication overhead, which makes large-scale learning difficult to realize in practice.
Second, because users' historical demand data usually involves their privacy, users are unwilling to share it; this distrust of the server makes collecting historical demand data very difficult.
Data sharing must then rely on an intermediate medium, and even with encrypted transmission the data is likely to be attacked or intercepted in transit, which increases the risk of external data leakage.

Method used



Examples


Embodiment Construction

[0072] The present invention will be further described below in conjunction with specific examples.

[0073] Taking the dataset Movielens as an example, the Movielens 100K dataset contains 100,000 ratings for 1,682 movies by 943 users. Each dataset entry consists of a user ID, a movie ID, a rating, and a timestamp. In addition, it provides the user's demographic information, such as gender, age, and occupation. Because users usually rate movies after watching them, we assume that movies represent files requested by users, and popular movie files are files that need to be cached in the edge server base station.
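For illustration only, the following is a minimal sketch of turning the MovieLens 100K ratings file into the user-file request matrix that the steps below operate on. The path "ml-100k/u.data" and the tab-separated format follow the standard GroupLens release; nothing in this snippet is specified by the patent itself.

```python
import numpy as np
import pandas as pd

# Load the MovieLens 100K ratings (user id, movie id, rating, timestamp).
# The path "ml-100k/u.data" assumes the standard GroupLens release layout.
ratings = pd.read_csv(
    "ml-100k/u.data",
    sep="\t",
    names=["user_id", "movie_id", "rating", "timestamp"],
)

n_users = int(ratings["user_id"].max())    # 943 users
n_movies = int(ratings["movie_id"].max())  # 1682 movies

# Build a 943 x 1682 user-movie rating matrix; a rating is treated as a
# file request, so unrated entries stay 0 (file never requested).
R = np.zeros((n_users, n_movies), dtype=np.float32)
R[ratings["user_id"] - 1, ratings["movie_id"] - 1] = ratings["rating"]

print(R.shape, int((R > 0).sum()))  # (943, 1682) 100000
```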

[0074] An active content caching method based on federated learning, comprising the following steps:

[0075] Step 1: Information collection and model building

[0076] Step 1.1 Collect information: According to the type of information, the information collected by the edge server base station covers two main aspects:

[0077] 1) The access request ...



Abstract

The invention discloses an active content caching method based on federated learning, and belongs to the technical field of wireless communication. The method comprises the following steps: first, in each round of communication, each user downloads the global model, trains it locally with a stacked autoencoder, and obtains a local model together with latent features of the user and the files; second, in each round of communication, each user sends the updated model to the server, and the server aggregates all local models into a new global model; third, after training is finished, each user sends the latent features of the user and the files to the server, which first calculates user similarity and file similarity, then randomly selects a user and uses the decoder of that user's stacked autoencoder to recover a pseudo rating matrix over users and files; finally, the scores of this group of users on all files are calculated with collaborative filtering, and the files with the highest average score are selected for caching. While maintaining the cache hit rate, the method avoids data sharing between neighboring users, so users' private data is kept safer.
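As a rough illustration of the training loop summarized above, the sketch below shows one communication round with a stacked autoencoder trained locally and a FedAvg-style weighted average on the server. The model sizes, optimizer, masked reconstruction loss, and all function names are assumptions made for readability, not the patented implementation.

```python
import torch
import torch.nn as nn


class StackedAutoencoder(nn.Module):
    """Two-layer stacked autoencoder: the encoder output is the latent
    (implicit) feature vector, the decoder reconstructs the rating row."""

    def __init__(self, n_items: int, hidden: int = 256, latent: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_items, hidden), nn.Sigmoid(),
            nn.Linear(hidden, latent), nn.Sigmoid(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent, hidden), nn.Sigmoid(),
            nn.Linear(hidden, n_items),
        )

    def forward(self, x):
        z = self.encoder(x)          # latent features
        return self.decoder(z), z    # reconstruction, latent code


def local_update(global_state, local_ratings, epochs=5, lr=1e-3):
    """One user's local training: start from the global model and fit the
    autoencoder on that user's own 2-D rating rows (data never leaves)."""
    model = StackedAutoencoder(local_ratings.shape[1])
    model.load_state_dict(global_state)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    x = torch.tensor(local_ratings, dtype=torch.float32)
    mask = (x > 0).float()           # only observed ratings contribute
    for _ in range(epochs):
        opt.zero_grad()
        recon, _ = model(x)
        loss = ((recon - x) ** 2 * mask).sum() / mask.sum()
        loss.backward()
        opt.step()
    return model.state_dict(), int(mask.sum())


def aggregate(updates):
    """Server side: FedAvg-style weighted average of the local models,
    producing the next round's global model."""
    total = sum(n for _, n in updates)
    keys = updates[0][0].keys()
    return {k: sum(sd[k].detach() * (n / total) for sd, n in updates)
            for k in keys}
```

After the final round, each user would upload only its latent features; the server computes user and file similarities from them, recovers a pseudo rating matrix with the decoder of a randomly selected user, predicts scores with collaborative filtering, and caches the files with the highest average predicted score, mirroring the third and fourth steps above.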

Description

Technical Field

[0001] The invention belongs to the technical field of wireless communication and relates to an active content caching method based on federated learning.

Background Technique

[0002] Mobile data is currently growing explosively: the total amount of data is huge, and searching for and transmitting it takes a long time. It is therefore necessary to filter data and bring useful data close to the user side so that it can be accessed quickly. Against this surge in mobile data traffic, wireless network content caching technology has emerged; it is very helpful for reducing backhaul traffic load and the service delay experienced by mobile users. Since the capacity of content caching devices is limited, it is important to predict which files are worth caching. However, most traditional content caching algorithms are reactive, responding only to access requests that have already occurred without considering the popularity of future content, such as...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04L12/861, H04L29/08, G06N20/20, G06F21/62
CPC: H04L49/9063, H04L67/10, G06N20/20, G06F21/6245
Inventor: 邓娜, 王凯伦
Owner: DALIAN UNIV OF TECH