Cloud-edge-end based federated learning computation offloading system and method

This is a computation offloading and edge-end technology, applied in computing, program control design, program loading/starting, etc. It addresses problems such as the limited wireless and computing capabilities of edge servers, the uncertain data volume of offloading tasks, the difficulty of making accurate computation offloading decisions, and the inability to serve all terminals.

Status: Inactive | Publication Date: 2021-05-18
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

Although edge servers can provide cloud-like functions to terminals, their inherently limited wireless and computing capabilities mean they may not be able to serve all terminals.
On the one hand, the uncertain data volume of offloading tasks and time-varying channel conditions make it difficult to make accurate decisions for computing offloading.
On the other hand, in the distributed heterogeneous edge infrastructure, there is a risk of interception and leakage of user personal sensitive information during the offloading process.

Examples

Embodiment

[0091] Firstly, a BiLSTM model is trained on each local device (client) using historical offloading tasks, and a global model is formed by aggregation on the edge servers and the cloud server. When the next offloading task arrives, the aggregated global model is used to predict it, and the prediction serves as guidance for the computation offloading decision and resource allocation. During training, the gradient data are compressed through a data sparsification method before each upload, which greatly reduces communication overhead, speeds up model convergence, and lowers the complexity of the offloading decision and resource allocation.
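To make the training and upload step above concrete, the following is a minimal PyTorch sketch of a per-client BiLSTM task predictor together with a top-k gradient sparsification step, read as one plausible interpretation of the "data sparse method"; the class and function names, the input window format and the k_ratio value are illustrative assumptions rather than details taken from the patent.

```python
import torch
import torch.nn as nn

class TaskPredictor(nn.Module):
    """Small BiLSTM that maps a window of past task data volumes to the next one."""
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x):                # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # predicted data volume of the next task

def sparsify_gradients(model: nn.Module, k_ratio: float = 0.01):
    """Keep only the largest-magnitude k_ratio fraction of each gradient tensor
    and return (indices, values) pairs that a client would upload."""
    compressed = []
    for p in model.parameters():
        if p.grad is None:
            continue
        g = p.grad.detach().flatten()
        k = max(1, int(k_ratio * g.numel()))
        _, idx = torch.topk(g.abs(), k)
        compressed.append((idx, g[idx]))  # raw gradient values at the kept positions
    return compressed
```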

[0092] The present invention establishes a complete pipeline from model training and prediction to communication optimization, so that computation offloading and resource allocation can finally be solved quickly. The framework we consider corresponds to a static IoT network under the current 5G-driven MEC network, wher...
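As a rough illustration of how the predicted task information could guide the offloading decision, the sketch below compares a predicted local execution latency with a predicted transmission-plus-edge latency; the parameter names and the simple latency formulas are assumptions for illustration and do not reproduce the patent's actual optimization model.

```python
def should_offload(pred_bits: float,
                   cycles_per_bit: float,
                   local_cpu_hz: float,
                   edge_cpu_hz: float,
                   uplink_rate_bps: float) -> bool:
    """Offload when the predicted edge-side latency (uplink transmission plus
    edge computation) beats the predicted local execution latency."""
    local_latency = pred_bits * cycles_per_bit / local_cpu_hz
    edge_latency = (pred_bits / uplink_rate_bps
                    + pred_bits * cycles_per_bit / edge_cpu_hz)
    return edge_latency < local_latency
```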

Abstract

The invention discloses a cloud-edge-end based federated learning computation offloading and resource allocation system and method, which aims to make accurate decisions for computation task offloading and resource allocation, eliminate the need to solve a combinatorial optimization problem, and greatly reduce computational complexity. Based on three-layer cloud-edge-end federated learning, it exploits both the proximity of edge nodes to the terminals and the powerful core computing resources of the cloud, compensating for the insufficient computing resources of the edge nodes. A local model is trained at each of multiple clients to predict offloading tasks. A global model is formed by periodically performing parameter aggregation at the edge, and the cloud performs one aggregation after each edge aggregation period, until a global BiLSTM model converges. This global model can intelligently predict the information amount of each offloading task, thereby providing better guidance for computation offloading and resource allocation.
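The two-level aggregation schedule described above (clients aggregated at each edge server, edge models aggregated at the cloud) can be sketched as a minimal unweighted FedAvg in PyTorch; the function names are illustrative, and the sketch omits the round scheduling and any weighting by client data size that a full implementation would need.

```python
import copy
import torch

def average_state_dicts(state_dicts):
    """Plain, unweighted FedAvg over a list of model state_dicts."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
    return avg

def hierarchical_round(edge_groups):
    """edge_groups: one list of client state_dicts per edge server.
    Returns the global model after one edge-then-cloud aggregation."""
    edge_models = [average_state_dicts(clients) for clients in edge_groups]
    return average_state_dicts(edge_models)
```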

Description

technical field [0001] The invention relates to computation offloading and resource allocation in a mobile edge computing network driven by a 5G network, and in particular to a cloud-edge-end based federated learning computation offloading system and method. Background technique [0002] In recent years, driven by the popularity of the Internet of Things, the data generated at the edge of the network has exploded. The inability to guarantee low latency and location awareness cripples traditional cloud computing solutions. According to IDC's forecast, by the end of 2020 more than 50 billion terminals and devices will be connected to the Internet, and more than 50% of the data will need to be analyzed, processed and stored at the edge of the network. However, the traditional two-body "end-cloud" collaborative computing model can no longer meet the needs of low latency and high bandwidth. Mobile Edge Computing (MEC) is becoming a new and compelling computing pa...

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/445; G06F9/48; G06F9/50; G06F9/54
CPC: G06F9/44594; G06F9/485; G06F9/5027; G06F9/542; G06F2209/509
Inventors: 伍卫国, 张祥俊, 柴玉香, 杨诗园, 王雄
Owner: XI AN JIAOTONG UNIV