
Computation offloading and resource allocation method and device based on deep reinforcement learning

A technology combining deep reinforcement learning and computation offloading, applied in the field of mobile communication, which addresses the problem of high UE energy consumption and achieves the effect of reducing that energy consumption.

Pending Publication Date: 2020-07-10
CHINA THREE GORGES UNIV

AI Technical Summary

Problems solved by technology

[0004] Existing solutions focus only on the performance of quasi-static systems, and ignore the impact of differing resource requirements and limited resource capacity on the performance of MEC systems in practical network applications. As a result, the technical problem of excessive UE energy consumption remains.
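To make the constraints concrete, the kind of joint problem described here is typically written as a constrained energy minimization. The formulation below is only an illustrative sketch under assumed notation (offloading decisions x_i, allocated compute and spectrum fractions p_i^c and p_i^s, task delays T_i with thresholds T_i^max, per-UE local and offloading energies E_i^local and E_i^off); it is not the patent's exact model:

$$
\begin{aligned}
\min_{\{x_i,\,p_i^{c},\,p_i^{s}\}} \quad & \sum_{i=1}^{N} \left[(1 - x_i)\,E_i^{\mathrm{local}} + x_i\,E_i^{\mathrm{off}}(p_i^{s})\right] \\
\text{s.t.} \quad & T_i(x_i, p_i^{c}, p_i^{s}) \le T_i^{\max} \quad \forall i, \\
& \sum_{i=1}^{N} x_i\,p_i^{c} \le 1, \qquad \sum_{i=1}^{N} x_i\,p_i^{s} \le 1, \\
& x_i \in \{0,1\}, \quad p_i^{c},\, p_i^{s} \in [0,1].
\end{aligned}
$$

The binary decisions x_i couple the energy objective to the shared compute and spectrum capacity constraints, which is why a solution tuned for a quasi-static system degrades once task requirements and channel conditions vary over time.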

Embodiment Construction

[0037] In order to make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0038] With the emergence of many new wireless services in 5G networks, mobile applications, especially increasingly computation-intensive tasks such as online interactive gaming, face recognition, and augmented/virtual reality (AR/VR), have led to an unprecedented explosive growth of data traffic. Generally, these emerging applicatio...

Abstract

The invention provides a computation offloading and resource allocation method and device based on deep reinforcement learning. The method comprises the steps of: constructing an optimization problem model based on the computation task parameters of the UE, the performance parameters of the UE, the channel parameters between the UE and the AP, and the total computing resources of the mobile edge computing (MEC) server; and determining the optimal solution of the optimization problem model based on deep reinforcement learning, thereby determining the offloading decision of the UE and allocating to the UE its percentage of the computing resources and its percentage of the spectrum resources, respectively. The method and device take into account the practical computation offloading and resource allocation characteristics of a time-varying MEC system. Subject to the delay threshold of each task and the limited resource capacity of the system, a DNN is used to effectively approximate the value function in reinforcement learning so as to determine a joint optimal scheme of computation offloading and resource allocation, thereby further reducing the energy consumption of the UE.
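As a rough illustration of how a DNN can approximate the value function over joint offloading and allocation decisions, the following Python sketch sets up a small DQN-style agent. The state layout, the discretization of the compute and spectrum percentages, the network sizes, and the synthetic transitions are all assumptions made for this example; they are not taken from the patent, which only specifies that a DNN approximates the value function in reinforcement learning.

```python
# Hypothetical sketch of a DRL-based joint offloading / resource-allocation step.
# State features, action discretization, and the reward are assumptions for illustration.
import random
import torch
import torch.nn as nn

N_COMPUTE_LEVELS = 5    # assumed discretization of the MEC compute-resource percentage
N_SPECTRUM_LEVELS = 5   # assumed discretization of the spectrum-resource percentage
# Joint action = (offload?, compute %, spectrum %); local execution collapses to one action.
N_ACTIONS = 1 + N_COMPUTE_LEVELS * N_SPECTRUM_LEVELS
STATE_DIM = 5           # e.g. task bits, CPU cycles, delay threshold, channel gain, free MEC capacity

class QNetwork(nn.Module):
    """DNN that approximates the action-value function Q(s, a)."""
    def __init__(self, state_dim: int, n_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_actions),
        )
    def forward(self, s: torch.Tensor) -> torch.Tensor:
        return self.net(s)

def select_action(q_net: QNetwork, state: torch.Tensor, epsilon: float) -> int:
    """Epsilon-greedy choice over the joint offloading / allocation actions."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return int(q_net(state).argmax().item())

def decode_action(a: int):
    """Map a discrete action index back to (offload, compute %, spectrum %)."""
    if a == 0:
        return False, 0.0, 0.0                      # execute locally on the UE
    c, s = divmod(a - 1, N_SPECTRUM_LEVELS)
    return True, (c + 1) / N_COMPUTE_LEVELS, (s + 1) / N_SPECTRUM_LEVELS

def td_update(q_net, target_net, optimizer, batch, gamma=0.99):
    """One temporal-difference step: fit Q(s,a) toward r + gamma * max_a' Q_target(s',a')."""
    s, a, r, s_next = batch
    q_sa = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = r + gamma * target_net(s_next).max(dim=1).values
    loss = nn.functional.mse_loss(q_sa, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return float(loss)

if __name__ == "__main__":
    q_net, target_net = QNetwork(STATE_DIM, N_ACTIONS), QNetwork(STATE_DIM, N_ACTIONS)
    target_net.load_state_dict(q_net.state_dict())
    optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
    state = torch.rand(STATE_DIM)                   # placeholder observation of the MEC system
    action = select_action(q_net, state, epsilon=0.1)
    print("chosen joint action:", decode_action(action))
    # Synthetic transitions; the reward stands in for negative UE energy consumption.
    batch = (torch.rand(8, STATE_DIM), torch.randint(N_ACTIONS, (8,)),
             -torch.rand(8), torch.rand(8, STATE_DIM))
    print("TD loss:", td_update(q_net, target_net, optimizer, batch))
```

In a full system the reward would come from the UE energy actually spent (with a penalty whenever the task delay threshold is violated), and the transitions would be collected from the time-varying MEC environment rather than generated randomly.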

Description

Technical Field
[0001] The present invention relates to the technical field of mobile communication, and in particular to a method and device for computation offloading and resource allocation based on deep reinforcement learning.
Background
[0002] To alleviate the increasingly severe conflict between application requirements and resource-constrained User Equipment (UE), Mobile Cloud Computing (MCC) emerged as an effective solution, since the computing power and storage capacity of the cloud servers deployed in MCC are clearly greater than those of the UE. However, MCC inevitably faces the problem that the deployed cloud servers are far away from the user equipment, which may incur additional transmission energy overhead when the user equipment transmits data to the cloud server. In addition, long-distance transmission cannot guarantee the Quality of Service (QoS) of delay-sensitive applications.
[0003] In...
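The transmission trade-off sketched in the background can be illustrated with a rough back-of-the-envelope calculation. All parameter values below (task size, CPU cycles, transmit power, uplink rate, chip constant) are assumptions chosen for illustration, not figures from the patent; the point is only that offloading replaces local computing energy with transmission energy, and that the balance depends on the achievable rate.

```python
# Illustrative local-vs-offload energy comparison; every parameter value is an assumption.
TASK_BITS = 2e6            # task input size (bits)
CPU_CYCLES = 1e9           # CPU cycles needed to finish the task
F_LOCAL = 1e9              # UE CPU frequency (cycles/s)
KAPPA = 1e-27              # effective switched-capacitance coefficient of the UE chip
P_TX = 0.5                 # UE transmit power (W)
RATE = 5e6                 # uplink data rate (bit/s)

# Local execution: energy grows with the cycle count and the square of the CPU frequency.
e_local = KAPPA * CPU_CYCLES * F_LOCAL ** 2
t_local = CPU_CYCLES / F_LOCAL

# Offloading: the UE itself only pays the energy of transmitting the input data.
t_tx = TASK_BITS / RATE
e_offload = P_TX * t_tx

print(f"local:   {e_local:.3f} J in {t_local:.2f} s")
print(f"offload: {e_offload:.3f} J of UE energy for {t_tx:.2f} s of transmission")
```

A lower achievable rate (as with a distant cloud server) stretches the transmission time and energy and can also break the delay requirement of delay-sensitive applications, which is the motivation for placing the server at the mobile edge.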

Claims

Application Information

Patent Type & Authority: Application (China)
IPC(8): H04W16/10; H04W16/22
CPC: H04W16/10; H04W16/22; Y02D30/70
Inventors: 周欢, 江恺, 冯阳
Owner: CHINA THREE GORGES UNIV