Learning-based low-delay task scheduling method in edge computing network

A task scheduling technology for edge computing, applied in the field of mobile computing, which addresses problems such as heuristic algorithms being sensitive to environmental changes and difficult to design.

Active Publication Date: 2019-07-05
CENT SOUTH UNIV

AI Technical Summary

Problems solved by technology

Among existing task scheduling methods, heuristic algorithms are easily affected by environmental changes and difficult to design. The invention instead uses reinforcement learning technology to design the task scheduling scheme.

Method used


Image

(Three patent drawings accompany the learning-based low-delay task scheduling method; not reproduced here.)
Examples


Embodiment 1

[0032] This embodiment discloses a learning-based low-latency task scheduling method in an edge computing network. Mobile smart terminals held by multiple users connect through a wireless access point to a multi-resource server (EC server) cluster in the edge computing network. The system state retains only the N tasks that have arrived at each time; information about tasks beyond N is placed in a backlog part, in which only the number of tasks is counted. N tasks are scheduled at each time step, and the agent is allowed to execute multiple actions a per time step. At each time step t, time is frozen: as long as valid actions are chosen, time does not advance; when an invalid action is selected or an inappropriate task is attempted to be scheduled, the cluster image moves forward one step. Each valid action thus corresponds to the agent making an effective decision, after which the agent observes the state transition, i.e., the task is scheduled to the appropriate position in the cluster image; the reward is set at each ...
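The frozen-time, multi-action-per-step formulation above resembles the DeepRM style of cluster-image scheduling. A minimal sketch of such an environment (the class name, dimensions, and feasibility rule are illustrative assumptions, not the patent's exact design):

```python
import numpy as np

class SchedulerEnv:
    """Toy cluster-image scheduling environment: an occupancy image,
    N visible task slots, and a backlog counter. Illustrative only."""

    def __init__(self, n_resources=2, horizon=20, capacity=10, n_slots=5):
        self.R, self.T, self.C, self.N = n_resources, horizon, capacity, n_slots
        # cluster image: remaining capacity per resource per future time step
        self.cluster = np.full((self.R, self.T), self.C, dtype=int)
        self.slots = [None] * self.N   # visible tasks: (duration, per-resource demand)
        self.backlog = 0               # beyond N, only the task count is kept

    def add_task(self, task):
        for i in range(self.N):
            if self.slots[i] is None:
                self.slots[i] = task
                return
        self.backlog += 1

    def step(self, a):
        """Try to schedule the task in slot a. Returns True for a valid
        decision (time stays frozen so the agent can act again); an invalid
        action advances the cluster image one step and returns False."""
        task = self.slots[a] if 0 <= a < self.N else None
        if task is not None:
            dur, demand = task
            need = np.array(demand)[:, None]
            # earliest start where every resource fits for the whole duration
            for t in range(self.T - dur + 1):
                if np.all(self.cluster[:, t:t + dur] >= need):
                    self.cluster[:, t:t + dur] -= need
                    self.slots[a] = None
                    return True
        # invalid action (empty slot or no room): time moves forward one step
        self.cluster = np.hstack([self.cluster[:, 1:],
                                  np.full((self.R, 1), self.C, dtype=int)])
        return False
```

Scheduling a task consumes capacity in the image at its earliest feasible start; choosing an empty slot is the "invalid action" that unfreezes time.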

Embodiment 2

[0053] As shown in figure 1, mobile smart terminals held by multiple users are connected through a wireless access point to a server (EC server) cluster in the edge computing network, and the EC server cluster is a multi-resource cluster. Tasks arrive at the edge server cluster dynamically and online, and once a task is scheduled it cannot be preempted. We assume an edge server cluster with three types of resources (CPU, memory, I/O). Tasks generated by mobile smart terminals arrive at the edge network server cluster online at discrete time steps, and one or more tasks are selected for scheduling. The resource requirements of each task are assumed to be known upon arrival. For a smart mobile terminal i, the task it generates is denoted as A_i = (d_i, c_i, r_i), where d_i represents the data size of task A_i, c_i represents the total number of CPU cycles required to complete task A_i, and r_i represents the I/O resources required by task A_i.
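The task tuple A_i = (d_i, c_i, r_i) maps directly to a small data type. A sketch under our own assumptions (the field names and the delay helper are illustrative; the patent does not specify this formula):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Task:
    """Task A_i = (d_i, c_i, r_i); field names are ours, the tuple
    semantics follow the patent text."""
    data_size: float    # d_i: data size of task A_i (e.g., in bits)
    cpu_cycles: float   # c_i: total CPU cycles to complete task A_i
    io_demand: float    # r_i: I/O resources task A_i requires

def completion_time(task: Task, cpu_hz: float, bandwidth_bps: float) -> float:
    """Hypothetical delay estimate: transmission time to the EC server
    plus compute time on it (I/O contention ignored in this sketch)."""
    return task.data_size / bandwidth_bps + task.cpu_cycles / cpu_hz
```

For example, a 1 Mb task needing 2e9 cycles on a 2 GHz server over a 10 Mbps link takes roughly 0.1 s of transmission plus 1 s of computation.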

[0054] This paper hopes to min...



Abstract

The invention discloses a learning-based low-delay task scheduling method in an edge computing network. It aims to solve the problem that, in existing task scheduling methods, heuristic algorithms are easily influenced by environmental changes and are difficult to design, and it instead designs a task scheduling scheme using reinforcement learning (RL). Decisions made by the system in resource management are generally highly repetitive, so a large amount of training data can be generated for an RL algorithm. RL can then model the decision policy of the complex system as a deep neural network, and through continuous interactive learning with the environment it can optimize a specific target (minimum delay).
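Training a neural-network policy by interaction, as the abstract describes, is the standard policy-gradient setup. A toy REINFORCE sketch in which a linear policy stands in for the deep network (all dimensions, the learning rate, and the reward convention are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Softmax policy over N task slots plus a no-op, on a flattened state image.
STATE_DIM, N_ACTIONS = 40, 6
W = np.zeros((N_ACTIONS, STATE_DIM))

def policy(state):
    """Action probabilities pi(a|s) from a linear-softmax policy."""
    logits = W @ state
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def reinforce_update(trajectory, lr=0.01):
    """REINFORCE: raise the log-probability of each taken action in
    proportion to the return that followed it (return = negative total
    delay in this setting). trajectory is [(state, action, reward), ...]."""
    global W
    returns, g = [], 0.0
    for _, _, r in reversed(trajectory):
        g += r
        returns.append(g)
    returns.reverse()
    for (state, action, _), g in zip(trajectory, returns):
        p = policy(state)
        grad_log = -np.outer(p, state)   # d log pi / dW for all actions
        grad_log[action] += state        # extra term for the taken action
        W += lr * g * grad_log
```

A single update on a positively rewarded action makes that action more probable in the same state, which is the mechanism the abstract relies on to minimize delay over many repeated decisions.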

Description

technical field

[0001] The invention relates to the technical field of mobile computing, in particular to a learning-based low-latency task scheduling method in an edge computing network.

Background technique

[0002] In recent years, with the development of information technology, mobile smart devices have shown explosive growth, which has in turn stimulated the emergence of many new applications, such as virtual reality, augmented reality, and mobile interactive games. Users are very sensitive to the delay of these interactive applications/services. Edge computing is a new type of distributed computing architecture that aims to transfer the control of computing, applications, data and services from central nodes of the Internet (the "core") to the other logical extreme (the "edge"), near mobile smart devices and end users. Offloading the tasks of mobile smart devices to the edge nodes of the network can effectively solve the delay problem, and a reasonabl...

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Applications (China)
IPC(8): G06F9/50
CPC: G06F9/5038
Inventor: 孙子惠, 邓晓衡, 罗杰
Owner: CENT SOUTH UNIV