
A low-latency task scheduling method for dynamic fog computing network

A task scheduling technique for fog computing, applied in transmission systems, electrical components, etc. It addresses problems such as shortened device service life caused by continuous status broadcasting, and achieves the effects of extending service life and reducing energy overhead.

Active Publication Date: 2021-09-07
SHANGHAI TECH UNIV


Problems solved by technology

Continuously broadcasting and monitoring node status information in real time generates a large amount of energy consumption and shortens the service life of the equipment; in future ultra-large-scale systems, this problem will become even more prominent.

[0006] In practice, nodes dynamically enter and leave the network, and the shareable computing resources change accordingly. At the same time, it is desirable to minimize the energy consumption in the network and prolong the service life of the equipment. This problem is the one closest to actual demand and in need of a solution, yet few scholars have studied it.




Embodiment Construction

[0030] The present invention is further illustrated below in conjunction with a specific embodiment.

[0031] Figure 1 shows the flow chart of the low-latency task scheduling method for the dynamic fog computing network provided in this embodiment. The word "dynamic" in "dynamic fog computing network" carries three meanings: (1) the movement state of the nodes in the network is variable; (2) the size of the network is variable: nodes can freely enter and exit the network; (3) the computing resources that helper nodes can provide can change.

[0032] As an example, this embodiment provides a system block diagram as shown in Figure 2. Assume that at time t, there are N (N a positive integer) candidate helper nodes within the communication range of the task node. Since the nodes are mobile, the set of nodes within the communication range of the task node changes over time; as shown in Figure 2, the node in the upper left corner may enter the communication range of...
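Choosing a helper from a time-varying candidate set, using only past offloading experience, is structurally a multi-armed bandit problem with a changing set of arms. The sketch below is a hypothetical illustration of that idea, not the patent's actual algorithm: the class name, reward definition (e.g. reciprocal of observed delay), and UCB1-style index are all assumptions. It restricts the index rule to the helpers currently in communication range:

```python
import math

class VolatileUCB:
    """UCB1-style helper selection over a changing candidate set.

    Hypothetical sketch: the task node keeps per-helper statistics and,
    at each offloading decision, applies the UCB1 index only to the
    helpers currently within communication range.
    """

    def __init__(self):
        self.counts = {}  # helper id -> number of times chosen
        self.means = {}   # helper id -> empirical mean reward (e.g. 1/delay)
        self.t = 0        # total number of decisions made

    def select(self, candidates):
        """Pick one helper from the currently visible candidate set."""
        self.t += 1
        # Try every never-before-seen helper once before exploiting.
        for h in candidates:
            if self.counts.get(h, 0) == 0:
                return h
        # UCB1 index: empirical mean plus an exploration bonus that
        # shrinks as a helper accumulates observations.
        def ucb(h):
            return self.means[h] + math.sqrt(2 * math.log(self.t) / self.counts[h])
        return max(candidates, key=ucb)

    def update(self, helper, reward):
        """Record the reward observed after offloading to `helper`."""
        n = self.counts.get(helper, 0) + 1
        self.counts[helper] = n
        m = self.means.get(helper, 0.0)
        self.means[helper] = m + (reward - m) / n
```

Because the index is computed only over the current candidates, helpers that leave the network are simply ignored until they reappear, and newly arrived helpers are each tried once before the task node starts exploiting its estimates.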


Abstract

The present invention provides a low-latency task scheduling method for a dynamic fog computing network. Helper nodes in the network do not broadcast their own node status information, such as task queue information and shareable computing resource information, in real time, nor do they respond in real time to requests for this information. Whenever a task offloading requirement arises, the task node must make an offloading decision in real time and select helper nodes from the current candidates. Since the task node does not know the state of the helper nodes, and the task itself has a delay requirement, the task node must learn from its past task offloading experience to inform the current decision. The present invention proposes a one-to-many task offloading algorithm based on an online learning method for a dynamically changing fog computing or edge computing network, which can greatly reduce the energy overhead caused by information dissemination in the network and prolong the service life of both the task node and the helper nodes.
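In a one-to-many offloading scheme, a task is split among several selected helpers, so the task completes only when the slowest helper finishes. The snippet below is a hedged illustration of that delay structure; the splitting rule, parameter names, and the simple transmission-plus-computation delay model are assumptions, not taken from the patent:

```python
def offload_delay(task_bits, cpu_cycles_per_bit, shares, rates, cpu_speeds):
    """Completion delay of one-to-many task offloading (illustrative model).

    A task of `task_bits` bits is split among the selected helpers
    according to `shares` (fractions summing to 1). Helper i receives
    its portion over a link of `rates[i]` bits/s and processes it at
    `cpu_speeds[i]` cycles/s. The overall offloading delay is the
    maximum of the per-helper delays, since the task finishes only
    when the slowest helper finishes.
    """
    delays = []
    for share, rate, speed in zip(shares, rates, cpu_speeds):
        bits = task_bits * share
        tx = bits / rate                          # transmission time (s)
        comp = bits * cpu_cycles_per_bit / speed  # computation time (s)
        delays.append(tx + comp)
    return max(delays)
```

For example, splitting a 1 Mbit task evenly between a helper with a 1 Mbit/s link and one with a 2 Mbit/s link (both at 10^8 cycles/s, 100 cycles/bit) gives per-helper delays of 1.0 s and 0.75 s, so the offloading delay is 1.0 s; a learning-based scheduler would aim to pick helper sets and shares that shrink this maximum.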

Description

Technical field
[0001] The invention belongs to the field of computing and communication networks, and in particular relates to a task offloading algorithm aimed at reducing the average task offloading delay.
Background technique
[0002] With the rise and development of technologies such as the intelligent Internet of Things, 5G, and artificial intelligence, processing massive and diverse data while meeting ultra-low service latency requirements has become an increasingly urgent problem. The traditional centralized cloud computing architecture suffers from large access delays because of the long distance between terminal devices and cloud servers. In this context, fog computing came into being. With its distributed framework and low-latency services, fog computing is expected to become a key technology supporting future intelligent IoT, 5G, and artificial intelligence applications, and has received extensive attention and research in recent years. Fog computi...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04L29/08
Inventors: 谭友钰, 王昆仑, 杨旸, 周明拓, 罗喜良
Owner: SHANGHAI TECH UNIV