
Low-delay task scheduling method for a dynamic fog computing network

A task scheduling and fog computing technology, applied in electrical components, transmission systems, and the like; it addresses problems such as the shortening of equipment service life, and achieves the effects of extending service life and reducing energy overhead.

Active Publication Date: 2019-05-14
SHANGHAI TECH UNIV

AI Technical Summary

Problems solved by technology

Continuously broadcasting and monitoring node status information in real time generates a large amount of energy consumption and shortens the service life of the equipment. In future ultra-large-scale systems, this problem will become even more prominent.
[0006] In practice, nodes dynamically enter and exit the network, and the shareable computing resources change accordingly. At the same time, it is desirable to minimize the energy consumption in the network and prolong the service life of the equipment. Although this problem is the one closest to actual demand and most in need of a solution, few scholars have studied it.

Method used



Examples


Embodiment Construction

[0030] The present invention will be further described below in conjunction with specific embodiments.

[0031] Figure 1 is the flowchart of the low-latency task scheduling method for dynamic fog computing networks provided in this embodiment. The word "dynamic" carries three meanings here: (1) the motion state of nodes in the network is variable; (2) the size of the network is variable, i.e., nodes can freely enter and exit the network; (3) the computing resources offered by the helper nodes can change.
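As a rough illustration of these three kinds of dynamics (not taken from the patent itself; the class, attribute names, and all numeric parameters below are hypothetical), a minimal Python sketch of such a time-varying helper-node population might look as follows:

```python
import random
from dataclasses import dataclass

@dataclass
class HelperNode:
    """Hypothetical helper node: mobile, free to leave, with time-varying compute."""
    node_id: int
    position: float     # 1-D position, for simplicity
    speed: float        # (1) the motion state of a node is variable
    cpu_cycles: float   # (3) the shareable computing resources can change

def step_network(nodes, dt=1.0, departure_prob=0.1, arrival_prob=0.2):
    """Advance the dynamic fog network by one time slot."""
    for n in nodes:
        n.position += n.speed * dt                                         # (1) nodes move
        n.cpu_cycles = max(0.0, n.cpu_cycles + random.uniform(-0.5, 0.5))  # (3) resources fluctuate
    nodes = [n for n in nodes if random.random() > departure_prob]         # (2) nodes may leave ...
    if random.random() < arrival_prob:                                     # ... or new nodes may join
        nodes.append(HelperNode(node_id=random.randrange(10**6),
                                position=random.uniform(-10.0, 10.0),
                                speed=random.uniform(-1.0, 1.0),
                                cpu_cycles=random.uniform(1.0, 5.0)))
    return nodes
```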

[0032] As an example, this embodiment provides a system block diagram as shown in Figure 2. Assume that at time t there are N (N a positive integer) candidate helper nodes within the communication range of the task node. Since nodes are mobile, the set of nodes within the communication range of the task node changes over time; as shown in Figure 2, the upper-left node may enter the communication range of the task node in the n...
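Continuing the hypothetical HelperNode model sketched above, the candidate set at time t would simply be the helper nodes currently inside the task node's communication range; the 1-D distance model and the range value are illustrative assumptions, not values from the patent.

```python
def candidate_helpers(task_position, nodes, comm_range=5.0):
    """Return the N candidate helper nodes within communication range at this time slot.

    As nodes move, join, and leave the network, the returned candidate set
    (and hence N) changes from one time slot to the next.
    """
    return [n for n in nodes if abs(n.position - task_position) <= comm_range]
```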



Abstract

The invention provides a low-delay task scheduling method for a dynamic fog computing network, characterized in that a help node in the network cannot broadcast its own node state information in real time, such as task queue information and shareable computing resource information, nor can it respond to requests for that information in real time. Whenever a task offloading requirement arises, the task node must make an offloading decision in real time and select, from the current candidate help nodes, the nodes to which the task will be offloaded. Because the states of the help nodes are unknown to the task node and the tasks have delay requirements, the task node needs to learn well from previous task offloading experiences to inform the current decision. The invention provides a one-to-many task offloading algorithm based on an online learning method, applicable to both static and dynamically changing fog computing or edge computing networks; the method can greatly reduce the energy overhead caused by information transmission in the network and prolong the service time of task nodes and help nodes.
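The abstract does not disclose which online-learning rule the task node uses, so the sketch below substitutes a standard multi-armed-bandit (UCB-style) score purely as a stand-in: the task node keeps per-helper delay statistics from past offloads and, for each new task, picks the k most promising of the current candidates (one-to-many offloading). All names, parameters, and the exploration rule are assumptions for illustration, not the patented algorithm.

```python
import math
from collections import defaultdict

class OnlineOffloader:
    """Sketch of an online-learning offloading decision; NOT the patented algorithm.

    The task node keeps running delay statistics for each helper it has used and,
    whenever a task arrives, chooses k helpers among the current candidates using a
    UCB/LCB-style score (an assumed stand-in for the unspecified learning rule).
    """

    def __init__(self, explore_weight=2.0):
        self.counts = defaultdict(int)        # how many times each helper was chosen
        self.mean_delay = defaultdict(float)  # running average of observed delay
        self.t = 0                            # number of offloading decisions so far
        self.explore_weight = explore_weight

    def select(self, candidate_ids, k):
        """One-to-many offloading: pick k helpers from the current candidate set."""
        self.t += 1

        def score(h):
            if self.counts[h] == 0:
                return float("-inf")          # always try an unseen helper first
            bonus = math.sqrt(self.explore_weight * math.log(self.t) / self.counts[h])
            return self.mean_delay[h] - bonus  # optimistic (low) estimate of its delay

        return sorted(candidate_ids, key=score)[:k]

    def update(self, helper_id, observed_delay):
        """Learn from a finished offload: fold the observed delay into the statistics."""
        self.counts[helper_id] += 1
        n = self.counts[helper_id]
        self.mean_delay[helper_id] += (observed_delay - self.mean_delay[helper_id]) / n
```

In a simulation loop, the task node would call select() on the candidate set of the current time slot and call update() for each helper once that sub-task's offloading delay is observed; over time, the learned statistics stand in for the state information that help nodes do not broadcast.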

Description

Technical Field

[0001] The invention belongs to the field of computing communication networks, and in particular relates to a task offloading algorithm aimed at reducing the average task offloading delay.

Background Technique

[0002] With the rise and development of technologies such as the intelligent Internet of Things, 5G, and artificial intelligence, processing massive and diverse data while meeting ultra-low service latency requirements has become an increasingly urgent problem. The traditional centralized cloud computing architecture suffers from large delays due to the long distance between the terminal device and the cloud server, and it is difficult for it to independently meet the needs of delay-sensitive services ... In this context, fog computing came into being. With its distributed framework and low-latency services, fog computing is expected to become a key technology supporting future intelligent IoT, 5G and artificial intelligence applications, and has rece...

Claims


Application Information

IPC(8): H04L29/08
Inventors: 谭友钰, 王昆仑, 杨旸, 周明拓, 罗喜良
Owner: SHANGHAI TECH UNIV