
A Multi-Edge Node Incremental Computing Offloading Method Oriented to Edge Intelligence

An incremental computing and multi-edge technology, applied in transmission systems, electrical components, etc. It addresses the problem that most edge nodes have very limited computing, storage, or energy resources, and achieves the effects of properly handling concurrency conflicts, ensuring execution efficiency, and improving robustness.

Active Publication Date: 2022-04-01
INNER MONGOLIA UNIV OF TECH
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, most edge nodes have very limited computing, storage or energy resources




Embodiment Construction

[0029] In order to make the purpose, technical solution, and advantages of the present invention clearer, the edge-intelligence-oriented multi-edge node incremental computing offloading method of the present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention, not to limit it.

[0030] As shown in Figure 1, the present invention is an edge-intelligence-oriented multi-edge node incremental computing offloading method, used to improve the execution efficiency and robustness of edge deep neural network (DNN) applications. It consists of two main phases:

[0031] (1) Planning stage

[0032] First, the central server and multiple edge nodes perform information perception, including estimated waiting time, network speed, available nodes, forecast files, etc., and analyze this information to determine the edge nodes participating in collaborative computing.
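As an illustrative sketch of the planning stage's layer ordering (the patent sorts DNN layers by "delay improvement" to decide the upload sequence), the following Python snippet ranks layers by the estimated time saved when offloading. All layer names, numbers, and the latency model are assumptions for illustration, not values from the patent:

```python
# Illustrative sketch: ordering DNN layers by "delay improvement".
# All names, numbers, and the latency model are hypothetical.

def delay_improvement(layer):
    """Estimated time saved by offloading this layer instead of running
    it locally: local compute time minus (upload time + remote compute)."""
    remote_total = layer["upload_s"] + layer["remote_compute_s"]
    return layer["local_compute_s"] - remote_total

# Hypothetical per-layer profile gathered during information perception.
layers = [
    {"name": "conv1", "local_compute_s": 0.30, "upload_s": 0.05, "remote_compute_s": 0.04},
    {"name": "conv2", "local_compute_s": 0.50, "upload_s": 0.10, "remote_compute_s": 0.06},
    {"name": "fc1",   "local_compute_s": 0.08, "upload_s": 0.02, "remote_compute_s": 0.01},
]

# Layers with the largest improvement are uploaded first.
upload_order = sorted(layers, key=delay_improvement, reverse=True)
print([l["name"] for l in upload_order])  # → ['conv2', 'conv1', 'fc1']
```

Under this toy profile, `conv2` saves the most time per upload (0.34 s), so it is scheduled first; a real implementation would derive these estimates from the perceived waiting times and network speeds.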



Abstract

An edge-intelligence-oriented multi-edge node incremental computing offloading method. In the planning stage, the central server and the edge nodes perform information perception and, according to this information, determine the edge nodes participating in collaborative computing and construct a multi-edge-node DNN collaborative execution graph. The DNN layers that need to be uploaded to other nodes first are identified and their target nodes recorded; the remaining DNN layers are then sorted by delay improvement to determine the subsequent upload order. In the execution stage, the DNN model is uploaded and run according to the execution graph generated in the planning stage. If a node detects an abnormality through the invalid-lock-based cooperative conflict detection mechanism, that node's current DNN layer upload request is forcibly terminated by the invalid lock; if no conflict is detected, uploading continues until the execution result of the DNN model is obtained. By cooperating with multiple target nodes to execute DNN computations, the present invention effectively avoids the situation in which a single edge server is easily affected by network fluctuations, and achieves higher execution efficiency than methods based only on an edge server.
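The execution stage's invalid-lock conflict handling can be sketched as follows. This is a minimal illustration of the idea (an upload loop that aborts as soon as a shared lock is invalidated); the class and function names are assumptions, not the patent's API, and a real system would invalidate the lock from a separate conflict-detection component:

```python
# Illustrative sketch of an "invalid lock" style conflict check:
# uploads proceed layer by layer, and an in-flight upload is aborted
# as soon as another party invalidates the shared lock.
# Names are hypothetical, not the patent's API.

import threading

class InvalidLock:
    """A thread-safe flag that, once invalidated, stops further uploads."""
    def __init__(self):
        self._valid = True
        self._mu = threading.Lock()

    def invalidate(self):
        with self._mu:
            self._valid = False

    def is_valid(self):
        with self._mu:
            return self._valid

def upload_layers(layers, lock, send):
    """Upload layers in order; terminate early if the lock is invalidated."""
    uploaded = []
    for layer in layers:
        if not lock.is_valid():        # conflict detected -> force-terminate
            return uploaded, False
        send(layer)
        uploaded.append(layer)
    return uploaded, True

lock = InvalidLock()
sent = []

def send(layer):
    sent.append(layer)
    if layer == "L2":       # simulate a conflict detected mid-upload
        lock.invalidate()

done, ok = upload_layers(["L1", "L2", "L3"], lock, send)
print(done, ok)  # → ['L1', 'L2'] False  (L3 is never uploaded)
```

If no conflict occurs, `ok` is `True` and all layers reach the target node, matching the abstract's "continue until the execution result of the DNN model is obtained" path.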

Description

Technical field

[0001] The invention belongs to the technical field of edge computing task offloading, relates to the computing offloading of DNN tasks among multiple edge nodes, and is an edge-intelligence-oriented multi-edge node incremental computing offloading method.

Background technique

[0002] Since DNN applications are very computationally intensive, it is difficult to run them alone on a resource-constrained edge node. In recent years, studies have proposed using edge nodes and edge servers to collaboratively execute DNNs on demand. A popular approach is to upload the DNN model of the edge node to the edge server on demand for DNN computation. However, these practices are based on edge cloud servers with pre-installed DNN models, which does not meet the vision of uploading DNN models on demand. Second, relying only on a single remote edge server is extremely susceptible to network fluctuations and other factors, causing congestion and resulting in increased ti...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04L41/042; H04L41/0823; H04L41/14; H04L67/025; H04L67/1004; H04L67/30
CPC: H04L41/042; H04L41/0823; H04L41/145; H04L67/025; H04L67/1004; H04L67/30
Inventor: 庄旭菲, 陈忠民, 许志伟, 张润秀
Owner: INNER MONGOLIA UNIV OF TECH