DNN task offloading method and terminal in an edge-cloud hybrid computing environment

A hybrid computing and task technology, applied in computing, energy-saving computing, neural learning methods, and similar fields, which addresses problems such as long response times, high delay, and network congestion, with the effects of accurate cost estimation, guaranteed feasibility, and reduced cost.

Active Publication Date: 2020-07-10
FUJIAN NORMAL UNIV

AI Technical Summary

Problems solved by technology

[0002] In recent years, the number of intelligent applications has grown rapidly. Among them, DNNs (Deep Neural Networks) have achieved great success in fields such as computer vision, speech recognition, and natural language processing. However, because DNN models are very large and the resources of mobile devices are limited, large-scale DNN applications are often deployed on remote cloud servers. Because of the long distance between the cloud server and the mobile device, scheduling a large number of DNN applications onto the remote cloud server causes problems such as long response times and serious network congestion. Moreover, the security of user data is difficult to guarantee during long-distance transmission, which inevitably leads to leakage of user privacy.
[0003] With the emergence of edge computing, migrating DNNs to edge nodes near the mobile device can greatly reduce delay. Compared with the mobile device, an edge node has stronger computing power and an advantage in computing resources, so it can improve the execution performance of DNN applications, reduce the overhead of the cloud server, and better protect user privacy. However, in actual operation, a DNN has a complex hierarchical structure: the amount of data transmitted between layers and the computational complexity of different layers vary greatly, and the edge network has a complex topology. This makes the deployment of DNN tasks very difficult and easily leads to high system costs for edge computing.



Examples


Embodiment 1

[0105] Referring to Figure 1, Embodiment 1 of the present invention is as follows:

[0106] A DNN task offloading method in an edge-cloud hybrid computing environment, specifically comprising:

[0107] S1. According to the type and number of computing nodes in the edge-cloud hybrid computing environment, the number of DNN tasks to be offloaded, and the number of layers of each DNN task to be offloaded, establish an objective function based on the principle of total cost minimization and determine the corresponding constraints;
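For orientation only, and using the notation introduced in paragraph [0109] below, an objective of this kind can be sketched as follows; the assignment variables x_{i,j}, the cost term C, and the finish-time expression are illustrative placeholders, not the patent's own formulas:

```latex
% Illustrative formulation only; x_{i,j} denotes the computing node assigned to
% layer t_{i,j}, C(t_{i,j}, x_{i,j}) its (placeholder) execution-plus-transfer
% cost, dl_i the deadline of task t_i, and \mathcal{N} the set of edge and
% cloud computing nodes.
\min_{x} \sum_{i=1}^{n} \sum_{j} C\bigl(t_{i,j},\, x_{i,j}\bigr)
\quad \text{s.t.} \quad \mathrm{finish}(t_i, x) \le dl_i, \qquad x_{i,j} \in \mathcal{N}
```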

[0108] First, construct the DNN task offloading system model in the edge-cloud hybrid environment:

[0109] T = {t_1, t_2, ..., t_n} denotes the set of all tasks. t_i = {t_{i,1}, t_{i,2}, ..., t_{i,n}} denotes a specific DNN task, g_i denotes the node that generates task t_i, a_i denotes the generation time of task t_i, and dl_i denotes the deadline of t_i. t_{i,j} denotes the j-th layer of DNN task t_i. For each layer t_{i,j} of DNN task t_i, da...
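To make the notation concrete, the following is a minimal Python sketch of the model, under stated assumptions: the Layer/Task/Node fields and the cost terms (per-unit price, node speed, link bandwidth) are illustrative stand-ins for the quantities the patent defines, not its exact cost model.

```python
from dataclasses import dataclass, field

# Minimal sketch of the task/node model described in [0109]. Field names and the
# cost terms below are illustrative assumptions, not the patent's exact formulas.

@dataclass
class Layer:
    workload: float      # computation demand of layer t_{i,j}
    data_in: float       # data transferred into this layer from the previous one

@dataclass
class Task:
    tid: int
    source_node: int     # g_i: node that generated the task
    arrival_time: float  # a_i: generation time
    deadline: float      # dl_i
    layers: list = field(default_factory=list)

@dataclass
class Node:
    nid: int
    speed: float           # processing capability of the node
    price_per_unit: float  # monetary cost per unit of workload

def total_cost(assignment, tasks, nodes, bandwidth):
    """assignment[(task id, layer index)] -> node id chosen for that layer.
    nodes maps node id -> Node; bandwidth maps (node id, node id) -> link rate.
    Returns the total monetary cost, or infinity if any deadline is missed."""
    cost = 0.0
    for task in tasks:
        finish, prev = task.arrival_time, task.source_node
        for j, layer in enumerate(task.layers):
            node = nodes[assignment[(task.tid, j)]]
            if node.nid != prev:                       # data transfer between nodes
                finish += layer.data_in / bandwidth[(prev, node.nid)]
            finish += layer.workload / node.speed      # execution on the chosen node
            cost += node.price_per_unit * layer.workload
            prev = node.nid
        if finish > task.deadline:                     # deadline constraint violated
            return float("inf")
    return cost
```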

Embodiment 2

[0131] A DNN task offloading method in an edge-cloud hybrid computing environment, which differs from Embodiment 1 in that:

[0132] The calculation of pBest in step S4 is specifically as follows:

[0133] Calculate the optimal solution pBest_i^t of the i-th particle after t iterations;

[0134] According to the preset fitness function, calculate the fitness values of the i-th particle in the t-th iteration and in the (t-1)-th iteration, obtaining the fitness value of pBest_i^t and the fitness value of pBest_i^{t-1}, respectively;

[0135] Compare the fitness value of pBest_i^t with the fitness value of pBest_i^{t-1}; if the current fitness value is less than the fitness value of pBest_i^{t-1}, then pBest_i^t = pBest_i^{t-1};

[0136] Otherwise, keep pBest_i^t unchanged;

[0137] Wherein, the preset fitness function is:

[0138] If both the particle of the t-th iteration and the particle of the (t-1)-th iteration can offload all the DNN tasks t...
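As a hedged sketch of the pBest bookkeeping in step S4 (the embodiment's exact comparison rule and deadline-aware fitness are given in paragraphs [0133]-[0138]), the update can be written as below, assuming a plain cost-minimizing fitness in which an infeasible particle evaluates to infinite cost:

```python
# Sketch of one personal-best update, assuming a cost-minimizing fitness in which
# a particle that cannot offload all DNN tasks before their deadlines evaluates to
# float("inf"). This fitness is an assumption standing in for the patent's own.

def update_pbest(position, pbest_position, pbest_value, fitness):
    """Return the personal best (position, value) after evaluating this iteration."""
    value = fitness(position)
    if value < pbest_value:
        # the particle found a cheaper feasible offloading plan: adopt it as pBest
        return list(position), value
    # otherwise the previous personal best is kept unchanged
    return pbest_position, pbest_value
```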

Embodiment 3

[0154] Referring to Figure 2, Embodiment 3 of the present invention is as follows:

[0155] A DNN task offloading terminal 1 in an edge-cloud hybrid computing environment, comprising a memory 2, a processor 3, and a computer program stored in the memory 2 and executable on the processor 3; when the processor 3 executes the computer program, the steps of Embodiment 1 or Embodiment 2 are implemented.

[0156] To sum up, the DNN task offloading method and terminal in an edge-cloud hybrid environment provided by the present invention establish, according to the type and number of edge and cloud computing nodes, the number of DNN tasks to be offloaded, and the number of layers of each task, a model that maps the DNN tasks to be offloaded onto the computing nodes, and take the minimum total cost as the objective function. This objective is easy to quantify and compare, meets practical expectations, and ensures the lowest cost when offloading DNN tasks...



Abstract

The invention provides a DNN task offloading method and terminal in an edge-cloud hybrid computing environment. According to the types and number of computing nodes, the number of DNN tasks to be offloaded, and the number of layers of each DNN task to be offloaded, an objective function based on total cost minimization is established and the corresponding constraints are determined. The influences of conditions such as the computing power of different types of nodes and time-delay constraints are considered, which guarantees the feasibility of the obtained optimal solution. When solving for the optimal solution, the crossover and mutation operations of the genetic algorithm are introduced into the particle swarm algorithm and a specific algorithm is given, which effectively mitigates the tendency of the particle swarm algorithm to fall into a local optimum during the solution process.
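The abstract's key idea, introducing genetic crossover and mutation into the particle swarm search to avoid local optima, can be sketched as follows. The discrete encoding (one node index per DNN layer), the operators, and all parameters are illustrative assumptions rather than the patent's specific algorithm:

```python
import random

def crossover(particle, guide, rate=0.5):
    # copy node choices from a guide particle (pBest or gBest) with probability `rate`
    return [g if random.random() < rate else p for p, g in zip(particle, guide)]

def mutate(particle, num_nodes, rate=0.05):
    # reassign a layer to a random node with a small probability, to escape local optima
    return [random.randrange(num_nodes) if random.random() < rate else p
            for p in particle]

def hybrid_pso(fitness, num_particles, num_layers, num_nodes, iters=100):
    # each particle assigns one node index to every DNN layer to be offloaded
    swarm = [[random.randrange(num_nodes) for _ in range(num_layers)]
             for _ in range(num_particles)]
    pbest = [(list(p), fitness(p)) for p in swarm]
    gbest = min(pbest, key=lambda pf: pf[1])
    for _ in range(iters):
        for i, p in enumerate(swarm):
            p = crossover(p, pbest[i][0])   # pull toward the personal best
            p = crossover(p, gbest[0])      # pull toward the global best
            p = mutate(p, num_nodes)        # random exploration
            value = fitness(p)
            swarm[i] = p
            if value < pbest[i][1]:
                pbest[i] = (list(p), value)
                if value < gbest[1]:
                    gbest = (list(p), value)
    return gbest  # (best assignment found, its total cost)
```

Because the encoding here is discrete, crossover with pBest and gBest stands in for the usual velocity update; this is one common way to adapt particle swarm optimization to assignment problems and is used only to illustrate the hybrid search described in the abstract.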

Description

technical field [0001] The invention relates to the field of task offloading, and in particular to a DNN task offloading method and terminal in an edge-cloud hybrid computing environment.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/445, G06N3/04, G06N3/08, G06N3/12
CPC: G06F9/44594, G06N3/08, G06N3/126, G06N3/045, Y02D10/00
Inventors: 林兵, 黄引豪, 陈星, 蔡飞雄
Owner: FUJIAN NORMAL UNIV