
Large-scale resource scheduling system and large-scale resource scheduling method based on deep learning neural network

A neural network and deep learning technology, applied in the field of resource scheduling, that solves problems such as the lack of a distributed parallel execution function and achieves the effects of improved stability and scalability, improved training efficiency, and guaranteed stability.

Active Publication Date: 2018-04-06
WUHAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] To solve the above technical problems, the present invention proposes a large-scale resource scheduling system and method based on a deep neural network. It uses the parallel characteristics of a deep neural network to process data sets in distributed form for model training and executes resource scheduling dynamically, effectively solving the problem that existing large-scale resource scheduling methods lack a distributed parallel execution function.
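The distributed, data-parallel processing of the training data set described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the least-squares model, the shard layout, and the learning rate are stand-ins for the unspecified deep neural network and its training configuration.

```python
import numpy as np

def worker_gradient(w, X, y):
    # Each simulated execution module computes a local gradient on its
    # own data shard (least-squares loss as a stand-in model).
    pred = X @ w
    return X.T @ (pred - y) / len(y)

def distributed_step(w, shards, lr=0.1):
    # The scheduling control module would dispatch shards to execution
    # modules in parallel; here we simulate them sequentially and
    # average their gradients into one synchronous update.
    grads = [worker_gradient(w, X, y) for X, y in shards]
    return w - lr * np.mean(grads, axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

# Partition the data set across three simulated execution modules.
shards = [(X[i::3], y[i::3]) for i in range(3)]

w = np.zeros(3)
for _ in range(200):
    w = distributed_step(w, shards)
```

Because every shard sees data drawn from the same distribution, the averaged gradient approximates the full-batch gradient, which is the property that makes this kind of synchronous data parallelism converge.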



Embodiment Construction

[0021] To help those of ordinary skill in the art understand and implement the present invention, it is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the embodiments described here serve only to illustrate and explain the present invention and are not intended to limit it.

[0022] Referring to Figures 1 and 2, the present invention provides a deep neural network large-scale resource scheduling system that includes at least one scheduling control module and at least two execution modules. The scheduling control module is the core of the entire distributed resource scheduling; its tasks are receiving user requests, allocating scheduling resources, and feeding back parallel computing status. The execution module is the running body of task computation; its tasks are receiving task requests sent by the scheduling control module, ...
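The controller/executor division of labor described in this paragraph can be sketched as a message-passing loop. All class, queue, and method names below are hypothetical; the patent discloses no source code, so this only illustrates one plausible shape of the architecture: the controller segments a task, allocates slices to executors, and gathers their feedback.

```python
import queue
import threading

class ExecutionModule(threading.Thread):
    """Runs task slices sent by the controller and reports results back.
    A hypothetical sketch, not the patent's actual executor."""
    def __init__(self, name, inbox, feedback):
        super().__init__(daemon=True)
        self.name, self.inbox, self.feedback = name, inbox, feedback

    def run(self):
        while True:
            task = self.inbox.get()
            if task is None:                # shutdown signal
                break
            result = sum(task["data"])      # stand-in for the real computation
            self.feedback.put((self.name, task["id"], result))

class SchedulingController:
    """Receives a user request, segments it, allocates slices to the
    execution modules, and collects their parallel feedback."""
    def __init__(self, n_workers=2):
        self.feedback = queue.Queue()
        self.inboxes = [queue.Queue() for _ in range(n_workers)]
        self.workers = [ExecutionModule(f"exec-{i}", q, self.feedback)
                        for i, q in enumerate(self.inboxes)]
        for w in self.workers:
            w.start()

    def submit(self, chunks):
        # Allocate task slices round-robin across execution modules.
        for i, chunk in enumerate(chunks):
            self.inboxes[i % len(self.inboxes)].put({"id": i, "data": chunk})
        # Block until every slice has reported back.
        results = [self.feedback.get() for _ in range(len(chunks))]
        return sorted(r[2] for r in results)

    def shutdown(self):
        for q in self.inboxes:
            q.put(None)
```

For example, `SchedulingController(2).submit([[1, 2], [3, 4], [5, 6]])` returns `[3, 7, 11]`: three slices computed on two executors, with results fed back through the shared feedback queue.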


Abstract

The invention claims a large-scale resource scheduling system and a large-scale resource scheduling method based on a deep learning neural network. The system comprises at least one scheduling control module and at least two execution modules. The scheduling control module is used for receiving a user request, allocating a scheduling resource, and feeding back parallel computing state; the execution modules are used for receiving a task request sent by the scheduling control module, opening up a memory space, and performing computation. The system and method provide a user task-request interface: a scheduler receives the submitted task-request information, uses the deep learning neural network to predict whether the task can satisfy the user's task-completion conditions, and accordingly determines the initialization parameters of the resource scheduling policy. The scheduler then segments the task according to the resource scheduling policy and allocates it to the execution modules for computation. While performing and tidying up the task computation, the execution modules feed back resource information to the scheduling control module so that the user task is completed uniformly.
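The abstract's admission step, predicting whether a task can meet the user's completion conditions and then fixing the initial scheduling parameters, can be sketched as follows. Everything here is an illustrative assumption: the deadline model, the assumed near-linear speedup, the 64-worker cluster limit, and the toy linear predictor standing in for the trained neural network.

```python
def plan_schedule(task_size, deadline_s, predict_runtime):
    """Decide initial scheduling parameters from a runtime predictor.

    `predict_runtime` stands in for the patent's deep learning model:
    it estimates single-worker runtime in seconds for a task size.
    """
    est = predict_runtime(task_size)
    if est <= deadline_s:
        # One worker already satisfies the user's completion condition.
        return {"accepted": True, "workers": 1, "slice_size": task_size}
    # Scale out until the estimated parallel runtime meets the deadline,
    # optimistically assuming near-linear speedup.
    workers = -(-int(est) // int(deadline_s))   # ceiling division
    return {"accepted": workers <= 64,          # assumed cluster limit
            "workers": workers,
            "slice_size": max(1, task_size // workers)}

# Toy linear predictor (0.05 s per unit) in place of a trained network.
plan = plan_schedule(task_size=1000, deadline_s=10,
                     predict_runtime=lambda n: n * 0.05)
# → {"accepted": True, "workers": 5, "slice_size": 200}
```

The point of the gate is that the predictor's estimate, not a static rule table, determines both admission and the task-segmentation parameters handed to the execution modules.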

Description

Technical field

[0001] The invention belongs to the technical field of resource scheduling, and relates to a large-scale resource scheduling system and method based on a deep learning neural network.

Background technique

[0002] With the development of Internet technology, resource scheduling technology has matured day by day. Existing resource scheduling programs generally start from resource scheduling rules based on dynamic monitoring of the resource load of a resource pool, in order to redistribute virtual machines among the physical servers in the pool. When the resources to be allocated are too large, far beyond the scope of the resource scheduling rules, this may lead to unreasonable resource scheduling or even resource scheduling failure.

[0003] Currently, roughly the following methods for large-scale resource scheduling are in use:

[0004] One is to use a clustered method for distributed training of deep neural networks and then to perform distributed res...


Application Information

IPC(8): H04L29/08, G06F9/50
CPC: G06F9/5027, G06F9/5072, H04L67/10
Inventors: 邹承明, 刘春燕
Owner: WUHAN UNIV OF TECH