Hadoop load balance task scheduling method based on hybrid metaheuristic algorithm

A metaheuristic algorithm and task scheduling technology, applied in computing, resource allocation, and program control design. It addresses problems such as heuristic algorithms becoming trapped in local optima, unstable performance, and cumbersome solution processes, with effects including overcoming cluster load imbalance, increasing revenue, and optimizing search capability.

Active Publication Date: 2018-06-15
BEIJING UNIV OF TECH


Problems solved by technology

However, because the number of nodes in a cluster is often very large, solving the scheduling problem with a heuristic algorithm is cumbersome and incurs considerable extra overhead. In the traditional Hadoop task scheduling mode, heuristic task scheduling places a heavy burden on the Master node (the node responsible for job scheduling and distribution in Hadoop), which affects the stability of the cluster. At the same time, heuristic algorithms such as particle swarm optimization, ant colony optimization, and simulated annealing are prone to falling into local optima, and their performance during scheduling is not stable.

Method used




Detailed Description of the Embodiments

[0034] In order to illustrate the present invention more clearly, the present invention will be further described below in conjunction with preferred embodiments and accompanying drawings. Similar parts in the figures are denoted by the same reference numerals. Those skilled in the art should understand that the content specifically described below is illustrative rather than restrictive, and should not limit the protection scope of the present invention.

[0035] As shown in Figure 1 and Figure 2, the Hadoop load balancing task scheduling method based on a hybrid metaheuristic algorithm disclosed by the present invention comprises the following steps:

[0036] S1. A resource slot pressure model is established according to the resource slot principle, so as to balance the computing pressure of task processing across the task processing nodes.

[0037] The main goal of the above resource slot pressure model is to make the computing pressure of the tasks to be ex...
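The full definition of the pressure model is truncated in this excerpt; purely as a rough illustration, the following is a minimal sketch, assuming (as the abstract suggests) that each Slave node's pressure is the computing demand of its assigned tasks relative to its resource slots, and that the scheduling objective is to keep these pressures on the same level, for example by minimizing their variance. The function names and the variance-based objective are assumptions for illustration, not taken from the patent text.

```python
# Minimal sketch of a resource-slot pressure model (assumed form, not the
# patent's exact formulation): each Slave node's pressure is the total
# computing demand of the tasks assigned to it divided by its resource slots,
# and the scheduling objective rewards keeping all node pressures level.
from statistics import pvariance

def node_pressure(task_demands, assignment, node, slots):
    """Computing pressure of one node = assigned demand / available slots."""
    demand = sum(d for d, n in zip(task_demands, assignment) if n == node)
    return demand / slots[node]

def load_balance_cost(task_demands, assignment, slots):
    """Variance of node pressures; lower means a more balanced cluster."""
    pressures = [node_pressure(task_demands, assignment, n, slots)
                 for n in range(len(slots))]
    return pvariance(pressures)

# Example: 6 tasks with given demands, 3 Slave nodes with 2 slots each.
demands = [4.0, 2.0, 3.0, 1.0, 5.0, 2.0]
slots = [2, 2, 2]
assignment = [0, 1, 2, 0, 1, 2]          # task i -> node assignment[i]
print(load_balance_cost(demands, assignment, slots))
```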



Abstract

The invention relates to a Hadoop load-balancing task scheduling method based on a hybrid metaheuristic algorithm. A resource slot pressure model is established, which aims to bring the computing pressure of all Slave nodes processing tasks in the cluster to the same level. An optimal task scheduling scheme is then solved with a hybrid metaheuristic algorithm combining simulated annealing and particle swarm optimization, thereby achieving load-balanced task scheduling in a Hadoop cluster environment. Furthermore, the algorithm is parallelized with MPICH (MPI over CHameleon), a high-performance and highly portable MPI implementation: the computation of the heuristic optimization algorithm is offloaded to additional compute nodes, and multiple swarms are solved simultaneously, which reduces the computing pressure on the Master node and increases the number of optimal scheduling solutions obtainable per unit time. The invention allows the computing resources of a Hadoop cluster to be allocated globally, so that the load is balanced across the cluster nodes, waste of node computing resources is avoided, and the return on the data center's equipment investment is maximized.
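The details of the hybrid algorithm are not shown in this excerpt; purely as an illustrative sketch, one common way to hybridize particle swarm optimization with simulated annealing for a discrete assignment problem is to let PSO-style updates propose new assignments and a Metropolis (simulated-annealing) criterion accept or reject worse proposals, which helps avoid premature convergence to local optima. Everything below (the encoding of an assignment as an integer vector, the update rule, the cooling schedule, and the cost function name) is an assumption for illustration, not the patent's method; distributing several swarms across extra nodes, as the abstract describes, could be layered on top with an MPI library such as mpi4py.

```python
# Illustrative sketch (assumed, not the patent's exact algorithm): a discrete
# PSO in which worse moves are accepted with a simulated-annealing
# (Metropolis) probability, applied to the task -> node assignment problem.
import math
import random

def hybrid_sa_pso(cost, n_tasks, n_nodes, n_particles=20, iters=200,
                  t0=1.0, cooling=0.95):
    # Each particle is a candidate assignment: task i -> node particle[i].
    particles = [[random.randrange(n_nodes) for _ in range(n_tasks)]
                 for _ in range(n_particles)]
    pbest = [p[:] for p in particles]
    gbest = min(particles, key=cost)[:]
    temp = t0
    for _ in range(iters):
        for k, p in enumerate(particles):
            # PSO-like move: each task copies its node from the current
            # position, the personal best, the global best, or a random node.
            candidate = [random.choice((p[i], pbest[k][i], gbest[i],
                                        random.randrange(n_nodes)))
                         for i in range(n_tasks)]
            delta = cost(candidate) - cost(p)
            # SA acceptance: always take improvements, sometimes accept
            # worse candidates to escape local optima.
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                particles[k] = candidate
                if cost(candidate) < cost(pbest[k]):
                    pbest[k] = candidate[:]
                if cost(candidate) < cost(gbest):
                    gbest = candidate[:]
        temp *= cooling  # cooling schedule
    return gbest

# Usage with the load_balance_cost sketch above (hypothetical data):
# best = hybrid_sa_pso(lambda a: load_balance_cost(demands, a, slots),
#                      n_tasks=len(demands), n_nodes=len(slots))
```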

Description

Technical field

[0001] The invention relates to the field of task scheduling under the Hadoop MapReduce framework. More specifically, it relates to a Hadoop task scheduling algorithm aimed at cluster load balancing, which uses a hybrid metaheuristic algorithm based on simulated annealing and particle swarm optimization together with the MPICH parallel programming method.

Background technique

[0002] With the rapid development of mobile smart devices, the information age has advanced ever more quickly, and as users make use of the network, large amounts of data are generated actively or passively. Traditional statistical or computational methods are usually unable to extract value from these data, but once the potential value behind them can be tapped, it can bring huge benefits to enterprises and governments: analyzing users' product preferences and needs, and at the same time pushing the products on the ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50
CPC: G06F9/5088
Inventors: 毕敬, 程煜东, 乔俊飞
Owner: BEIJING UNIV OF TECH