
Batch streaming computing system performance guarantee method based on queue modeling

A computing-system and batch-streaming technology, applied in the field of distributed computing, which addresses the problems of being unable to evaluate performance in a timely manner, reduced efficiency of performance guarantees, and failure to comprehensively consider performance dependencies among components.

Publication Date: 2017-08-22 (Inactive)
BEIJING UNIV OF TECH


Problems solved by technology

This makes it difficult to adapt to the rapidly changing load of a streaming system, so performance guarantees lag behind the load and system performance cannot be evaluated in a timely manner when the load intensity changes.
[0005] (2) The performance dependencies among the components of each stage are not comprehensively considered, so the performance bottleneck cannot be located accurately.
When system performance fails to meet the expected goal, existing techniques do not account for the complex performance dependencies among components; they simply select a key component and apply an optimization scheme to it. As a result, the performance bottleneck cannot be located accurately, the effect of the applied optimization cannot be evaluated, and the efficiency of performance assurance is reduced.
[0006] In general, there is currently no performance assurance method for batch streaming computing systems that is based on accurate performance evaluation and bottleneck location.




Detailed Description of the Embodiments

[0081] The present invention will be described below in conjunction with the accompanying drawings and specific embodiments.

[0082] The present invention describes the specific implementation of the proposed performance guarantee method in combination with Spark Streaming, a widely used batch streaming computing system. Figure 1 is a deployment diagram of the batch streaming computing platform on which this method runs. The platform is composed of multiple computer servers (platform nodes) connected through a network. Platform nodes are divided into two categories: one management node (Master) and multiple computing nodes (Slave). The platform on which the present invention runs includes the following core software modules: a resource management module, a node management module, an application management module, a data receiving module, a data management module and a data calculation module. Among them, the resource manage...
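For context, the following is a minimal PySpark Streaming sketch of the kind of application that runs on such a Master/Slave platform; the master URL, source host, port, and the 2-second batch interval are illustrative assumptions, not values taken from the patent.

# Minimal PySpark Streaming sketch (illustrative only).
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

# Connect to the (hypothetical) management node and set a 2 s micro-batch interval.
sc = SparkContext(master="spark://master-node:7077", appName="BatchStreamDemo")
ssc = StreamingContext(sc, batchDuration=2)

# Receive a text stream and count words within each micro-batch.
lines = ssc.socketTextStream("data-source-host", 9999)
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))
counts.pprint()

ssc.start()
ssc.awaitTermination()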



Abstract

The invention discloses a batch streaming computing system performance guarantee method based on queue modeling. The performance guarantee method comprises five steps of dividing a process; selecting components; carrying out performance modeling; computing delay; and positioning and optimizing a bottleneck. According to the method, for the feature that load intensities have clear fluctuation in an operation process of a batch streaming computing system, key components in the batch streaming computing system are extracted, a performance model of the system is established according to a queuing theory and mathematical analysis is carried out on the model; data processing delay of the system under different load intensities is computed through utilization of the model in the operation process of the system; and when the data processing delay cannot satisfy a data processing timeliness demand, a performance bottleneck component is positioned according to the queuing theory, and a configuration optimization suggestion is provided.
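The abstract does not disclose the exact model equations. As an illustrative aside, the sketch below uses textbook M/M/1 queueing formulas (mean sojourn time W = 1/(mu - lambda)) to show how per-component delay could be estimated and a bottleneck located from measured arrival and service rates; the component names and rates are hypothetical, not the patent's model.

# Illustrative sketch only: estimate per-component delay with M/M/1 formulas
# and flag the component with the highest utilization as the bottleneck.
def mm1_delay(arrival_rate, service_rate):
    """Mean sojourn time W = 1 / (mu - lambda) for a stable M/M/1 queue."""
    if arrival_rate >= service_rate:
        return float("inf")  # unstable queue: delay grows without bound
    return 1.0 / (service_rate - arrival_rate)

# Hypothetical key components with measured (arrival_rate, service_rate) in events/s.
components = {
    "data_receiving":   (900.0, 1200.0),
    "data_management":  (900.0, 1000.0),
    "data_computation": (900.0, 950.0),
}

delays = {name: mm1_delay(lam, mu) for name, (lam, mu) in components.items()}
total_delay = sum(delays.values())

# Bottleneck: the component with the highest utilization rho = lambda / mu.
bottleneck = max(components, key=lambda n: components[n][0] / components[n][1])

print(f"estimated end-to-end delay: {total_delay:.4f} s")
print(f"bottleneck component: {bottleneck}")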

Description

Technical Field
[0001] The invention belongs to the field of distributed computing, and in particular relates to a performance analysis and optimization method for batch streaming computing systems.
Background
[0002] Streaming data is an important type of big data, characterized by continuity, volatility, and dynamics. Big data streaming computing is an analysis and processing technology for streaming data; it takes the timeliness of data processing as its performance goal and quickly mines the value of streaming data. Batch streaming computing is an important branch of big data streaming computing. Its core technical feature is to divide the received streaming data into multiple small batches in chronological order and to process them periodically with MapReduce-like batch computing. Batch streaming computing has broad demand and application prospects in real-time data stream processing for IoT sensors and social netwo...
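To make the micro-batching idea concrete, here is a minimal Python sketch (not from the patent) that divides a timestamped event stream into fixed-length batches in chronological order; the event layout and the 2-second interval are assumptions for illustration.

# Group (timestamp, payload) events into consecutive fixed-length micro-batches.
from collections import defaultdict

def split_into_batches(events, batch_interval=2.0):
    """Return batches of payloads, one list per time window, in chronological order."""
    batches = defaultdict(list)
    for ts, payload in events:
        batch_id = int(ts // batch_interval)  # which window this event falls into
        batches[batch_id].append(payload)
    return [batches[k] for k in sorted(batches)]

events = [(0.3, "a"), (1.7, "b"), (2.1, "c"), (5.4, "d")]
print(split_into_batches(events))  # [['a', 'b'], ['c'], ['d']]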


Application Information

IPC(8): H04L12/24; H04L12/861; H04L12/26; H04L29/08
CPC: H04L41/0893; H04L41/142; H04L41/145; H04L43/08; H04L49/90; H04L67/10
Inventors: 梁毅, 侯颖, 苏超, 陈诚, 丁治明
Owner: BEIJING UNIV OF TECH