Feedback driving and adjusting system for efficient parallel running

A technology for executing tasks on parallel execution platforms, applied in concurrent instruction execution, machine execution devices, program control design, etc.

Active Publication Date: 2014-01-29
SAP AG
Cites: 3, Cited by: 9

AI Technical Summary

Problems solved by technology

Thus, depending on how multiple tasks are configured for their parallel computation, different tasks may be adversely affected, to varying degrees, by the associated computational overhead.

[0007] Thus, when creating tasks and/or configuring ...




Embodiment Construction

[0032] Figure 1 is a block diagram of a system 100 for feedback-driven regulation of parallel execution. In the example of Figure 1, a parallel execution manager 102 may be configured to execute tasks 104 in parallel by utilizing a platform 106. As shown and described, the platform 106 is capable of running multiple parallel threads of execution, as illustrated in Figure 1 by processing cores 106A, 106B, . . . 106N. More specifically, as described in detail below, the parallel execution manager 102 may be configured to actively manage how, and to what extent, parallelization of the tasks 104 on the platform 106 occurs over time. Specifically, the parallel execution manager 102 may be configured to optimize parallelization in a manner that is generally agnostic to the type or nature of the platform 106 and does not require extensive knowledge of the platform 106 or of the manner and extent to which the associated parallelization parameters respond to the ...
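To make the relationship between the manager 102, the tasks 104, and the platform 106 concrete, the following is a minimal Java sketch, not the patent's implementation: a manager that submits a batch of tasks to a fixed-size thread pool standing in for the platform's processing cores and measures the elapsed time, which can serve as the feedback signal for later tuning. All class and method names here are illustrative assumptions.

import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: the platform is modeled as a fixed-size thread pool;
// the manager runs a batch of tasks on it and reports elapsed time as feedback.
public class ParallelExecutionManager {

    private final int threadCount; // how many of the platform's threads to use

    public ParallelExecutionManager(int threadCount) {
        this.threadCount = threadCount;
    }

    // Execute all tasks in parallel and return elapsed wall-clock milliseconds,
    // a measurement the tuning logic can use to adjust its parameters over time.
    public long runAndMeasureMillis(List<Runnable> tasks) throws InterruptedException {
        ExecutorService platform = Executors.newFixedThreadPool(threadCount);
        long start = System.nanoTime();
        tasks.forEach(platform::submit);
        platform.shutdown();
        platform.awaitTermination(1, TimeUnit.HOURS);
        return (System.nanoTime() - start) / 1_000_000;
    }
}

Because the sketch only observes elapsed time, it stays agnostic to the platform in the same spirit as the description: no platform-specific knowledge is needed beyond the ability to run tasks and measure the result.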



Abstract

A parallel execution manager may determine a parallel execution platform configured to execute tasks in parallel using a plurality of available processing threads. The parallel execution manager may include a thread count manager configured to select, from the plurality of available processing threads and for a fixed task size, a selected thread count, and a task size manager configured to select, from a plurality of available task sizes and using the selected thread count, a selected task size. The parallel execution manager may further include an optimizer configured to execute an iterative loop in which the selected task size is used as an updated fixed task size to obtain an updated selected thread count, and the updated selected thread count is used to obtain an updated selected task size. Accordingly, a current thread count and current task size for executing the tasks in parallel may be determined.
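As a rough illustration of the iterative loop described above, the following Java sketch alternates between the two selections: with the task size held fixed it picks the thread count with the lowest measured cost, then with that thread count held fixed it picks the task size, repeating until neither choice changes. The measure callback, the candidate ranges, and all names are assumptions for illustration, not taken from the patent.

import java.util.function.BiFunction;

// Hypothetical sketch of the alternating tuning loop: fix task size, select the
// thread count; fix that thread count, select the task size; iterate until stable.
public class FeedbackTuner {

    // measure.apply(threadCount, taskSize) returns an observed cost,
    // e.g. measured runtime in milliseconds (lower is better), supplied by the caller.
    public static int[] tune(BiFunction<Integer, Integer, Double> measure,
                             int[] threadCandidates,
                             int[] sizeCandidates,
                             int maxIterations) {
        int threads = threadCandidates[0];
        int size = sizeCandidates[0];
        for (int i = 0; i < maxIterations; i++) {
            // Step 1: hold the task size fixed and select a thread count.
            int bestThreads = threads;
            double best = Double.MAX_VALUE;
            for (int t : threadCandidates) {
                double cost = measure.apply(t, size);
                if (cost < best) { best = cost; bestThreads = t; }
            }
            // Step 2: hold the updated thread count fixed and select a task size.
            int bestSize = size;
            best = Double.MAX_VALUE;
            for (int s : sizeCandidates) {
                double cost = measure.apply(bestThreads, s);
                if (cost < best) { best = cost; bestSize = s; }
            }
            // Stop when neither parameter changed between iterations.
            if (bestThreads == threads && bestSize == size) break;
            threads = bestThreads;
            size = bestSize;
        }
        return new int[] { threads, size }; // current thread count and task size
    }
}

In this reading, the loop behaves like a coordinate-descent search over the two parallelization parameters, with each round using the other parameter's latest value as its fixed setting.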

Description

Technical Field

[0001] This description relates to parallel processing.

Background

[0002] The size of large databases and other software applications can be a limiting factor in the utility of such applications, especially when queries, calculations, operations, and other tasks are themselves long and complex. For example, a user may want to issue a complex query to obtain results from a relational database having thousands or millions of records, in which case the response time to provide the corresponding query results may be prohibitively long. Furthermore, such a situation may result in an inefficient use of available computing resources, for example, by allowing resources to be consumed excessively by one user relative to other current users.

[0003] The availability of multi-core (e.g., multi-CPU) computing systems has spurred the development of techniques for parallel execution as a way to mitigate this effect. For example, multiple tasks (and/or portions ...

Claims


Application Information

IPC(8): G06F9/38
CPC: G06F9/5027; G06F2209/5018; G06F2209/5017
Inventors: 黎文宪, 贾学锋
Owner: SAP AG