
Method and system for optimizing parallel I/O (input/output) by reducing inter-process communication overhead

A technology concerning inter-process communication overhead, applied in areas such as inter-program communication, concurrent instruction execution, and multi-programming arrangements, achieving the effects of reduced communication overhead and a marked performance improvement.

Active Publication Date: 2015-07-15
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

Under this scheme, the data requests falling in the file domain of a proxy (agent) process running on a given machine node may come from processes on other machine nodes, so when the proxy process exchanges data with those processes, cross-node data communication overhead is incurred.
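To make that cost concrete, the toy sketch below models two-phase collective I/O in Python. The node names, rank placement, aggregator choice, and request sizes are illustrative assumptions, not the patent's code; it simply counts how many requested bytes must travel between machine nodes when a process and the aggregator owning its file domain sit on different nodes.

```python
# Illustrative two-phase collective I/O model: contiguous file domains are
# assigned to aggregator (proxy) processes; data whose owning aggregator sits
# on another machine node must cross the network. All values are assumptions.
NODES = {"node0": [0, 2], "node1": [1, 3]}            # node -> MPI ranks
RANK_TO_NODE = {r: n for n, ranks in NODES.items() for r in ranks}

FILE_SIZE = 4096
AGGREGATORS = [0, 1]                                   # ranks acting as proxies
DOMAIN = FILE_SIZE // len(AGGREGATORS)                 # contiguous file domains

def owning_aggregator(offset):
    """Rank of the aggregator whose file domain contains this byte offset."""
    return AGGREGATORS[min(offset // DOMAIN, len(AGGREGATORS) - 1)]

# Each rank requests one contiguous 1 KiB block of the shared file.
requests = {r: (r * 1024, 1024) for r in range(4)}     # rank -> (offset, length)

cross_node_bytes = 0
for rank, (off, length) in requests.items():
    agg = owning_aggregator(off)                       # block fits in one domain here
    if RANK_TO_NODE[rank] != RANK_TO_NODE[agg]:
        cross_node_bytes += length

print("bytes exchanged across machine nodes:", cross_node_bytes)  # 2048 here
```

With this placement, half of the requested data crosses the interconnect even though every byte could in principle be served node-locally; that avoidable exchange is the overhead the invention targets.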




Detailed Description of the Embodiments

[0041] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below may be combined with one another as long as they do not conflict.

[0042] The overall idea of the present invention is to provide a matching strategy between agent processes and machine nodes based on the program operating parameters supplied by the user in advance and the configuration information of the current cluster environment. Through this strategy, the deployment of the agent processes on the machine nodes is configured so that as many processes as possible ...
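A minimal pre-processing sketch of that idea follows. The node list, rank placement, request pattern, cost model, and machine-file layout are all simplified assumptions rather than the patent's actual algorithm: it enumerates candidate matchings of agent processes to machine nodes, predicts the cross-node exchange volume of each, keeps the cheapest, and records it in a configuration file.

```python
# Sketch of the pre-processing/matching step under simplified assumptions:
# compute ranks already have a known placement and request pattern, each
# agent (aggregator) process owns one contiguous file domain, and the task
# is to pick the machine node each agent runs on so that the predicted
# cross-node exchange volume is minimal. None of this is the patent's code.
from itertools import product

NODE_OF_RANK = {0: "node0", 1: "node0", 2: "node1", 3: "node1"}   # cluster info
requests = {r: (r * 1024, 1024) for r in NODE_OF_RANK}            # rank -> (offset, length)

FILE_SIZE = 4096
N_AGENTS = 2
DOMAIN = FILE_SIZE // N_AGENTS                  # agent i owns [i*DOMAIN, (i+1)*DOMAIN)

def predicted_cross_node_bytes(agent_node):
    """Performance-prediction stand-in: bytes that must cross machine nodes."""
    total = 0
    for rank, (off, length) in requests.items():
        agent = min(off // DOMAIN, N_AGENTS - 1)   # agent owning this request
        if NODE_OF_RANK[rank] != agent_node[agent]:
            total += length
    return total

# Enumerate every candidate matching of agent processes to machine nodes
# and keep the one with the lowest predicted cost.
candidates = product(sorted(set(NODE_OF_RANK.values())), repeat=N_AGENTS)
best = min(candidates, key=predicted_cross_node_bytes)

print("chosen matching:", dict(enumerate(best)))
print("predicted cross-node bytes:", predicted_cross_node_bytes(best))  # 0 here

# Persist the matching as a simplified machine file: one hostname per agent
# process, in agent order (the real machine-file format is assumed, not shown).
with open("machinefile", "w") as f:
    f.write("\n".join(best) + "\n")
```

In the actual system the chosen strategy is written into the MPI machine file and the unmodified program is launched with it, for example through an mpiexec-style machinefile option (the exact invocation is assumed here).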



Abstract

The invention discloses a method for improving parallel I/O (input/output) efficiency by reducing inter-process communication based on process affinity. The method comprises the following steps: first, analyzing a parallel I/O program that mainly uses Collective I/O, and collecting and computing the node information of the cluster machines and the configuration information of the MPI (Message Passing Interface) program; next, having the system compute the possible matchings between machine nodes and agent processes through pre-processing, and determining the optimal matching strategy through a performance-prediction module; finally, writing the pre-processed matching strategy into a configuration file, namely a machine file. As the experimental results indicate, the configuration is simple, and a simple and fast pre-processing step determines the optimal process distribution scheme for running the program without modifying the original program code, so the inter-process communication overhead is reduced and the parallel I/O performance is improved. The method addresses the low parallel I/O efficiency in the existing high-performance computing field and, without affecting the rest of the existing system's deployment, markedly improves performance by reducing the inter-process data communication overhead.

Description

Technical Field

[0001] The invention belongs to the field of I/O subsystems in high-performance computing, and more specifically relates to a method and system for improving parallel I/O efficiency based on reducing inter-process communication overhead.

Background Technique

[0002] In recent years, big data processing has continued to heat up in the field of scientific computing. On the one hand, this reflects the growing demand for big data processing in scientific computing; at the same time, it shows that in the traditional high-performance computing field, many scientific computing applications are shifting from computation-intensive to data-intensive, and high-performance computing faces new challenges brought by big data.

[0003] The parallel I/O interface of the Message-Passing Interface standard (MPI-IO) defines an aggregated I/O (Collective I/O) interface, which provides scientific computing in the era of bi...
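For orientation, the fragment below shows what a collective (aggregated) write looks like at the application level. It uses mpi4py purely for brevity; the patent targets MPI programs in general, and the package, file name, and sizes are assumptions for the example, not part of the invention.

```python
# Minimal collective I/O example: every rank writes its contiguous slice of a
# shared file through the aggregated MPI-IO interface (Write_at_all), letting
# the MPI library's aggregator/proxy processes perform the actual file access.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

n_local = 1024                                   # elements owned by this rank
buf = np.full(n_local, rank, dtype=np.int32)     # this rank's portion of the data

amode = MPI.MODE_WRONLY | MPI.MODE_CREATE
fh = MPI.File.Open(comm, "shared_output.dat", amode)

offset = rank * buf.nbytes                       # byte offset of this rank's slice
fh.Write_at_all(offset, buf)                     # collective: all ranks participate
fh.Close()
```

How much of the data exchange behind such a call crosses machine nodes depends on where the aggregator processes run relative to the processes owning the data, and that placement is what the invention's pre-processing step optimizes through the machine file.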

Claims


Application Information

IPC(8): G06F9/54, G06F9/38
Inventors: 石宣化, 金海, 王志翔, 黎明
Owner: HUAZHONG UNIV OF SCI & TECH