Pipe-type communication method and system for interprocess communication

A technology for inter-process communication, applied in the field of pipe-type communication methods and systems, which addresses problems such as reduced reliability and efficiency of pipe communication, the limited storage space of memory, and exhaustion of memory buffers.

Status: Inactive | Publication Date: 2011-07-13
NAT UNIV OF DEFENSE TECH

AI Technical Summary

Problems solved by technology

When the amount of data received by server 1 is large, the memory buffer of the first process of server 1 may also become fully occupied, resulting in the problems described above.
[0006] It can be seen that, because the pipe communication method in the prior art mainly uses memory space to buffer data, and the storage space of memory is limited, the reliability and efficiency of pipe communication are reduced once the memory buffer space is fully occupied.



Examples


Embodiment 1

[0148] This method is executed by any one of the multiple servers that perform pipeline-parallel processing of data. Referring to Figure 1, a flowchart of the pipe-type communication method according to Embodiment 1 of the present invention, the method includes the following steps:

[0149] S101: Receive the data processed by the process of the upper-level server;

[0150] The upper-level server is the server that performs the previous processing step among the multiple servers that carry out pipeline-parallel processing of the data. For example, if this server is responsible for processing the data in Step 3, the upper-level server is the one that processes the data in Step 2.

[0151] S102: Cache the data to the first cache pool;

[0152] The cache pool is a storage space set up on external memory. External memory, that is, external storage, refers to storage other than the computer's main memory and the CPU cache. Usually, the storage space on the external...
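
To make S101 and S102 concrete, here is a minimal sketch that models the first cache pool as a directory on external storage (disk), with one file per received chunk and FIFO read-back into a memory buffer. The class name DiskCachePool, the directory path, and the one-file-per-chunk layout are illustrative assumptions, not details taken from the patent.

```python
# A minimal sketch of S101-S102 and the read-back into the first memory buffer,
# assuming the "first cache pool" is simply a directory on external storage and
# that every received chunk is stored as one file.
import os
import uuid
from collections import deque

class DiskCachePool:
    """A FIFO cache pool backed by files on external storage (disk)."""

    def __init__(self, directory):
        self.directory = directory
        os.makedirs(directory, exist_ok=True)
        self._queue = deque()  # file names of cached chunks, oldest first

    def cache(self, data):
        """S102: cache one chunk of received data to the pool (one file per chunk)."""
        name = os.path.join(self.directory, uuid.uuid4().hex)
        with open(name, "wb") as f:
            f.write(data)
        self._queue.append(name)

    def read_next(self):
        """Read the oldest cached chunk back into memory, freeing its disk space."""
        if not self._queue:
            return None
        name = self._queue.popleft()
        with open(name, "rb") as f:
            data = f.read()
        os.remove(name)
        return data

# S101/S102: data received from the upper-level server's process is cached on disk,
# then later read back into the first memory buffer for this server's process.
pool = DiskCachePool("/tmp/first_cache_pool")  # hypothetical location
pool.cache(b"data processed by the upper-level server")
first_memory_buffer = pool.read_next()
```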

Embodiment 2

[0163] Referring to Figure 2, a flowchart of the pipe-type communication method according to Embodiment 2 of the present invention, the method includes the following steps:

[0164] S201: Receive the data processed by the process of the upper-level server;

[0165] S202: Cache the data to the first secondary cache pool;

[0166] In this embodiment, the first cache pool includes a first secondary cache pool and a first primary cache pool. The first secondary cache pool is a storage space set up on external storage and is used to cache the data received from the process of the upper-level server.

[0167] S203: Set the first secondary cache pool as the first primary cache pool;

[0168] The first primary cache pool is a storage space set up on external memory and is used to cache the data to be read into the first memory buffer. Since a given storage space on external storage can only be read or written at any one time, the setting of the fi...
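
The primary/secondary split in this embodiment can be read as double buffering on external storage: one sub-pool only receives writes while the other is only read, and the swap in S203 exchanges their roles. The sketch below reuses the DiskCachePool class sketched under Embodiment 1; the class name SplitCachePool and the swap-when-drained policy are assumptions made for illustration.

```python
# A sketch of the first secondary/primary cache pools of Embodiment 2 (S202-S203),
# reusing DiskCachePool from the Embodiment 1 sketch above. Assumes "swapping"
# simply exchanges the roles of two directory-backed pools so that one side is
# only written while the other is only read.
class SplitCachePool:
    def __init__(self, primary_dir, secondary_dir):
        # secondary pool: receives data arriving from the upper-level server
        # primary pool:   holds data waiting to be read into the memory buffer
        self.primary = DiskCachePool(primary_dir)
        self.secondary = DiskCachePool(secondary_dir)

    def cache_incoming(self, data):
        """S202: newly received data always goes to the secondary pool."""
        self.secondary.cache(data)

    def swap(self):
        """S203: set the secondary pool as the primary pool (and vice versa),
        typically once the current primary pool has been drained."""
        self.primary, self.secondary = self.secondary, self.primary

    def read_for_processing(self):
        """Data destined for the first memory buffer is read only from the primary pool."""
        return self.primary.read_next()
```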

Embodiment 3

[0177] Referring to Figure 3, a flowchart of the pipe-type communication method according to Embodiment 3 of the present invention, the method includes the following steps:

[0178] S301: Receive the data processed by the process of the upper-level server;

[0179] S302: Cache the data to the first cache pool;

[0180] S303: Read the data in the first cache pool and cache it in the first memory buffer so that the server's process can process it;

[0181] S304: Write the data processed by the process of the server into the second memory buffer;

[0182] S305: Cache the data in the second memory buffer to the second primary cache pool;

[0183] S306: Set the second primary cache pool as the second secondary cache pool;

[0184] S307: Send the data in the second secondary cache pool to the process of the next-level server for processing.

[0185] In this embodiment, the second cache pool includes a second primary cache pool and a second secondary cache pool. Both t...
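
Putting S301 through S307 together, one processing stage could look like the sketch below, which reuses the SplitCachePool from the Embodiment 2 sketch for both the first and the second cache pools and treats receiving, processing, and sending as plain callables. The helper names and the exact placement of the two swaps are illustrative assumptions, not the patent's prescribed implementation.

```python
# One pass of a pipeline stage following S301-S307, reusing SplitCachePool from
# the sketch above. receive(), process() and send() stand in for the transport
# and the per-step computation; all names here are illustrative.
def run_stage(receive, process, send, first_pool, second_pool):
    # S301/S302: receive data from the upper-level server's process and cache it
    # in the first (secondary) cache pool on external storage.
    incoming = receive()
    if incoming is not None:
        first_pool.cache_incoming(incoming)

    # Make previously received data readable by swapping the first pool's roles.
    first_pool.swap()

    # S303: read from the first cache pool into the first memory buffer.
    first_memory_buffer = first_pool.read_for_processing()
    if first_memory_buffer is None:
        return

    # S304: the server's process writes its result into the second memory buffer.
    second_memory_buffer = process(first_memory_buffer)

    # S305: cache the second memory buffer to the second primary cache pool.
    second_pool.primary.cache(second_memory_buffer)

    # S306: set the second primary cache pool as the second secondary cache pool.
    second_pool.swap()

    # S307: send the data in the second secondary cache pool to the next-level server.
    outgoing = second_pool.secondary.read_next()
    if outgoing is not None:
        send(outgoing)
```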



Abstract

The invention discloses a pipe-type communication method for inter-process communication, comprising the following steps: receiving the data processed by the process of the server at the previous level; caching the data to a first cache pool; reading the data in the first cache pool and caching it to a first memory buffer so that the server's process can process it; writing the data processed by the server's process into a second memory buffer; caching the data in the second memory buffer to a second cache pool; and sending the data in the second cache pool to the process of the server at the next level for processing. The first cache pool and the second cache pool are storage spaces arranged on external memory. The invention also discloses a pipe-type communication system for inter-process communication. With the method and system, the data exchanged via pipe-type communication during pipeline parallel processing is cached in the large-capacity storage space of external memory, thereby improving the reliability and efficiency of data transmission during pipe-type communication.

Description

Technical Field

[0001] The invention relates to the technical field of data processing, and in particular to a pipe-type communication method and system for inter-process communication.

Background

[0002] When multiple servers are used to process data collaboratively, the following method is usually adopted. Assume that the data to be processed are D1, D2, D3, D4, D5, ..., and each data item is processed in four steps. Server 1 performs Step 1 processing on data D1, and the processed result is transmitted to server 2 for Step 2 processing. In this way, when server 4 performs Step 4 processing on data D1, server 3 performs Step 3 processing on data D2, server 2 performs Step 2 processing on data D3, and server 1 performs Step 1 processing on data D4. This data processing method is called pipeline parallelism.

[0003] In the prior art, in the process of pipeline-parallel data processing, different servers usually use pipe communication to transmit data. If the process that ...
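
For intuition only, the following sketch simulates the schedule described in paragraph [0002], assuming every step takes one time unit: at tick t, server k works on data item D(t - k + 1) whenever that index exists. It is not part of the patent.

```python
# Simulate the pipeline-parallel schedule from [0002]: four servers, four steps,
# one time unit per step. At tick 4, server 4 runs step 4 on D1 while servers
# 3, 2 and 1 run their steps on D2, D3 and D4, matching the description above.
data = ["D1", "D2", "D3", "D4", "D5"]
servers = 4

for tick in range(1, len(data) + servers):
    busy = []
    for k in range(1, servers + 1):
        idx = tick - k  # 0-based index of the item server k handles at this tick
        if 0 <= idx < len(data):
            busy.append(f"server {k}: step {k} on {data[idx]}")
    print(f"t={tick}: " + "; ".join(busy))
```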


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F 9/54; G06F 15/173
Inventors: 杨树强, 滕猛, 王怀民, 吴泉源, 贾焰, 周斌, 韩伟红, 陈志坤, 赵辉, 舒琦, 金松昌, 罗荣凌, 王凯
Owner: NAT UNIV OF DEFENSE TECH