
High-performance non-blocking parallel memory management device for coordinatively executed parallel software

A memory and memory-pool technology, applied in the fields of memory systems, stored-program program control, and concurrent instruction execution, which can address problems such as implementation complexity and error-proneness.

Publication Date: 2005-09-14 (status: Inactive)
IBM CORP
Cites: 0 | Cited by: 2

AI Technical Summary

Problems solved by technology

The resulting dichotomy creates an environment that tends to impose constraints on proposed solutions and introduces complexity and error-proneness during their implementation.

Detailed Description of the Embodiments

[0040] In describing the preferred embodiment of the invention, reference will be made to the accompanying drawings 1-11, wherein like numerals indicate like features of the invention. Features of the invention are not necessarily shown to scale in the drawings.

[0041] Parallel Software Processing System

[0042] To overcome serialization limitations in accessing system services during parallel processing, the present invention tailors a programming approach, expressed in high-level language syntax, that implicitly removes these considerations from the programmer's purview, resulting in significant improvements in parallel applications. Specifically, the invention provides, in various aspects, a coordination system that naturally separates the data space of each parallel thread, a method of associating threads and data spaces, and a high-level language to describe and manage this separation. The system of the present invention, containing the structure described further below, may be ...
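As a concrete illustration of the thread/data-space association described above (this sketch is not taken from the patent text; the names per_thread_pool, pool_attach and pool_alloc are hypothetical), each thread below obtains its own pool once from system memory and then serves routine allocations from that private, lock-free pool:

/* Minimal sketch: one private memory pool per thread, reached through
 * C11 thread-local storage so routine allocations need no locking.
 * Compile with: cc -std=c11 -pthread sketch.c                         */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>

#define POOL_BYTES (64 * 1024)

struct per_thread_pool {
    char  *base;   /* block obtained once from system memory               */
    size_t used;   /* bump-pointer offset; only the owning thread moves it */
};

static _Thread_local struct per_thread_pool my_pool;  /* one pool per thread */

static void pool_attach(void)
{
    my_pool.base = malloc(POOL_BYTES);   /* single request to system memory */
    my_pool.used = 0;
}

static void *pool_alloc(size_t n)
{
    if (my_pool.base == NULL || my_pool.used + n > POOL_BYTES)
        return NULL;                     /* refill policy omitted in this sketch */
    void *p = my_pool.base + my_pool.used;
    my_pool.used += n;                   /* no lock: pool is thread-private */
    return p;
}

static void *worker(void *arg)
{
    pool_attach();
    int *v = pool_alloc(sizeof *v);      /* allocation stays inside this thread's pool */
    *v = (int)(long)arg;
    printf("thread %d allocated from its own pool\n", *v);
    free(my_pool.base);
    return NULL;
}

int main(void)
{
    pthread_t t[2];
    for (long i = 0; i < 2; i++)
        pthread_create(&t[i], NULL, worker, (void *)i);
    for (int i = 0; i < 2; i++)
        pthread_join(t[i], NULL);
    return 0;
}

The sketch deliberately leaves out the path back to system memory when a pool runs dry, which is where the cross-thread pooling described in the abstract comes in.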

Abstract

A method for allocating memory in a parallel processing computing system in which there is first provided a system memory available for parallel processing and first and second threads. The method includes using the first thread to request memory from the system memory; allocating to the first thread a first pool of memory and associating the memory pool with the second thread; using the second thread to request memory from the system memory; allocating to the second thread a second pool of memory and associating the memory pool with the first thread; using the first thread to request further memory from the second thread; and allocating to the first thread a portion of the second pool of memory from the second thread without making a request to the system memory.
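Read procedurally, the abstract can be sketched in C as follows. This is only an illustration of the allocation flow under assumed names (mem_pool, pool_from_system and grant_from_pool are invented for the example); for brevity it runs single-threaded and elides the synchronization a real implementation would need between the two threads. The point of the final step is that the first thread's further request is satisfied out of the second thread's already-allocated pool, so no additional request goes to system memory:

/* Illustrative walk-through of the steps in the abstract (hypothetical names). */
#include <stdio.h>
#include <stdlib.h>

struct mem_pool {
    char  *base;   /* memory obtained from system memory */
    size_t size;
    size_t used;
};

/* A thread requests memory from system memory and receives a pool of its own. */
static struct mem_pool pool_from_system(size_t size)
{
    struct mem_pool p = { malloc(size), size, 0 };
    return p;
}

/* A portion of another thread's pool is handed over directly,
 * with no further request to system memory.                    */
static void *grant_from_pool(struct mem_pool *donor, size_t n)
{
    if (donor->used + n > donor->size)
        return NULL;
    void *portion = donor->base + donor->used;
    donor->used += n;
    return portion;
}

int main(void)
{
    struct mem_pool pool1 = pool_from_system(4096);  /* first thread's pool  */
    struct mem_pool pool2 = pool_from_system(4096);  /* second thread's pool */

    /* The first thread needs more memory and is served from the second pool. */
    void *extra = grant_from_pool(&pool2, 256);
    printf("granted %s from the second pool\n", extra ? "a portion" : "nothing");

    free(pool1.base);
    free(pool2.base);
    return 0;
}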

Description

[0001] This application is related to the U.S. Patent Application entitled "Parallel Software Processing System" (Attorney Docket No. FIS990317US) and the U.S. Patent Application entitled "Method of Using Distinct Streams of Computational Control as Reusable Data Objects" (Attorney Docket No. FIS990319US), both of which are filed by the inventors of the present application on the same date as the present application.

Technical Field

[0002] This invention relates to computer processing, and more particularly to parallel computer programming and processing.

Background Art

[0003] In prior-art computing utilizing discrete, non-parallel processing, programs often share data and other components. An example of this is shown in Figure 1, where separate process memories 19a, 19b, which may be physically separated in different memory stores or logically separated within the same memory store, contain data items such as global variables visible to the entire process...

Application Information

IPC(8): G06F12/02; G06F9/06; G06F9/38; G06F9/45; G06F9/46; G06F9/50; G06F12/00
CPC: G06F9/5016; G06F9/38
Inventors: Harry J. Beatty III; Peter C. Elmendorf
Owner: IBM CORP