Remote memory access using memory mapped addressing among multiple compute nodes

A memory-mapped addressing and compute-node technology, applied in the field of communications, that can achieve efficient inter-process communication among compute nodes.

Status: Inactive
Publication Date: 2017-12-07
Owner: CISCO TECH INC

AI Technical Summary

Problems solved by technology

Yet, sharing data across the compute nodes with more effective and efficient inter-process communication remains a challenge.

Method used



Examples


Example Embodiments

[0014] Turning to FIG. 1, FIG. 1 is a simplified block diagram illustrating a communication system 10 for facilitating remote memory access with memory mapped addressing among multiple compute nodes in accordance with one example embodiment. FIG. 1 illustrates a communication system 10 comprising a chassis 12, which includes a plurality of compute nodes 14 that communicate with network 16 through a common input/output (I/O) adapter 18. An upstream switch 20 facilitates north-south traffic between compute nodes 14 and network 16. Shared IO adapter 18 presents network and storage devices on a Peripheral Component Interconnect Express (PCIE) bus 22 to compute nodes 14. In various embodiments, each compute node appears as a PCIE device to other compute nodes in chassis 12.
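Paragraph [0014] describes each compute node being presented as a PCIe device to its peers through the shared IO adapter. As a minimal sketch only (not taken from the patent), the following C program shows how a host operating system on one compute node could map such a peer device's BAR through Linux sysfs and touch the memory behind it; the PCI address 0000:03:00.0, the BAR index, and the 1 MiB window size are illustrative assumptions.

/* Minimal sketch, not from the patent: map the BAR of a peer compute node
 * that the shared IO adapter exposes as a PCIe device. The sysfs path,
 * BAR index, and window size are illustrative assumptions. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    /* Hypothetical PCI address and BAR resource file for the peer node. */
    const char *bar_path = "/sys/bus/pci/devices/0000:03:00.0/resource2";
    size_t win_size = 1 << 20;            /* assume a 1 MiB remap window */

    int fd = open(bar_path, O_RDWR | O_SYNC);
    if (fd < 0) { perror("open BAR"); return EXIT_FAILURE; }

    /* Loads and stores through this mapping become PCIe memory transactions
     * that the IO adapter can redirect to the peer's local memory region. */
    volatile uint32_t *win = mmap(NULL, win_size, PROT_READ | PROT_WRITE,
                                  MAP_SHARED, fd, 0);
    if (win == MAP_FAILED) { perror("mmap BAR"); close(fd); return EXIT_FAILURE; }

    win[0] = 0x12345678u;                 /* write lands in the peer's memory */
    printf("read back: 0x%08x\n", win[0]);

    munmap((void *)win, win_size);
    close(fd);
    return EXIT_SUCCESS;
}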

[0015] In a general sense, compute nodes 14 include capabilities for processing, memory, network and storage resources. For example, as shown in greater detail in the figure, compute node Host1 runs (e.g., executes) an o...



Abstract

An example method for facilitating remote memory access with memory mapped addressing among multiple compute nodes is executed at an input/output (IO) adapter in communication with the compute nodes over a Peripheral Component Interconnect Express (PCIE) bus, the method including: receiving a memory request from a first compute node to permit access by a second compute node to a local memory region of the first compute node; generating a remap window region in a memory element of the IO adapter, the remap window region corresponding to a base address register (BAR) of the second compute node; and configuring the remap window region to point to the local memory region of the first compute node, wherein access by the second compute node to the BAR corresponding with the remap window region results in direct access of the local memory region of the first compute node by the second compute node.
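As a hedged illustration of the adapter-side flow in the abstract, the C sketch below keeps a small table of remap window regions: when the exporting node sends a memory request, the adapter claims a free window backing an offset in the accessing node's BAR and points it at the exporter's local memory region. All names (remap_window, program_window, the node identifiers) and the fixed window layout are hypothetical; the patent does not publish this interface.

/* Illustrative sketch of the adapter-side flow from the abstract.
 * Structures, names, and layout are hypothetical. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

#define NUM_WINDOWS 64
#define WINDOW_SIZE (1u << 20)            /* assume fixed 1 MiB windows */

struct remap_window {
    bool     in_use;
    int      owner;                       /* node exporting its local memory   */
    int      accessor;                    /* node whose BAR exposes the window */
    uint64_t bar_offset;                  /* offset within the accessor's BAR  */
    uint64_t owner_phys_addr;             /* exported local memory region      */
};

static struct remap_window windows[NUM_WINDOWS];

/* Stand-in for programming the adapter hardware: a real device would write
 * address-translation registers so that accesses to
 * [bar_offset, bar_offset + WINDOW_SIZE) in the accessor's BAR are forwarded
 * to owner_phys_addr in the owner's memory. */
static void program_window(const struct remap_window *w)
{
    printf("window: node %d BAR+0x%llx -> node %d phys 0x%llx\n",
           w->accessor, (unsigned long long)w->bar_offset,
           w->owner, (unsigned long long)w->owner_phys_addr);
}

/* Handle a memory request: `owner` asks the adapter to let `accessor` reach
 * owner_phys. Returns the BAR offset the accessor should use, or UINT64_MAX
 * if no remap window is free. */
static uint64_t handle_memory_request(int owner, int accessor, uint64_t owner_phys)
{
    for (size_t i = 0; i < NUM_WINDOWS; i++) {
        if (!windows[i].in_use) {
            windows[i] = (struct remap_window){
                .in_use = true, .owner = owner, .accessor = accessor,
                .bar_offset = (uint64_t)i * WINDOW_SIZE,
                .owner_phys_addr = owner_phys,
            };
            program_window(&windows[i]);
            return windows[i].bar_offset;
        }
    }
    return UINT64_MAX;                    /* no free remap window */
}

int main(void)
{
    /* Example: node 1 exports physical address 0x80000000 to node 2. */
    uint64_t off = handle_memory_request(1, 2, 0x80000000ull);
    printf("node 2 accesses the region at BAR offset 0x%llx\n",
           (unsigned long long)off);
    return 0;
}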

Description

TECHNICAL FIELD

[0001] This disclosure relates in general to the field of communications and, more particularly, to remote memory access with memory mapped addressing among multiple compute nodes.

BACKGROUND

[0002] Compute nodes such as microservers and hypervisor-based virtual machines executing in a single chassis can provide scaled-out workloads in hyper-scale data centers. Microservers are an emerging class of servers for processing lightweight workloads, in which large numbers (e.g., tens or even hundreds) of relatively lightweight server nodes are bundled together in a shared chassis infrastructure, for example sharing power, cooling fans, and input/output components, thereby eliminating the space and power consumption demands of duplicate infrastructure components. The microserver topology facilitates density, lower power per node, reduced costs, and increased operational efficiency. Microservers are generally based on small form-factor, system-on-a-chip (SoC) boards, which pack processing capability...

Claims


Application Information

IPC(8): G06F15/173; G06F13/42; H04L29/08; H04L29/06
CPC: G06F15/17331; G06F13/4282; H04L67/1097; H04L69/16; Y02D10/00; H04L67/133
Inventor: BORIKAR, SAGAR
Owner: CISCO TECH INC