Low latency device interconnect using remote memory access with segmented queues

A low-latency queuing technology, applied in the field of communication between computing devices, that addresses problems such as target failure.

Inactive Publication Date: 2016-03-02
TSX


Problems solved by technology

Conventional queuing mechanisms at the application layer that attempt to satisfy this need contribute a great deal of extra processing and network transport overhead.




Embodiment Construction

[0024] Referring now to FIG. 1, a system providing a low-latency data communication link between computing devices during normal operation is shown generally at 10. It should be understood that system 10 is an illustrative example, and that other systems suitable for low-latency data communication links between computing devices will be apparent to those skilled in the art. System 10 includes a first computing device 12 having a first processor 14 and memory 15, and a second computing device 16 having a second processor 18 and memory 19. The computing devices 12, 16 are interconnected by a first link 20 and a second link 22.

[0025] The computing devices described herein may be computers, servers, or similar devices; the term "computing device" is not intended to be limiting.

[0026] More specifically, first computing device 12 also includes a first communication interface card 24 in communication with first processor 14, and second computing device 16 includes a...



Abstract

A writing application on a computing device can reference a tail pointer to write messages to message buffers that a peer-to-peer data link replicates in memory of another computing device. The message buffers are divided into at least two queue segments, where each segment has several buffers. Messages are read from the buffers by a reading application on one of the computing devices using an advancing head pointer by reading a message from a next message buffer when determining that the next message buffer has been newly written. The tail pointer is advanced from one message buffer to another within a same queue segment after writing messages. The tail pointer is advanced from a message buffer of a current queue segment to a message buffer of a next queue segment when determining that the head pointer does not indicate any of the buffers of the next queue segment.
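The head/tail pointer discipline in the abstract can be sketched as follows. This is a minimal single-process illustration, not the patented implementation: the segment count (2), buffers per segment (4), the per-buffer "newly written" flag, and the function names are all assumptions for the sketch, and the remote-memory replication between the two devices is not modeled.

```python
# Minimal sketch of the segmented queue described in the abstract.
# Assumptions not taken from the patent: 2 segments of 4 buffers each,
# a per-buffer "newly written" flag, and a single process standing in
# for the writer and reader on the two interconnected devices.

NUM_SEGMENTS = 2
BUFFERS_PER_SEGMENT = 4
TOTAL = NUM_SEGMENTS * BUFFERS_PER_SEGMENT

buffers = [None] * TOTAL    # message buffers (replicated over the link)
written = [False] * TOTAL   # "newly written" flag per buffer
head = 0                    # reader: next buffer to consume
tail = 0                    # writer: next buffer to fill

def segment_of(index):
    return index // BUFFERS_PER_SEGMENT

def try_write(msg):
    """Write at tail and advance it; a segment crossing is allowed only
    when the head pointer does not indicate any buffer of the next segment."""
    global tail
    nxt = (tail + 1) % TOTAL
    if segment_of(nxt) != segment_of(tail) and segment_of(head) == segment_of(nxt):
        return False  # reader still inside the next segment: do not cross
    buffers[tail] = msg
    written[tail] = True
    tail = nxt
    return True

def try_read():
    """Read the buffer at head if it was newly written, then advance head."""
    global head
    if not written[head]:
        return None  # nothing new to read
    msg = buffers[head]
    written[head] = False
    head = (head + 1) % TOTAL
    return msg
```

One plausible reading of why the check happens only at segment boundaries: within a segment the writer can advance the tail freely, so it need not consult the (remote) head pointer on every message, only once per segment crossing.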

Description

[0001] Priority Claim

[0002] This application claims priority from U.S. Provisional Application No. 61/834,615, filed June 13, 2013, which is hereby incorporated by reference in its entirety.

Technical Field

[0003] This specification relates generally to communications between computing devices, and more specifically to data communications utilizing remote memory access.

Background

[0004] Society increasingly relies on computing devices and networks to interact and conduct business. To achieve the high levels of availability required of critical systems, unplanned downtime caused by software and hardware defects should be reduced.

[0005] Many modern applications require distributed, cooperative systems in which computing devices can communicate with each other rapidly, often referred to as cluster computing, grid computing, or high-performance computing. Such configurations typically consist of several loosely-coupled or tightly-coupled computing devices exchanging data at exc...

Claims


Application Information

IPC (IPC8): H04L12/863; H04L12/12; H04L12/58; H04L12/879; H04L49/901
CPC: H04L12/6418; H04L47/6295; H04L49/901; G06F15/167; H04L12/06; H04L67/104
Inventor: Gregory Arthurs Allen, Tudor Morosan
Owner TSX