Method and system for providing indeterminate read data latency in a memory system

A memory system and read data technology, applied in data switching networks, frequency-division multiplexing, instruments, and similar fields. It addresses the problems of additional gaps being added to the read data packets that are returned, latency being added to the average read operation, and hubs being unable to use indeterminate techniques to return read data faster or slower than normal; the technology facilitates indeterminate read data latency and minimizes read data latency.

Status: Inactive | Publication Date: 2007-08-09
IBM CORP

AI Technical Summary

Benefits of technology

[0007] Exemplary embodiments include a hub device in a memory system. The hub device includes a device for receiving data packets, an upstream driver for transmitting data packets to a memory controller via an upstream channel, and a mechanism including instructions for facilitating indeterminate read data latency. The device for receiving data packets includes an upstream receiver for receiving upstream data packets from a downstream hub device and a memory interface for receiving local data packets from a local storage device. Each data packet is in a frame format that includes a frame start indicator and an identification tag for use by the memory controller in associating the data packet with its corresponding read instruction. The instructions in the mechanism facilitate determining if a local data packet has been received. If a local data packet has been received, then the local data packet is stored into a buffer device. The instructions also facilitate determining if the buffer device contains a data packet and determining if the upstream driver is idle. If the buffer contains a data packet and the upstream driver is idle, then the data packet is transmitted to the upstream driver. The instructions further facilitate determining if an upstream data packet has been received. If an upstream data packet has been received and the upstream driver is not idle, then the upstream data packet is stored into the buffer device. If an upstream data packet has been received and the buffer device does not contain a data packet and the upstream driver is idle, then the upstream data packet is transmitted to the upstream driver. If the upstream driver is not idle, then transmission of any data packet already in progress to the upstream driver continues.
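
The buffering and merge behavior described above can be summarized as a simple per-cycle policy. The following Python sketch is illustrative only and is not taken from the patent: the names (HubMergeLogic, ReadDataPacket), the single shared buffer, and the fixed driver-busy period are assumptions made to show how local read data is buffered, how buffered data drains onto an idle upstream driver, and how upstream data from a downstream hub is either forwarded immediately or parked in the buffer.

```python
# Minimal sketch of the hub-side merge policy; all names and timings are assumptions.
from collections import deque
from dataclasses import dataclass
from typing import Optional


@dataclass
class ReadDataPacket:
    frame_start: bool      # frame start indicator
    tag: int               # identification tag: which read command this answers
    source_hub: int        # hub device the data was read from
    payload: bytes = b""


class HubMergeLogic:
    """Merges local read data and upstream (downstream-hub) data onto one upstream channel."""

    def __init__(self, driver_busy_cycles: int = 4):
        self.buffer = deque()            # controller channel buffer (CCB)
        self.driver_busy_for = 0         # cycles until the upstream driver is idle again
        self.driver_busy_cycles = driver_busy_cycles

    def upstream_driver_idle(self) -> bool:
        return self.driver_busy_for == 0

    def send_upstream(self, packet: ReadDataPacket) -> ReadDataPacket:
        # Hand the packet to the upstream driver; the driver stays busy while it transmits.
        self.driver_busy_for = self.driver_busy_cycles
        return packet

    def cycle(self,
              local: Optional[ReadDataPacket] = None,
              upstream: Optional[ReadDataPacket] = None) -> Optional[ReadDataPacket]:
        """One decision cycle; returns the packet placed on the upstream channel, if any."""
        sent = None

        # Local read data is always stored into the buffer device first.
        if local is not None:
            self.buffer.append(local)

        # If the buffer contains a data packet and the driver is idle, drain the buffer.
        if self.buffer and self.upstream_driver_idle():
            sent = self.send_upstream(self.buffer.popleft())

        # An upstream packet is forwarded only if the driver is idle and the buffer is
        # empty; otherwise it is stored into the buffer behind anything already waiting.
        if upstream is not None:
            if self.upstream_driver_idle() and not self.buffer and sent is None:
                sent = self.send_upstream(upstream)
            else:
                self.buffer.append(upstream)

        # If the driver is busy, the packet already in progress simply keeps transmitting.
        if self.driver_busy_for > 0 and sent is None:
            self.driver_busy_for -= 1

        return sent
```
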
[0008] Exemplary embodiments include a memory subsystem with one or more memory modules. The memory modules include one or more memory devices connected to a memory controller by a daisy chained channel. The read data is returned to the memory controller using a frame format that includes an identification tag and frame start indicator. The memory system also includes one or more hub devices on the memory modules for buffering address, commands and data. The hub devices include controller channel buffers that are used in conjunction with a preemptive local data merge algorithm to minimize read data latency and enable indeterminate read data return times to the memory controller.
[0009] Further exemplary embodiments include a memory system with one or more memory modules. The memory modules include memory devices that are connected to a memory controller by a daisy chained channel. The read data is returned to the memory controller using a frame format that includes an identification tag and frame start indicator. The memory system also includes one or more hub devices connected to the memory modules for buffering address, commands and data. The hub devices include controller channel buffers that are used in conjunction with a preemptive local data merge algorithm to minimize read data latency and enable indeterminate read data return times to the memory controller.
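
For illustration, one possible encoding of the frame format referenced in these embodiments is sketched below. The listing here does not specify field widths, so the one-byte frame start indicator, one-byte hub identifier, and two-byte command tag are assumptions chosen only to show how a frame start indicator plus an identification tag travels ahead of the read data and lets the controller associate the returned data with the command that requested it.

```python
# Hypothetical frame layout: [frame start][hub id][command tag][payload...]
import struct

FRAME_START = 0xA5  # assumed start-of-frame marker value


def encode_read_data_frame(hub_id: int, command_tag: int, data: bytes) -> bytes:
    """Pack the hub ID and command tag ahead of the read data payload."""
    header = struct.pack(">BBH", FRAME_START, hub_id, command_tag)
    return header + data


def decode_read_data_frame(frame: bytes):
    """Return (hub_id, command_tag, payload); raise if the start indicator is missing."""
    start, hub_id, command_tag = struct.unpack(">BBH", frame[:4])
    if start != FRAME_START:
        raise ValueError("missing frame start indicator")
    return hub_id, command_tag, frame[4:]


# Example: data read on hub 3 in response to the command the controller tagged 0x0042.
frame = encode_read_data_frame(hub_id=3, command_tag=0x0042, data=b"\x11\x22\x33\x44")
print(decode_read_data_frame(frame))
```
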

Problems solved by technology

During run time operations, these two restrictions result in additional gaps being added to packets of read data that are returned from the memory modules.
This adds latency to the average read operation.
In addition, hubs are not able to use indeterminate techniques to return read data faster or slower than normal.
Preventing data corruption by avoiding data collisions is especially complicated as hub devices merge local read data onto a cascaded memory controller channel.




Embodiment Construction

[0016] Exemplary embodiments utilize controller channel buffers (CCBs), read data frame formats with identification tags and a preemptive data merge technique to enable minimized and indeterminate read data latency. Exemplary embodiments allow memory modules to return read data to a memory controller at a time that is not predetermined. Identification tag information is added to the read data packet to indicate the read command that produced the data, as well as the hub device from which the data was read. The identification tag information is utilized by the controller to match the read data packet to the read commands issued by the controller. By using the identification tag information, read data can be returned in an order that is different from the issue order of the corresponding read commands.
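
As an illustration of the controller-side tag matching, the sketch below (hypothetical names and an assumed 16-bit tag space, not details from the patent) shows how a controller can allocate a tag for each read command it issues and retire commands as tagged read data returns, regardless of arrival order.

```python
# Illustrative sketch of out-of-order read completion keyed by identification tag.
class OutstandingReads:
    """Tracks issued read commands by tag and retires them as tagged data returns."""

    def __init__(self):
        self._next_tag = 0
        self._pending = {}   # tag -> address (or any per-command bookkeeping)

    def issue(self, address: int) -> int:
        tag = self._next_tag
        self._next_tag = (self._next_tag + 1) & 0xFFFF   # assumed 16-bit tag space
        self._pending[tag] = address
        return tag

    def complete(self, tag: int, data: bytes) -> int:
        # Read data may arrive in any order; the tag, not arrival order,
        # identifies the command being answered.
        return self._pending.pop(tag)


reads = OutstandingReads()
t0 = reads.issue(0x1000)
t1 = reads.issue(0x2000)
# Data for the second command can legitimately return first.
print(hex(reads.complete(t1, b"\xaa" * 8)))   # 0x2000
print(hex(reads.complete(t0, b"\xbb" * 8)))   # 0x1000
```
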

[0017] Exemplary embodiments also provide a preemptive data merge process to prevent data collisions on the upstream channel when implementing the indeterminate read data latency. A CCB is added to the hu...



Abstract

A method and system for providing indeterminate read data latency in a memory system. The method includes determining if a local data packet has been received. If a local data packet has been received, then the local data packet is stored into a buffer device. The method also includes determining if the buffer device contains a data packet and determining if an upstream driver for transmitting data packets to a memory controller via an upstream channel is idle. If the buffer contains a data packet and the upstream driver is idle, then the data packet is transmitted to the upstream driver. The method further includes determining if an upstream data packet has been received. The upstream data packet is in a frame format that includes a frame start indicator and an identification tag for use by the memory controller in associating the upstream data packet with its corresponding read instruction. If an upstream data packet has been received and the upstream driver is not idle, then the upstream data packet is stored into the buffer device. If an upstream data packet has been received and the buffer device does not contain a data packet and the upstream driver is idle, then the upstream data packet is transmitted to the upstream driver. If the upstream driver is not idle, then transmission of any data packet already in progress to the upstream driver continues.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application is a continuation of U.S. patent application Ser. No. 11/289,193 filed Nov. 28, 2005, the contents of which are incorporated by reference herein in their entirety.

BACKGROUND OF THE INVENTION

[0002] This invention relates to memory systems composed of hub devices connected to a memory controller by a daisy chained channel. The hub devices are attached to, or reside upon, memory modules that contain memory devices. More particularly, this invention relates to the flow control of read data and the identification of read data returned to the controller by each hub device.

[0003] Many high performance computing main memory systems use multiple fully buffered memory modules connected to a memory controller by one or more channels. The memory modules contain a hub device and multiple memory devices. The hub device fully buffers command, address and data signals between the memory controller and the memory devices. The flow of...

Claims


Application Information

Patent Type & Authority: Applications (United States)
IPC (8): H04J1/16
CPC: G06F13/1673; G06F13/1657
Inventors: COTEUS, PAUL W.; GOWER, KEVIN C.; MAULE, WARREN E.; TREMAINE, ROBERT B.
Owner: IBM CORP