
Method and system for providing indeterminate read data latency in a memory system

A memory system and read data technology, applied in data switching networks, frequency-division multiplexes, instruments, and related fields. It addresses the problems that additional gaps are added to returned read data packets, that latency is added to the average read operation, and that hubs cannot use indeterminate techniques to return read data faster or slower than normal. The technology facilitates indeterminate read data latency, facilitates determining whether data packets have been received, and minimizes read data latency.

Inactive Publication Date: 2007-08-09
IBM CORP

AI Technical Summary

Benefits of technology

The patent describes a method and device for improving the speed of data transmission in a memory system. The method involves storing received data packets into a buffer device and transmitting them to an upstream driver for transmission to a memory controller. The device includes a mechanism for determining if a data packet has been received and if the upstream driver is idle, and then transmitting or storing the data packet accordingly. This method helps to minimize read data latency and enable indeterminate read data return times to the memory controller. The patent also describes a memory system with one or more memory modules and hub devices for buffering and transmitting data to the memory controller. The system uses a frame format with an identification tag and frame start indicator for efficient data transmission.

Problems solved by technology

During run-time operations, these two restrictions result in additional gaps being added to packets of read data returned from the memory modules.
This adds latency to the average read operation.
In addition, hubs are not able to use indeterminate techniques to return read data faster or slower than normal.
Preventing data corruption by avoiding data collisions is especially complicated as hub devices merge local read data onto a cascaded memory controller channel.

Method used

Embodiment Construction

[0016] Exemplary embodiments utilize controller channel buffers (CCBs), read data frame formats with identification tags and a preemptive data merge technique to enable minimized and indeterminate read data latency. Exemplary embodiments allow memory modules to return read data to a memory controller at an unpredicted time. Identification tag information is added to the read data packet to indicate the read command that the data is a result of, as well as the hub where the data was read. The identification tag information is utilized by the controller to match the read data packet to the read commands issued by the controller. By using the identification tag information, read data can be returned in an order that is different from the issue order of the corresponding read commands.
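
The following is a minimal sketch in C of how a read data frame carrying an identification tag could be matched to outstanding read commands at the controller, as paragraph [0016] describes. The struct layout, field widths, and function names are illustrative assumptions; the patent does not specify them.

```c
/* Hypothetical read data frame and controller-side tag matching.
 * Field widths (4-bit hub id, 8-bit command tag) and the payload size
 * are assumptions for illustration only. */
#include <stdint.h>
#include <stdbool.h>
#include <stddef.h>

typedef struct {
    bool     frame_start;   /* frame start indicator                        */
    uint8_t  command_tag;   /* tag identifying the originating read command */
    uint8_t  hub_id;        /* hub (memory module) that sourced the data    */
    uint8_t  payload[64];   /* read data payload                            */
} read_data_frame_t;

typedef struct {
    bool    valid;          /* entry tracks an outstanding read command */
    uint8_t command_tag;
    uint8_t hub_id;
} outstanding_read_t;

/* Because each frame carries its own identification tag, read data may
 * return in any order relative to the order the read commands were issued;
 * the controller simply scans its table of outstanding reads for a match. */
static int match_read_data(const read_data_frame_t *frame,
                           const outstanding_read_t *table, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (table[i].valid &&
            table[i].command_tag == frame->command_tag &&
            table[i].hub_id == frame->hub_id) {
            return (int)i;   /* index of the read command this data answers */
        }
    }
    return -1;               /* no matching outstanding read */
}
```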

[0017] Exemplary embodiments also provide a preemptive data merge process to prevent data collisions on the upstream channel when implementing the indeterminate read data latency. A CCB is added to the hu...

Abstract

A method and system for providing indeterminate read data latency in a memory system. The method includes determining if a local data packet has been received. If a local data packet has been received, then the local data packet is stored into a buffer device. The method also includes determining if the buffer device contains a data packet and determining if an upstream driver for transmitting data packets to a memory controller via an upstream channel is idle. If the buffer contains a data packet and the upstream driver is idle, then the data packet is transmitted to the upstream driver. The method further includes determining if an upstream data packet has been received. The upstream data packet is in a frame format that includes a frame start indicator and an identification tag for use by the memory controller in associating the upstream data packet with its corresponding read instruction. If an upstream data packet has been received and the upstream driver is not idle, then the upstream data packet is stored into the buffer device. If an upstream data packet has been received, the buffer device does not contain a data packet, and the upstream driver is idle, then the upstream data packet is transmitted to the upstream driver. If the upstream driver is not idle, then transmission of any in-progress data packets to the upstream driver continues.
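
As a reading aid, here is a per-cycle sketch in C of the hub's buffering and merge decisions as stated in the abstract. The controller channel buffer (CCB) is modeled as a simple FIFO; all names, the FIFO depth, and the helper functions are assumptions, not taken from the patent.

```c
/* Sketch of the hub's buffering and preemptive merge decisions.
 * The CCB is modeled as a FIFO; helpers are declared but not defined. */
#include <stdbool.h>
#include <stddef.h>

typedef struct packet packet_t;

typedef struct {
    packet_t *entries[16];   /* assumed FIFO depth */
    int head, tail, count;
} ccb_fifo_t;

bool      ccb_empty(const ccb_fifo_t *b);
void      ccb_push(ccb_fifo_t *b, packet_t *p);
packet_t *ccb_pop(ccb_fifo_t *b);

bool driver_idle(void);            /* upstream driver status          */
void driver_send(packet_t *p);     /* begin transmitting a packet     */
void driver_continue(void);        /* continue an in-progress packet  */

void hub_cycle(ccb_fifo_t *ccb, packet_t *local_pkt, packet_t *upstream_pkt)
{
    /* Local read data is always staged in the buffer first. */
    if (local_pkt != NULL)
        ccb_push(ccb, local_pkt);

    if (driver_idle()) {
        if (!ccb_empty(ccb)) {
            /* Buffered data takes the idle slot on the upstream channel. */
            driver_send(ccb_pop(ccb));
            if (upstream_pkt != NULL)
                ccb_push(ccb, upstream_pkt);   /* driver is now busy */
        } else if (upstream_pkt != NULL) {
            /* Nothing buffered: pass the upstream packet straight through. */
            driver_send(upstream_pkt);
        }
    } else {
        /* Driver busy: keep the in-progress packet going and buffer any
         * upstream arrival to avoid a collision on the upstream channel. */
        driver_continue();
        if (upstream_pkt != NULL)
            ccb_push(ccb, upstream_pkt);
    }
}
```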

Description

CROSS-REFERENCE TO RELATED APPLICATIONS [0001] This application is a continuation of U.S. patent application Ser. No. 11/289,193 filed Nov. 28, 2005, the contents of which are incorporated by reference herein in their entirety. BACKGROUND OF THE INVENTION [0002] This invention relates to memory systems comprised of hub devices connected to a memory controller by a daisy chained channel. The hub devices are attached to, or reside upon, memory modules that contain memory devices. More particularly, this invention relates to the flow control of read data and the identification of read data returned to the controller by each hub device. [0003] Many high performance computing main memory systems use multiple fully buffered memory modules connected to a memory controller by one or more channels. The memory modules contain a hub device and multiple memory devices. The hub device fully buffers command, address and data signals between the memory controller and the memory devices. The flow of...
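
For orientation, the daisy-chained (cascaded) topology described in the background can be sketched as below. Struct and field names and the per-module device count are assumptions used only to illustrate the arrangement of controller, hubs, and memory devices.

```c
/* Illustrative model of the cascaded memory channel: the controller drives
 * the first hub, and each hub forwards the channel to the next module. */
#include <stddef.h>

#define DEVICES_PER_MODULE 8          /* assumed DRAM devices per module */

typedef struct memory_hub {
    struct memory_hub *downstream;    /* next hub in the cascade, or NULL    */
    struct memory_hub *upstream;      /* previous hub, or the controller end */
    void *memory_devices[DEVICES_PER_MODULE];
} memory_hub_t;

typedef struct {
    memory_hub_t *first_hub;          /* head of the daisy-chained channel */
} memory_controller_t;
```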

Claims


Application Information

Patent Type & Authority: Applications (United States)
IPC(8): H04J1/16
CPC: G06F13/1673; G06F13/1657
Inventors: COTEUS, PAUL W.; GOWER, KEVIN C.; MAULE, WARREN E.; TREMAINE, ROBERT B.
Owner: IBM CORP