
Queuing system for buffering data from multiple input-stream, system and method

A queuing system and input-stream technology, applied in the field of systems that can receive multiple input streams, which can solve problems such as low memory-usage efficiency.

Status: Inactive. Publication Date: 2008-04-16
NXP BV
Cites: 4, Cited by: 0

AI Technical Summary

Problems solved by technology

To put this into perspective, the number of unused memory locations across all queues at any one time is typically very high, so the memory usage of the conventional multiple-queue, multiple-input system 100 is very inefficient.
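For a rough sense of scale, consider the sketch below. It is purely illustrative: the stream count, per-stream worst-case burst, and aggregate occupancy figures are assumptions, not values from the patent. It only shows why provisioning each stream's queue for its own worst case costs far more memory than a single shared buffer sized for the aggregate worst case.

```c
/* Illustrative arithmetic only; N, B and the aggregate figure are assumptions. */
#include <stdio.h>

int main(void) {
    int N = 8;             /* hypothetical number of input streams            */
    int B = 1024;          /* hypothetical worst-case queue depth per stream  */
    int aggregate = 2048;  /* hypothetical worst-case simultaneous occupancy  */

    printf("dedicated per-stream queues: %d locations\n", N * B);      /* 8192 */
    printf("single shared common buffer: %d locations\n", aggregate);  /* 2048 */
    return 0;
}
```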



Examples


Embodiment Construction

[0033] Figure 2 shows a block diagram of an example of a multiple-input queuing system 200 according to the present invention. The system 200 includes a dual-port memory 220, in which a distributor / arbiter 240 (hereinafter referred to as the distributor 240) controls writing into the memory 220, and a mapper / queue generator 250 (hereinafter referred to as the mapper 250) controls reading out of the memory 220. Write and read operations on the memory 220 are symbolically represented by switch 210 and switch 260, respectively.
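The roles described in [0033] can be pictured with a small data-structure sketch. The outline below is only an illustration; the constants P, W and N_STREAMS, and all identifiers, are assumptions introduced here, not values or names taken from the patent.

```c
/* Illustrative data-structure view of system 200 from [0033]; all names and
 * sizes are assumptions made for this sketch.                               */
#include <stdint.h>

#define P          64   /* addressable cells in memory 220 (assumed)         */
#define W           4   /* cell width in bytes, fits one data item (assumed) */
#define N_STREAMS   8   /* number of input streams (assumed)                 */

/* Dual-port memory 220: a single common pool of P cells shared by all streams. */
typedef struct { uint8_t cell[P][W]; } memory220_t;

/* Distributor / arbiter 240: chooses which free cell each incoming data item
 * is written to; the chosen index plays the role of write switch 210.        */
typedef struct { int free_list[P]; int free_count; } distributor240_t;

/* Mapper / queue generator 250: per-stream FIFO of the cell indices allocated
 * to that stream, so items can be read out per stream via switch 260.        */
typedef struct {
    int fifo[N_STREAMS][P];
    int head[N_STREAMS], tail[N_STREAMS], count[N_STREAMS];
} mapper250_t;
```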

[0034] As shown in Figure 2, the memory 220 includes P addressable memory cells, each having a width W sufficient to accommodate any data item provided by an input stream 101. Referring to the discussion of the prior-art system 100 of Figure 1 above, and using conventional queuing-theory techniques, the number P of storage locations required for a given level of confidence in avoiding overflow of the memory 220 can be determined from expected inpu...
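Continuing the hypothetical sketch above (same assumed types and constants), the write path below shows where the overflow that the dimensioning of P guards against would occur: a write fails only when the shared free list is empty, regardless of which stream the item belongs to. This is a sketch under those assumptions, not the patent's implementation.

```c
/* Illustrative write path, continuing the sketch above (all names assumed).
 * The free list is assumed to have been initialised with all P cell indices.
 * Overflow of memory 220 corresponds to the free list running empty.        */
int write_item(memory220_t *mem, distributor240_t *dist, mapper250_t *map,
               int stream, const uint8_t item[W]) {
    if (dist->free_count == 0)
        return -1;                                  /* overflow: no free cell  */
    int loc = dist->free_list[--dist->free_count];  /* distributor allocates   */
    for (int b = 0; b < W; b++)
        mem->cell[loc][b] = item[b];                /* write via "switch 210"  */
    map->fifo[stream][map->tail[stream]] = loc;     /* record location mapping */
    map->tail[stream] = (map->tail[stream] + 1) % P;
    map->count[stream]++;
    return 0;
}
```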



Abstract

A queuing system uses a common buffer for receiving input data from multiple inputs, by allocating memory-elements in the common buffer to each input-stream as the streams provide their input data. To allow an independently controlled unloading of the individual data-items from the multiple-input common buffer, the system maintains a mapping of the memory locations of the buffer that are allocated to each data-item in each input-stream. To minimize the memory and overhead associated with maintaining a mapping of each data-item, the memory locations that are allocated to each input-stream are maintained in a sequential, first-in, first-out queue. When a subsequent receiving device acknowledges that it is ready to receive a data-item from a particular input-stream, the identification of the allocated memory location is removed from the input-stream's queue, and the data-item at the allocated memory location in the common buffer is provided to the receiving device.
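The unloading behaviour described in the abstract corresponds to the read path in the hypothetical sketch above (same assumed types; not the patent's implementation): on an acknowledgement for a given stream, the oldest allocated location is popped from that stream's FIFO, the data item is read out of the common buffer, and the location is returned to the shared free pool for reuse by any stream.

```c
/* Illustrative read/unload path, continuing the sketch above (all names assumed). */
int read_item(memory220_t *mem, distributor240_t *dist, mapper250_t *map,
              int stream, uint8_t item_out[W]) {
    if (map->count[stream] == 0)
        return -1;                                    /* nothing queued here   */
    int loc = map->fifo[stream][map->head[stream]];   /* oldest allocated cell */
    map->head[stream] = (map->head[stream] + 1) % P;
    map->count[stream]--;
    for (int b = 0; b < W; b++)
        item_out[b] = mem->cell[loc][b];              /* read via "switch 260" */
    dist->free_list[dist->free_count++] = loc;        /* cell becomes reusable */
    return 0;
}
```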

Description

Technical Field
[0001] The present invention relates to the field of computer and communication systems, and more particularly to a system that can receive multiple input streams that are routed to a common output port.
Background Art
[0002] Multiple-input, common-output systems are common in the art. For example, multiple hosts can communicate data with a shared server; multiple processors can access a shared storage device; multiple data streams can be sent over a shared transmission medium; and so on. Typically, the input to a multiple-input system is characterized by bursts of activity on one or more input streams. During these bursts of activity, the arrival rate of incoming data often exceeds the allowable data transmission rate to the underlying receiving system, so buffers must be provided to prevent data loss.
[0003] Traditionally, one of two systems can be used to manage the routing of multiple input streams to a common output, depending on whether the de...


Application Information

Patent Type & Authority: Patent (China)
IPC (IPC8): G06F5/06, G06F7/00, H04L12/56, H04L29/06, G06F13/38, G06F3/00, G06F13/00
CPC: H04L49/901, H04L49/90, H04L47/621
Inventor(s): V·安安德, R·K·阿拉帕利
Owner: NXP BV