Method for improving IO (input/output) parallelism and reducing small IO delay by utilizing multiple request queues

A multi-request-queue technique, applied to multiprogramming devices, resource allocation, and the like. It addresses problems such as reduced IO merging and scheduling efficiency, inability to make full use of multi-core systems, and limited IO processing parallelism, with the effects of reducing delay, improving processing efficiency, and increasing IO throughput.

Active Publication Date: 2012-12-19
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

In a single-request-queue environment, there is no mechanism for distinguishing small IO requests from large ones. All IO requests are processed in the same way and experience similar delays, which cannot satisfy small IO requests with strict real-time requirements.
Under a heavy load of small IO requests, a single request queue becomes the object of contention among all IO requests: the efficiency of IO merging and IO scheduling drops, the multi-core system cannot be fully utilized, IO processing parallelism is severely limited, and the processing delay of small IO requests grows, so their real-time requirements cannot be met.
Existing methods for meeting the real-time requirements of small IO requests mainly modify the scheduler of the request queue to give small IO requests priority, which fails to solve the problem that a single request queue is the bottleneck of parallel processing.



Detailed Description of the Embodiments

[0030] The present invention will be described in further detail below in conjunction with the accompanying drawings and specific embodiments. The following examples are illustrative only, and are not intended to limit the present invention.

[0031] The present invention aims to improve IO parallel processing capability and meet the real-time requirements of small IOs by implementing multiple request queues and request-queue selection strategies in a multi-core environment.

[0032] The present invention first establishes multiple request queues. To implement the multi-request queues, a unified interface is created for the block device, consisting of three sub-interfaces: enable, mode, and bind. The enable interface turns the multi-request-queue mechanism on or off, the mode interface sets the number of request queues to use, and the bind interface selects the queue-selection policy. The operation mode of the multi-request queues can be modified through these three interface...
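As an illustration only, the sketch below shows how such per-device enable/mode/bind controls could be driven from user space if they were exposed as sysfs attribute files; the /sys/block/<dev>/mqueue/* paths and the accepted values are assumptions made for this example and are not specified in the patent text.

    /* Sketch: drive hypothetical per-device multi-request-queue controls
     * (enable, mode, bind) through sysfs attribute files.
     * The /sys/block/<dev>/mqueue/* paths and values are assumptions. */
    #include <stdio.h>

    /* Write one value to a (hypothetical) sysfs attribute of a block device. */
    static int write_attr(const char *dev, const char *attr, const char *val)
    {
        char path[256];
        FILE *f;

        snprintf(path, sizeof(path), "/sys/block/%s/mqueue/%s", dev, attr);
        f = fopen(path, "w");
        if (!f) {
            perror(path);
            return -1;
        }
        fputs(val, f);
        fclose(f);
        return 0;
    }

    int main(void)
    {
        const char *dev = "sdb";          /* example block device        */

        write_attr(dev, "enable", "1");   /* switch multi-queues on      */
        write_attr(dev, "mode",   "8");   /* use eight request queues    */
        write_attr(dev, "bind",   "cpu"); /* pick the per-CPU policy     */
        return 0;
    }

Keeping the three controls as separate attribute files would let each setting be changed independently at run time, which matches the description of enable, mode, and bind as independent sub-interfaces.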



Abstract

The invention discloses a method for improving IO (input/output) parallelism and reducing small IO delay by utilizing multiple request queues. The method comprises: establishing multiple request queues, and using a selection strategy to direct each IO request to the corresponding request queue for processing, so that IO requests are handled in parallel. The selection strategy comprises: binding each process to one request queue, so that the IO requests of multiple processes are evenly distributed over the multiple request queues; and binding each CPU (central processing unit) to one request queue, so that the IO requests on multiple CPUs are evenly distributed over the multiple request queues. The invention further discloses an application of the method in an FC or FCoE storage system. A large number of IO requests are distributed among the multiple request queues according to the strategy, realizing parallel processing of IO requests, improving their processing efficiency and increasing IO throughput; by allocating more queues to small IO requests, their real-time processing efficiency is improved and their processing delay is reduced.
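To make the two selection strategies concrete, the sketch below (not taken from the patent) maps a request to one of nr_queues request queues either by the PID of the submitting process or by the CPU issuing the request; the function names and the simple modulo mapping are illustrative assumptions, not the patent's exact algorithm.

    /* Sketch of the two queue-selection strategies from the abstract:
     * route a request to one of nr_queues queues either per submitting
     * process (PID) or per issuing CPU. */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <unistd.h>
    #include <sched.h>

    enum bind_policy { BIND_PER_PROCESS, BIND_PER_CPU };

    /* Pick a request-queue index for the current execution context. */
    static unsigned int select_queue(enum bind_policy policy, unsigned int nr_queues)
    {
        switch (policy) {
        case BIND_PER_PROCESS:
            /* Requests from one process always land in the same queue,
             * so different processes spread across the queues. */
            return (unsigned int)getpid() % nr_queues;
        case BIND_PER_CPU:
        default:
            /* Requests issued on one CPU always land in the same queue,
             * so different CPUs spread across the queues. */
            return (unsigned int)sched_getcpu() % nr_queues;
        }
    }

    int main(void)
    {
        unsigned int nr_queues = 8;

        printf("per-process binding -> queue %u\n",
               select_queue(BIND_PER_PROCESS, nr_queues));
        printf("per-CPU binding     -> queue %u\n",
               select_queue(BIND_PER_CPU, nr_queues));
        return 0;
    }

Either mapping keeps requests from one source on one queue, so independent sources can proceed in parallel on different queues, which is the effect the abstract describes.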

Description

Technical Field
[0001] The invention relates to the technical field of computer storage, and in particular to a method for improving IO parallelism and reducing small IO delay.
Background Art
[0002] At present, with the development of high-performance computing and application services, more high-performance storage area networks are being built on high-speed networks and dedicated protocols, mainly including IP-SAN based on Ethernet and the iSCSI protocol, FC-SAN based on Fibre Channel and the FCP protocol, and the newer FCoE storage area network based on Ethernet and Fibre Channel. In FC-SAN and FCoE storage area networks, the storage device of the target system is mounted on the initiator system through the storage network protocol and accessed as a block device. With the development of storage network technology, and especially the introduction of 10G networks, the bandwidth with which the initiator system accesses the target device has also been...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F 9/50
Inventors: 刘景宁, 童薇, 冯丹, 吴龙飞, 林超
Owner: HUAZHONG UNIV OF SCI & TECH