
Cache device based on adaptive routing and scheduling policy

A cache and scheduling strategy technology, applied in memory systems, memory architecture access/allocation, instruments, etc. It addresses the problems of increased storage-medium operation frequency, shortened storage system working life, and wasted cache bandwidth, and achieves the effects of cache multiplexing, reduced hardware design cost, and optimized data transmission delay time.

Active Publication Date: 2016-11-09
SHANGHAI SPACEFLIGHT INST OF TT&C & TELECOMM
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Because data transmission is bursty and the storage medium of the storage system cannot physically operate in duplex, cache bandwidth is largely wasted; the operation frequency of the storage medium also rises accordingly, which shortens the working life of the storage system.


Examples


Embodiment Construction

[0022] The present invention will be further described below with reference to the accompanying drawings and specific embodiments.

[0023] The invention will now be described in more detail with reference to the drawings, which show embodiments of the invention. However, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.

[0024] As shown in Figure 1, the cache device based on an adaptive routing and scheduling strategy provided by the present invention includes a controller and an external cache memory 2.

[0025] The controller includes a cache control module 101, a rate prediction module 102, an input cache I module 103, an input cache II module 107, an output cache I module 108, an output cache II module 106, an adaptive routing...
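As a rough reading aid only, and not part of the patent text, the module list in paragraph [0025] can be sketched as a C-style layout. All type and field names below are assumptions made for illustration, and the trailing modules are taken from the abstract because paragraph [0025] is truncated here; the patent describes hardware modules, not a software interface.

    #include <stdint.h>
    #include <stddef.h>

    typedef struct {              /* one internal buffer (input/output cache I or II) */
        uint8_t *data;
        size_t   depth;           /* buffer depth in bytes */
        size_t   level;           /* current fill level */
    } internal_cache_t;

    typedef struct {
        internal_cache_t input_cache_1;    /* input cache I module 103   */
        internal_cache_t input_cache_2;    /* input cache II module 107  */
        internal_cache_t output_cache_1;   /* output cache I module 108  */
        internal_cache_t output_cache_2;   /* output cache II module 106 */
        /* remaining modules, per the abstract: cache control (101), rate
           prediction (102), adaptive routing and scheduling, SSD control */
        void (*cache_control)(void);
        void (*rate_predict)(void);
        void (*route_and_schedule)(void);
        void (*ssd_control)(void);
    } controller_t;

    /* The external cache memory (item 2 in Figure 1) sits outside the
       controller and is reached through the cache control module. */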



Abstract

The invention discloses a cache device based on an adaptive routing and scheduling policy. The cache device comprises a controller and an external cache memory component. The controller comprises a rate prediction module, input buffer modules I and II, output buffer modules I and II, an adaptive routing and scheduling module, a cache memory control module, and an SSD control module. The adaptive routing and scheduling module realizes dynamic routing and storage of the data link by adopting the AOS virtual channel dynamic scheduling protocol. The cache memory component builds a 'memory bank' from the external cache memory to expand the data storage depth. In the invention, a three-level data route and buffering mechanism is established using the controller's two levels of internal buffering together with the external cache memory; through the dynamic routing and scheduling method, the high-speed input/output data channel of the external cache memory component is multiplexed for buffering, increasing the utilization of the cache memory component's storage bandwidth.
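To make the three-level routing idea in the abstract concrete, here is a minimal sketch, written under assumptions, of how a rate-prediction result might steer a frame either through the controller's internal buffers or out to the external 'memory bank'. The threshold logic, names, and units are illustrative and are not taken from the patent.

    #include <stdbool.h>
    #include <stddef.h>

    typedef enum {
        ROUTE_INTERNAL,   /* stay in the controller's two levels of internal buffering */
        ROUTE_EXTERNAL    /* third level: park the frame in the external memory bank   */
    } route_t;

    typedef struct {
        size_t predicted_in_rate;   /* bytes/s expected on the input link (rate prediction) */
        size_t drain_rate;          /* bytes/s the output/SSD side can currently absorb     */
        size_t internal_free;       /* free space remaining in the internal caches          */
    } link_state_t;

    /* Decide, per frame of frame_len bytes, which level of the route absorbs it. */
    static route_t route_frame(const link_state_t *s, size_t frame_len)
    {
        bool fits_internally = frame_len <= s->internal_free;
        bool drain_keeps_up  = s->predicted_in_rate <= s->drain_rate;
        if (fits_internally && drain_keeps_up)
            return ROUTE_INTERNAL;     /* short path: no external memory access needed */
        return ROUTE_EXTERNAL;         /* burst: buffer the frame in the memory bank   */
    }

Under this reading, the single high-speed external channel is time-shared between links rather than dedicated per link, which is how the abstract's cache multiplexing and bandwidth-utilization gains would arise.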

Description

Technical Field

[0001] The invention relates to a high-speed cache design method, and in particular to a high-speed cache device based on adaptive routing and scheduling strategies.

Background Technique

[0002] With the rapid development of information technology, the amount of information generated globally each year is growing explosively, and the storage and processing of massive amounts of information place very high demands on storage system performance. Applying cache technology provides interface compatibility between the storage system and other systems, improves the speed at which the system accesses data, and extends the service life of the storage system.

[0003] At present, most domestic storage system caching technologies rely on ping-pong operation of an external cache memory. In applications with stringent system latency requirements, it is often necessary to configure two caches, one for the storage link and one for the playback link. Due to the...
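For contrast with the background described in [0003], the following is a small illustration, not taken from the patent, of the conventional ping-pong arrangement: each link owns a dedicated pair of buffer halves and alternates between them, so a storage link and a playback link each need their own pair even when one of them is idle. Names and sizes are assumptions.

    #include <stdint.h>
    #include <string.h>

    #define HALF_SIZE 4096u          /* illustrative half-buffer size */

    typedef struct {
        uint8_t half[2][HALF_SIZE];  /* two fixed halves owned by one link   */
        int     active;              /* half currently being filled (0 or 1) */
    } pingpong_t;

    /* Writer fills the active half, hands it to the reader, then switches.
       Because the pair is tied to a single link, the bandwidth of an idle
       link's buffers cannot be lent to a busy one. */
    static const uint8_t *pingpong_fill(pingpong_t *pp, const uint8_t *src, size_t n)
    {
        if (n > HALF_SIZE)
            n = HALF_SIZE;                       /* clamp to one half */
        memcpy(pp->half[pp->active], src, n);    /* fill active half  */
        const uint8_t *ready = pp->half[pp->active];
        pp->active ^= 1;                         /* reader drains the other half next */
        return ready;
    }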

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F12/0811
CPC: G06F12/0811; G06F2212/1016; G06F2212/1036
Inventor: 濮建福, 范季夏, 张小峰, 罗唤霖, 陈克寒
Owner: SHANGHAI SPACEFLIGHT INST OF TT&C & TELECOMM