Caching system and method of data caching

A caching system and data caching method, applicable to memory systems, electrical digital data processing, and memory address allocation/relocation. It addresses problems such as low transmission rate and inconsistent channel speeds caused by differing error correction times, thereby improving the data transmission rate.

Active Publication Date: 2012-03-21
NETAK TECH KO LTD

AI Technical Summary

Problems solved by technology

However, the speed of each channel on the low-speed interface side is usually inconsistent. For example, in a flash memory system, whether the flash data contains errors, as well as the number and location of those errors, leads to different error correction times, so the speed of even the same channel will change over time.



Examples


Example Embodiment

[0025] Example one

[0026] See Figure 2, which is a schematic structural diagram of an embodiment of the caching system of this application. As shown in Figure 2, the system includes a high-speed interface 201, a cache group 202, a low-speed interface 203, a status register 204, and a conditional command sequence 205. The high-speed interface 201 is one channel, the low-speed interface 203 is at least two channels, and the number of caches in the cache group 202 is at least one more than the number of low-speed interface channels.

[0027] The conditional command sequence 205 is used to control the idle high-speed interface 201 and low-speed interface 203 through control commands to execute data caching according to cache conditions. The cache conditions include a data write condition for writing data into the cache group 202 and a data read condition for reading data from the cache group 202. The data write condition is that a cache in the empty state exists, and the data...
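
For illustration only, the following is a minimal C sketch of how the components named above (cache group 202, status register 204, and the cache conditions carried by the conditional command sequence 205) might be represented in memory. All type names, the buffer size, and NUM_CACHES are assumptions for this sketch, not taken from the patent.

    /* Minimal sketch (assumed names and sizes), not the patented implementation. */

    #define NUM_CACHES 4   /* cache group 202: at least one more than the low-speed channels */

    typedef enum { CACHE_EMPTY, CACHE_FULL } cache_state_t;

    /* Status register 204: holds the state of every cache in the group. */
    typedef struct {
        cache_state_t state[NUM_CACHES];
    } status_register_t;

    /* Cache group 202: the actual data buffers. */
    typedef struct {
        unsigned char data[NUM_CACHES][4096];   /* buffer size is an assumption */
    } cache_group_t;

    /* Cache conditions carried by the conditional command sequence 205. */
    typedef enum {
        COND_WRITE,   /* data write condition: a cache in the EMPTY state exists */
        COND_READ     /* data read condition:  a cache in the FULL state exists  */
    } cache_condition_t;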

Example Embodiment

[0035] Example two

[0036] Based on the schematic structural diagram of the caching system shown in Figure 1, an embodiment of the present application provides a method for implementing data caching in the caching system shown in Figure 1. The method includes: when there is an idle input channel and a cache in the empty state, writing the data to be transmitted into the empty cache through the idle input channel; and when there is an idle output channel and a cache in the full state, reading data from the full cache through the idle output channel.

[0037] For example, channel H is the high-speed interface, channels A, B, and C are low-speed interfaces, and there are four caches, 0, 1, 2, and 3, in the cache system. When data is transmitted from the high-speed interface to the low-speed interfaces, the input channel is channel H and the output channels are channels A, B, and C. When a piece of data has been written to cache 0 through channel H, channel H is idle at this t...
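
As a hedged illustration of the scheduling described in this example, and reusing the assumed types from the sketch in Example one, the following C fragment shows one way the write and read conditions could drive transfers from channel H to channels A, B, and C through four caches. The helper names (find_cache, transfer_pass) are hypothetical.

    #include <stdbool.h>

    /* Return the index of the first cache in the wanted state, or -1 if none. */
    static int find_cache(const status_register_t *sr, cache_state_t wanted)
    {
        for (int i = 0; i < NUM_CACHES; i++)
            if (sr->state[i] == wanted)
                return i;
        return -1;
    }

    /* One scheduling pass: the idle input channel fills an empty cache,
       and each idle output channel drains a full cache. */
    void transfer_pass(status_register_t *sr,
                       bool input_idle, const bool output_idle[3])
    {
        if (input_idle) {
            int c = find_cache(sr, CACHE_EMPTY);
            if (c >= 0) {
                /* ... write the data to be transmitted into cache c via channel H ... */
                sr->state[c] = CACHE_FULL;        /* update the cache status */
            }
        }
        for (int ch = 0; ch < 3; ch++) {          /* output channels A, B, C */
            if (!output_idle[ch])
                continue;
            int c = find_cache(sr, CACHE_FULL);
            if (c < 0)
                break;                            /* no full cache: read condition not met */
            /* ... read the data from cache c via this output channel ... */
            sr->state[c] = CACHE_EMPTY;           /* update the cache status */
        }
    }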

Example Embodiment

[0048] Example three

[0049] The following describes the data caching process with the channel as the executing entity; the channel in this embodiment may be either an input channel or an output channel. See Figure 4, which is a flowchart of an embodiment of a data caching method of this application, including the following steps:

[0050] Step 401: The channel is started by the CPU;

[0051] Step 402: When the channel is in an idle state, the channel reads the cache condition from the conditional command sequence under the control of the control command;

[0052] For an input channel, the condition read is the data write condition; for an output channel, it is the data read condition.

[0053] Step 403: The channel reads the status of each cache from the status register under the control of the control command;

[0054] Step 404: The channel determines whether the current cache condition is satisfied according to the status of each cache; if yes, go to step 405; if not, go back to step ...
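
As a rough illustration of steps 401 through 404, again reusing the assumed types from the earlier sketches, the following C fragment shows a channel-side check loop. The channel_t type and the helper condition_met are hypothetical and not part of the patent.

    #include <stdbool.h>

    typedef struct {
        bool is_input;   /* input channel uses the data write condition,
                            output channel uses the data read condition */
    } channel_t;

    /* Step 404: check the status of each cache against the current condition. */
    static bool condition_met(const channel_t *ch, const status_register_t *sr)
    {
        cache_state_t wanted = ch->is_input ? CACHE_EMPTY : CACHE_FULL;
        for (int i = 0; i < NUM_CACHES; i++)
            if (sr->state[i] == wanted)
                return true;
        return false;
    }

    void channel_run(channel_t *ch, status_register_t *sr)
    {
        /* Step 401: the channel is started by the CPU (the caller). */
        while (1) {
            /* Step 402: when idle, read the cache condition from the conditional
               command sequence (folded into ch->is_input in this sketch). */
            /* Step 403: read the status of each cache from the status register. */
            if (condition_met(ch, sr)) {
                /* Step 405 (not shown in the excerpt): execute data caching
                   and update the cache status. */
                break;   /* stop the sketch after one successful check */
            }
            /* Condition not met: go back and re-check (return to step 402). */
        }
    }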



Abstract

The embodiments of the invention disclose a caching system and a data caching method. The system comprises a high-speed interface, a cache group, a low-speed interface, a status register, and a conditional command sequence. The conditional command sequence controls the idle high-speed interface and low-speed interface through control commands to execute data caching according to caching conditions; the caching conditions comprise a data write condition and a data read condition, where the data write condition is that a cache in the empty state exists and the data read condition is that a cache in the full state exists. When idle, the high-speed interface and the low-speed interface read the caching conditions and the cache statuses, judge whether the caching conditions are met according to the cache statuses, and, when the conditions are met, execute data caching while updating the cache statuses. The status register stores the status of each cache used by the high-speed interface and the low-speed interface. According to the embodiments of the invention, the data transmission rate between the high-speed interface and the low-speed interface can be improved.

Description

Technical field

[0001] The present application relates to the technical field of data storage, and in particular to a caching system and a data caching method.

Background technique

[0002] When data is transmitted between two modules whose read and write speeds do not match, or between two modules working in different clock domains, a buffer circuit is usually used to buffer the transmitted data so that the slower module obtains the highest possible data transfer rate. For example, when a computer with a high-speed interface accesses a low-speed device such as an external memory or a printer through a low-speed interface, a cache device needs to be placed between the high-speed device and the low-speed device to buffer the data transferred between them, because the read and write speeds of the two interfaces do not match.

[0003] Generally, data caching methods include ping-pong caching and circular caching. Among them, the circular cache device adopts a plurality...
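
Purely as background context, and not the system claimed by this application, the following is a minimal C sketch of a ping-pong (double) buffer of the kind mentioned above; all names and sizes are assumptions.

    #define PP_BUF_SIZE 4096

    typedef struct {
        unsigned char buf[2][PP_BUF_SIZE];
        int write_idx;   /* index of the buffer the producer is currently filling */
    } pingpong_t;

    /* Swap roles: the just-filled buffer becomes readable and the other one is
       reused for writing, so producer and consumer never touch the same buffer. */
    void pingpong_swap(pingpong_t *pp)
    {
        pp->write_idx ^= 1;
    }

    /* Buffer the consumer may read while the producer fills the other side. */
    const unsigned char *pingpong_read_side(const pingpong_t *pp)
    {
        return pp->buf[pp->write_idx ^ 1];
    }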

Claims


Application Information

IPC(8): G06F12/08, G06F12/0866, G06F12/0877
Inventor 罗盛裕
Owner NETAK TECH KO LTD