
Caching system and method of data caching

A caching system and a data caching method, applied in memory systems, electrical digital data processing, memory address/allocation/relocation, etc. The technology addresses the problems of a low transmission rate, varying error-correction times, and channel speeds that cannot remain equal, and achieves the effect of improving the data transmission rate.

Active Publication Date: 2012-03-21
NETAK TECH KO LTD

AI Technical Summary

Problems solved by technology

However, the channels on the low-speed interface side usually do not all run at the same speed. In a flash memory system, for example, whether the flash data contains errors, as well as the number and location of those errors, leads to different error-correction times, so the speed of a given channel also changes over time and the channels end up running at different speeds.
At this point, if the prior-art circular-buffer method for a multi-channel low-speed interface is still applied to data caching, the data transmission rate between the high-speed interface and the low-speed interface is low.



Examples


Embodiment 1

[0026] Refer to Figure 2, which is a schematic structural diagram of an embodiment of the caching system of the present application. As shown in Figure 2, the system comprises a high-speed interface 201, a cache group 202, a low-speed interface 203, a status register 204 and a conditional command sequence 205, wherein the high-speed interface 201 has one channel, the low-speed interface 203 has at least two channels, and the number of caches in the cache group 202 is at least one greater than the number of channels of the low-speed interface.

[0027] The conditional command sequence 205 is used to control, by means of control commands, the idle high-speed interface 201 and low-speed interface 203 to perform data caching according to the caching conditions. The caching conditions include a data writing condition for writing data in and a data reading condition for reading data out: the data writing condition is that a cache whose status is empty exists, and the data reading condition is that a cache whose status is full exists.

[0028] The high-speed interface 201 and the low-speed interface 203 are used to read the caching conditions and the cache statuses when they are in an idle state, to judge whether the caching conditions are met according to the cache statuses and, when the caching conditions are met, to execute data caching while updating the cache statuses; the status register 204 is used to store the status of each cache of the high-speed interface 201 and the low-speed interface 203.
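To make the cooperation between the cache group 202, the status register 204 and the caching conditions easier to follow, the following is a minimal software sketch of this embodiment (an illustration only, not the patented hardware; the names CacheStatus, StatusRegister, data_writing_condition and data_reading_condition are hypothetical): the status register is modelled as an array holding one status per cache, and the two caching conditions reduce to checking whether an empty or a full cache exists.

```python
from enum import Enum


class CacheStatus(Enum):
    """Status of one cache in the cache group, as kept in the status register."""
    EMPTY = 0   # may accept data from an idle input channel
    FULL = 1    # holds data waiting to be read by an idle output channel


class StatusRegister:
    """Software stand-in for the status register: one status per cache."""

    def __init__(self, num_caches):
        self.status = [CacheStatus.EMPTY] * num_caches

    def find_cache(self, wanted):
        """Return the index of the first cache whose status equals `wanted`, else None."""
        for index, status in enumerate(self.status):
            if status is wanted:
                return index
        return None


# The two caching conditions carried by the conditional command sequence:
def data_writing_condition(reg):
    """A cache whose status is empty exists."""
    return reg.find_cache(CacheStatus.EMPTY) is not None


def data_reading_condition(reg):
    """A cache whose status is full exists."""
    return reg.find_cache(CacheStatus.FULL) is not None
```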

Embodiment 2

[0036] According to the schematic structural diagram of the caching system shown in Figure 1 of the embodiment of the present application, a method for implementing data caching in the caching system shown in Figure 1 is provided. The method includes: when there is an idle input channel and a cache whose status is empty, writing the data to be transmitted into the empty cache through the idle input channel; and when there is an idle output channel and a cache whose status is full, reading data from the full cache through the idle output channel.

[0037] For example, channel H is the high-speed interface, channels A, B and C are the low-speed interface, and there are four caches 0, 1, 2 and 3 in the caching system. When data is transferred from the high-speed interface to the low-speed interface, the input channel is channel H and the output channels are channels A, B and C. When a piece of data is written into cache 0 through channel H, channel H is idle at this time, and if only cache 2...
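Continuing the hypothetical sketch from Embodiment 1, the walk-through below models this example: one high-speed input channel H, three low-speed output channels A, B and C, and four caches numbered 0 to 3. The CacheGroup class and its method names are assumptions for illustration; an idle channel simply acts on the first cache that satisfies its caching condition.

```python
class CacheGroup:
    """Hypothetical model of the cache group together with its status register."""

    def __init__(self, num_caches):
        self.data = [None] * num_caches
        self.reg = StatusRegister(num_caches)      # from the sketch in Embodiment 1

    def write_via_input_channel(self, channel, payload):
        """Idle input channel: write into a cache whose status is empty."""
        index = self.reg.find_cache(CacheStatus.EMPTY)
        if index is None:
            return None                             # data writing condition not met
        self.data[index] = payload
        self.reg.status[index] = CacheStatus.FULL   # update the cache status
        print(f"channel {channel}: wrote {payload!r} into cache {index}")
        return index

    def read_via_output_channel(self, channel):
        """Idle output channel: read from a cache whose status is full."""
        index = self.reg.find_cache(CacheStatus.FULL)
        if index is None:
            return None                             # data reading condition not met
        payload = self.data[index]
        self.data[index] = None
        self.reg.status[index] = CacheStatus.EMPTY  # update the cache status
        print(f"channel {channel}: read {payload!r} from cache {index}")
        return payload


group = CacheGroup(num_caches=4)                    # caches 0, 1, 2 and 3
group.write_via_input_channel("H", "block-0")       # H fills cache 0, then is idle again
group.write_via_input_channel("H", "block-1")       # H fills cache 1
group.read_via_output_channel("A")                  # A drains cache 0
group.read_via_output_channel("B")                  # B drains cache 1
```

Because any idle channel may use any cache that currently meets its condition, a single slow output channel does not hold up the others, which is the behaviour the embodiments aim for when the low-speed channels run at different speeds.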

Embodiment 3

[0049] The following describes the data caching process with the channel as the executing entity, where the channel in this embodiment may be an input channel or an output channel. Refer to Figure 4, which is a flowchart of an embodiment of the data caching method of the present application, comprising the following steps (a code sketch of the full loop follows the step list):

[0050] Step 401: the channel is started by the CPU;

[0051] Step 402: When the channel is in an idle state, read the caching condition from the conditional command sequence under the control of the control command;

[0052] Wherein, for the input channel, the data writing condition is read, and for the output channel, the data reading condition is read.

[0053] Step 403: the channel reads the status of each cache from the status register under the control of the control command;

[0054] Step 404: judge whether the current caching condition is satisfied according to the status of each cache; if yes, proceed to step 405; if not, return to step 403;

[0055...
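Since step 405 is truncated above, the sketch below fills in its effect from the Abstract (execute the data caching and update the cache status at the same time). It strings steps 401 to 405 together as a single hypothetical polling loop on top of the CacheGroup model from Embodiment 2; the function name run_channel and the max_polls bound are assumptions, not part of the patent.

```python
def run_channel(channel, group, is_input, get_payload=None, max_polls=100):
    """Hypothetical channel-side flow corresponding to steps 401-405 of Figure 4."""
    # Step 401: the channel is started (here, by calling this function).
    for _ in range(max_polls):
        # Step 402: the idle channel reads its caching condition from the
        # conditional command sequence -- the data writing condition for an
        # input channel, the data reading condition for an output channel.
        # Step 403: the channel reads the status of each cache from the
        # status register (find_cache walks the status array).
        # Step 404: judge whether the condition is satisfied; if not, poll again.
        if is_input:
            if not data_writing_condition(group.reg):
                continue
            # Step 405 (per the Abstract): execute the data caching and
            # update the cache status at the same time.
            group.write_via_input_channel(channel, get_payload())
        else:
            if not data_reading_condition(group.reg):
                continue
            group.read_via_output_channel(channel)
        return True                               # one transfer completed
    return False                                  # condition never met within max_polls


# Example: channel H writes one more block, then channel A reads it back out.
run_channel("H", group, is_input=True, get_payload=lambda: "block-2")
run_channel("A", group, is_input=False)
```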



Abstract

The embodiment of the invention discloses a caching system and a data caching method. The system comprises a high-speed interface, a cache group, a low-speed interface, a status register and a conditional command sequence. The conditional command sequence is used for controlling, by means of control commands, the idle high-speed interface and low-speed interface to execute data caching according to caching conditions, where the caching conditions comprise a data writing condition and a data reading condition: the data writing condition is that a cache in an empty status exists, and the data reading condition is that a cache in a full status exists. The high-speed interface and the low-speed interface are used for reading the caching conditions and the cache statuses when they are in an idle status, judging whether the caching conditions are met according to the cache statuses and, when the caching conditions are met, executing data caching while updating the cache statuses. The status register is used for storing the status of each cache of the high-speed interface and the low-speed interface. According to the embodiment of the invention, the data transmission rate between the high-speed interface and the low-speed interface can be improved.

Description

Technical Field
[0001] The present application relates to the technical field of data storage, and in particular to a caching system and a data caching method.
Background Technique
[0002] When data is transmitted between two modules whose read and write speeds do not match, or between two modules working in different clock domains, a buffer circuit is usually used to buffer the transmitted data, so that the slower module can obtain the highest possible data transfer rate. For example, when a computer with a high-speed interface accesses a low-speed device with a low-speed interface, such as an external memory or a printer, the read and write speeds of the high-speed interface and the low-speed interface do not match, so a cache device needs to be set between the high-speed device and the low-speed device to buffer the data transferred between the two.
[0003] Generally, data caching methods include ping-pong caching and circular caching. The circular cache device adopts a plurality...
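For contrast, the conventional circular (ring) buffer mentioned in paragraph [0003] can be sketched as follows. This is a generic ring-buffer illustration under the usual textbook definition, not text from the patent: the caches are filled and drained in a fixed cyclic order, which is why, once the low-speed channels run at different speeds, one slow channel holds up the whole rotation.

```python
class CircularCache:
    """Generic ring buffer: caches are reused in a fixed cyclic order."""

    def __init__(self, num_caches):
        self.slots = [None] * num_caches
        self.write_index = 0    # next cache the writer must use
        self.read_index = 0     # next cache the reader must use
        self.count = 0          # number of caches currently full

    def write(self, payload):
        if self.count == len(self.slots):
            return False                       # every cache is full: the writer waits
        self.slots[self.write_index] = payload
        self.write_index = (self.write_index + 1) % len(self.slots)
        self.count += 1
        return True

    def read(self):
        if self.count == 0:
            return None                        # every cache is empty: the reader waits
        payload = self.slots[self.read_index]
        self.read_index = (self.read_index + 1) % len(self.slots)
        self.count -= 1
        return payload
```

Because read_index advances strictly in order, the next reader cannot take a later cache even if it is already full, which is the restriction that, according to the problem statement above, lowers the transfer rate when the low-speed channels run at different speeds.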

Claims


Application Information

IPC(8): G06F12/08, G06F12/0866, G06F12/0877
Inventor 罗盛裕
Owner NETAK TECH KO LTD