
Cache management method and device

A cache management technology, applied in digital transmission systems, electrical components, transmission systems, etc. It addresses the problems of cache space going unused and cache space being wasted, with the effects of making effective use of cache space, improving utilization, and saving cache-management space.

Active Publication Date: 2015-01-14
NEW H3C TECH CO LTD

AI Technical Summary

Problems solved by technology

In this case, the buffer space of one channel can be used up while other channels still have free buffer space that cannot be used, which also wastes buffer space.




Detailed Description of Embodiments

[0021] The core idea of the embodiments of the present invention is to add a used-resource record module and a used-resource query/judgment module to the cache management module, and to address data blocks through a two-level address pointer. When writing a data block, it is written into a free small block according to the write-cache large-block address pointer and the in-block address pointer, one block at a time; when reading a data block, it is read according to the read-cache large-block address pointer and the in-block address pointer.
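The paragraph above can be sketched in code. This is a minimal illustrative model, not the patent's implementation: the class and method names, the per-large-block occupancy table standing in for the used-resource record module, and the linear free-slot search standing in for the used-resource query/judgment module are all assumptions made for the sketch.

```python
SMALL_BLOCKS_PER_LARGE = 4  # assumed second-level granularity for the sketch

class TwoLevelCache:
    """Illustrative two-level (large block, in-block) pointer cache."""

    def __init__(self, num_large_blocks):
        # Used-resource record: which small blocks are occupied.
        self.used = [[False] * SMALL_BLOCKS_PER_LARGE for _ in range(num_large_blocks)]
        self.data = [[None] * SMALL_BLOCKS_PER_LARGE for _ in range(num_large_blocks)]

    def find_free(self):
        # Used-resource query/judgment: locate the first unoccupied small block.
        for lb, row in enumerate(self.used):
            for sb, occupied in enumerate(row):
                if not occupied:
                    return lb, sb
        return None

    def write_block(self, data_block):
        # Write one data block into a free small block; return the two-level
        # pointer (large-block address, in-block address), or None if full.
        loc = self.find_free()
        if loc is None:
            return None
        lb, sb = loc
        self.used[lb][sb] = True
        self.data[lb][sb] = data_block
        return lb, sb

    def read_block(self, lb, sb):
        # Read by (large-block pointer, in-block pointer) and free the slot.
        block = self.data[lb][sb]
        self.used[lb][sb] = False
        self.data[lb][sb] = None
        return block
```

Because any free small block in any large block can be used, a burst on one channel cannot strand free space held by another channel, which is the utilization gain the summary claims.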

[0022] The sizes of the cache blocks and small blocks can be chosen according to actual needs, but the length of a cache block should be smaller than the maximum packet length. Assuming the maximum packet length is 16 Kbyte and the entire cache space is 8 Gbit, the cache-block granularity of the first-level division can be 32 Kbyte, and the small-block granularity of the se...
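The first-level figures given above imply the large-block count directly; the second-level granularity is truncated in the source, so only the first level is computed here. Binary units (Gbit = 2^30 bits, Kbyte = 1024 bytes) are assumed.

```python
# First-level division from the worked figures in paragraph [0022].
total_cache_bits = 8 * 2**30                 # 8 Gbit of cache space
total_cache_bytes = total_cache_bits // 8    # = 1 GByte
large_block_bytes = 32 * 1024                # 32 Kbyte large-block granularity

num_large_blocks = total_cache_bytes // large_block_bytes
print(num_large_blocks)  # 32768 large blocks in the first level
```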



Abstract

The invention provides a cache management method comprising: dividing a cache space into cache blocks and dividing each cache block into a plurality of small blocks, the length of each cache block being smaller than the maximum packet length; and dividing a data packet into a plurality of data blocks according to the length of the small blocks, and writing the divided data blocks one by one into unoccupied small blocks according to write-cache large-block address pointers corresponding to the cache blocks and in-block address pointers corresponding to the small blocks. The invention further provides a cache management device. By adopting the cache management method and device, the utilization of cache resources can be improved and cache-management resources can be saved.

Description

Technical Field

[0001] The invention relates to data caching technology, and in particular to a cache management method and device.

Background Technique

[0002] In logic design, it is often necessary to use dynamic random access memory (DRAM), static random access memory (SRAM), or the internal random access memory of a field-programmable gate array (FPGA) to cache data. How to manage this cache space is an important part of logic design. Usually, cache resources are managed in a first-in, first-out (FIFO) manner: packets are stored sequentially in the FIFO queue and then taken out of the FIFO queue sequentially.

[0003] To meet delay requirements, the FIFO generally adopts a tail-drop mechanism: on receiving a packet enqueue request, it first judges whether the FIFO has space to store one maximum-length packet. If so, the received data packet is placed in the cache; otherwise the packet is discarded. That is to say, as long as the remaining space ...
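The tail-drop mechanism described in paragraph [0003] can be sketched as follows. This is an illustrative model under assumptions of my own (the class name, a 16 Kbyte maximum packet taken from the later worked example, and byte-counting of occupancy), not the patent's circuit.

```python
MAX_PACKET = 16 * 1024  # assumed maximum packet length, in bytes

class TailDropFifo:
    """FIFO that drops at the tail unless room for one max-length packet remains."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.queue = []
        self.used = 0

    def enqueue(self, packet):
        # Admit only if a worst-case (maximum-length) packet would still fit;
        # otherwise tail-drop, even when this particular packet would fit.
        if self.capacity - self.used < MAX_PACKET:
            return False
        self.queue.append(packet)
        self.used += len(packet)
        return True

    def dequeue(self):
        packet = self.queue.pop(0)
        self.used -= len(packet)
        return packet
```

The sketch makes the drawback visible: a short packet is dropped whenever less than one maximum packet of space remains, so that residual space is wasted, which motivates the two-level block scheme of the invention.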

Claims


Application Information

Patent Type & Authority Patents(China)
IPC(8): H04L12/70; H04L29/06; H04L49/9015
Inventor 王彬
Owner NEW H3C TECH CO LTD