
A method and storage device for managing cache memory

A cache and storage device technology, applied in memory systems, electrical digital data processing, instruments, etc. It solves the problems of sequential streams filling the cache, wasted cache space, and low cache utilization efficiency, and achieves the effect of ensuring efficient utilization of the cache.

Active Publication Date: 2016-12-07
HUAWEI TECH CO LTD

AI Technical Summary

Problems solved by technology

[0006] The above-mentioned prior art provides a method for managing data in the cache. In this prior art, however, the frequency at which the CLOCK pointer rotates determines how long data remains stored in the cache. When the accessed data belongs to a sequential stream (a sequential stream comprises two or more pieces of data with consecutive addresses on the hard disk), the data of the sequential stream quickly fills the entire cache, wasting cache space and resulting in low cache utilization efficiency.
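To make the prior-art behavior concrete, the following minimal sketch (not taken from the patent; the frame count, addresses, and class name are illustrative assumptions) shows a CLOCK-style replacement loop and how a sequential scan larger than the cache displaces every previously cached block:

```python
# Minimal CLOCK replacement sketch (illustrative only, not this patent's method).
# A scan of consecutive addresses larger than the cache claims every frame,
# evicting the previously cached "hot" blocks.

class ClockCache:
    def __init__(self, n_frames):
        self.frames = [None] * n_frames        # cached block addresses
        self.ref = [False] * n_frames          # reference bits
        self.hand = 0                          # the CLOCK pointer

    def access(self, addr):
        if addr in self.frames:                # hit: set the reference bit
            self.ref[self.frames.index(addr)] = True
            return "hit"
        # miss: rotate the pointer until a frame with a cleared reference bit is found
        while self.ref[self.hand]:
            self.ref[self.hand] = False
            self.hand = (self.hand + 1) % len(self.frames)
        self.frames[self.hand] = addr          # evict that frame and install the new block
        self.ref[self.hand] = True
        self.hand = (self.hand + 1) % len(self.frames)
        return "miss"

cache = ClockCache(4)
for addr in [10, 11, 10, 12]:                  # re-used "hot" blocks
    cache.access(addr)
for addr in range(100, 108):                   # a sequential stream of 8 blocks
    cache.access(addr)
print(cache.frames)                            # blocks 10, 11, 12 have all been evicted
```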




Detailed Description of Embodiments

[0035] In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings.

[0036] Figure 1 is a schematic flow chart of an embodiment of the method for managing a cache according to the present invention. In a specific implementation, the method for managing a cache in this embodiment of the present invention may be applied to a storage device, where the storage device includes a cache. As shown in Figure 1, the method of this embodiment of the present invention may include:

[0037] In step S110, the storage device determines whether the target data stored in the cache is data in a sequential stream.

[0038] In a specific implementation, the sequential stream in this embodiment of the present invention may include two or more pieces of data with consecutive addresses on a low-speed device such as a hard disk, and there are various methods for determining ...
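The paragraph above leaves the detection method open. As one possible illustration only (the threshold, class, and method names below are assumptions, not the patent's implementation), a storage device could judge data to belong to a sequential stream by checking whether recently requested addresses form a run of consecutive addresses on the hard disk:

```python
# Illustrative sequence-stream detection sketch (assumed, not the patent's method).
# A block is treated as sequential once it extends a run of at least SEQ_THRESHOLD
# consecutive addresses, matching the "two or more data with consecutive addresses"
# wording above.

SEQ_THRESHOLD = 2

class SequenceDetector:
    def __init__(self):
        self.last_addr = None
        self.run_length = 1

    def observe(self, addr):
        """Return True if `addr` is judged to be part of a sequential stream."""
        if self.last_addr is not None and addr == self.last_addr + 1:
            self.run_length += 1
        else:
            self.run_length = 1
        self.last_addr = addr
        return self.run_length >= SEQ_THRESHOLD

det = SequenceDetector()
print([det.observe(a) for a in [7, 8, 9, 42, 43]])
# -> [False, True, True, False, True]
```

In practice a storage device would likely need to track several interleaved streams at once; the single-run detector above is kept to the minimum needed to illustrate the idea.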



Abstract

An embodiment of the invention discloses a method for managing a cache and a storage device. The method is applied to the storage device, and the storage device comprises the cache. The method comprises: the storage device determines whether target data stored in the cache is data in a sequential stream; when determining that the target data is data in a sequential stream, the storage device writes the target data into a data elimination queue in the cache; and the cache eliminates the target data written into the data elimination queue according to a first-in first-out principle. With the method and the storage device, sequential-stream data in the cache can be eliminated quickly, ensuring efficient utilization of the cache.
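A minimal sketch of the flow described in the abstract, under assumed data structures (the class, queue, and fallback eviction policy below are illustrative, not the patented implementation): data judged to belong to a sequential stream is written into a dedicated elimination queue and eliminated first-in first-out, so scan data leaves the cache quickly while other data remains under the normal replacement policy.

```python
# Sketch of the abstract's flow (assumed structures, not the patented implementation).
from collections import OrderedDict, deque

class Cache:
    """Toy cache in which sequential-stream blocks are queued for FIFO elimination."""

    def __init__(self, capacity, is_sequential):
        self.capacity = capacity
        self.is_sequential = is_sequential   # callable(addr) -> bool, assumed classifier
        self.data = OrderedDict()            # addr -> block payload
        self.elim_queue = deque()            # FIFO data elimination queue

    def put(self, addr, block):
        self.data[addr] = block
        if self.is_sequential(addr):         # determine whether the data is sequential
            self.elim_queue.append(addr)     # write it into the data elimination queue
        while len(self.data) > self.capacity:
            self._evict()

    def _evict(self):
        # eliminate queued sequential-stream data first-in first-out
        while self.elim_queue:
            victim = self.elim_queue.popleft()
            if victim in self.data:
                del self.data[victim]
                return
        # fallback for non-sequential data (placeholder: evict the oldest insertion)
        self.data.popitem(last=False)

# Toy usage: addresses >= 100 stand in for a detected sequential stream.
cache = Cache(capacity=4, is_sequential=lambda addr: addr >= 100)
for a in [1, 2, 100, 101, 102, 103]:
    cache.put(a, b"payload")
print(list(cache.data))                      # -> [1, 2, 102, 103]; scan blocks left first
```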

Description

Technical Field

[0001] The invention relates to the field of data storage, and in particular to a method and a storage device for managing a cache.

Background Technique

[0002] A cache is a first-level memory that sits between the main memory (for example, a hard disk) and the CPU. It is composed of static memory chips (Static RAM, SRAM); its capacity is relatively small, but its speed is much higher than that of the main memory and close to that of the CPU. The principle of locality is the theoretical basis of the cache, and it is divided into temporal locality and spatial locality. Temporal locality means that if data is accessed at time T0, then within a period of time starting from T0, the possibility of that data being accessed again is higher than it was before T0. Spatial locality means that if data is accessed at time T0, then within a period of time starting from T0, other data around that data are more likely to be accessed th...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/06; G06F12/0846; G06F12/0815; G06F12/0866
Inventor: 龚涛 (Gong Tao)
Owner: HUAWEI TECH CO LTD