
Management method for cache system of computer

A cache-system management technology, applied in the field of computer cache-system management, that addresses problems such as increased cache misses and reduced performance, and achieves the effects of fewer cache misses, guaranteed stability, and wide applicability.

Inactive Publication Date: 2015-10-28
GUANGZHOU YOUBEIDA INFORMATION TECH

Problems solved by technology

[0005] The purpose of the present invention is to overcome a defect of existing CPU caches, namely that useful data in the cache is replaced by prefetched content while the CPU is prefetching, which increases cache misses and reduces performance, and to provide a management method for a computer cache system that effectively remedies this defect.

Method used



Examples


Embodiment 1

[0035] As shown in Figures 1 to 3, the CPU chip 100 in the CPU system of the present invention integrates a CPU core 110, a second-level cache 130, a memory access controller (MMU) 140, and four memory channels. The CPU core 110 contains a CPU execution mechanism 116, a first-level instruction cache 112 (L1-I Cache), and a first-level data cache 114 (L1-D Cache). The second-level cache 130 exchanges data directly with the CPU core 110, and the four memory channels (memory channel one 152, memory channel two 154, memory channel three 156, and memory channel four 158) are connected to the memory access controller MMU 140 and accept its management instructions.

[0036] The memory access controller MMU 140 exchanges data with the instruction- and data-filling mechanisms of the CPU core 110. As shown in Figure 1, the first-level cache of the CPU chip 100 adopts a structure with separate storage of instructions and data: instructions are stored ...

Embodiment 2

[0045] As shown in Figures 4A and 5A, each cache line in this embodiment has a TAG storage area 450, a Data storage area 460, and four identification bits: the V flag 410, the H flag 420, the A flag 430, and the D flag 440. The V flag 410 indicates that the cache line is valid (Valid). The H flag 420 indicates that the cache line has been hit (Hit): when the cache line is initially loaded, the H flag 420 is set to zero; on a hit it is set to 1. The A flag 430 indicates that the cache line has been allocated by the replacement algorithm (Allocated); this flag reminds the replacement algorithm not to repeatedly allocate the same cache line for replacement. The D flag 440 indicates that the content of the cache line has been changed (Dirty); after the line is replaced out of the cache, the changed content must be written back to memory.
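The flag semantics described above can be sketched as follows. This is an illustrative model only, not code from the patent; the class, method names, and the 64-byte line size are assumptions for the example.

```python
# Illustrative model of a cache line with the four identification bits of
# Embodiment 2: V (Valid), H (Hit), A (Allocated), and D (Dirty).
from dataclasses import dataclass


@dataclass
class CacheLine:
    tag: int = 0        # TAG storage area
    data: bytes = b""   # Data storage area
    v: int = 0          # Valid: line holds usable content
    h: int = 0          # Hit: line has been hit since it was loaded
    a: int = 0          # Allocated: chosen by the replacement algorithm
    d: int = 0          # Dirty: content changed, must be written back

    def load(self, tag: int, data: bytes) -> None:
        """Fill the line; per the text, H is cleared on initial load."""
        self.tag, self.data = tag, data
        self.v, self.h, self.a, self.d = 1, 0, 0, 0

    def access(self, write: bool = False) -> None:
        """A hit sets H; a write additionally sets D."""
        self.h = 1
        if write:
            self.d = 1

    def evict(self) -> bool:
        """Invalidate the line; return True if a write-back is required."""
        needs_writeback = bool(self.v and self.d)
        self.v = 0
        return needs_writeback


line = CacheLine()
line.load(tag=0x1A2B, data=b"\x00" * 64)
line.access(write=True)
assert (line.v, line.h, line.d) == (1, 1, 1)
assert line.evict() is True  # dirty line must be written back to memory
```

The separation of H (history since load) from V (current validity) is what later lets a replacement policy distinguish lines that were ever useful from lines that were filled but never touched.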

[0046] Compared with Embodiment 1, the difference between this embodiment and Embodim...

Embodiment 3

[0057] As shown in Figures 6 and 7, the cache line in this embodiment has a TAG storage area 670, a Data storage area 680, and six identification bits: the V flag 610, the H flag 620, the A flag 630, the D flag 640, the P flag 650, and the U flag 660.

[0058] The V flag 610 indicates that the cache line is valid (Valid). The H flag 620 indicates that the cache line has been hit (Hit): when the cache line is initially loaded, the H flag 620 is set to zero; on a hit it is set to 1. The A flag 630 indicates that the cache line has been allocated by the replacement algorithm (Allocated). The D flag 640 indicates that the content of the cache line has been changed (Dirty); after the line is replaced out of the cache, the changed content must be written back to memory. If the P flag 650 is 1, the cache line holds prefetched content; if it is zero, the cache line holds demand-fetched content. The U flag 660 is set when the c...
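One plausible use of the P and H flags, given the defect stated in paragraph [0005], is to steer victim selection so that speculative prefetches cannot displace data that has proven useful. The sketch below is a hypothetical policy built from those flags, not the patent's actual replacement algorithm; the function name and the tie-breaking order are assumptions.

```python
# Hypothetical replacement-victim selection using the V, H, A, and P flags
# of Embodiment 3: prefer evicting prefetched lines that were never hit,
# so prefetched content does not displace demonstrably useful data.


def choose_victim(lines):
    """lines: list of dicts with 'v', 'h', 'a', 'p' flag bits.
    Returns the index of the preferred victim, or None if every line is
    already allocated for replacement (A flag set)."""
    candidates = [i for i, ln in enumerate(lines) if not ln["a"]]
    if not candidates:
        return None
    # 1st choice: invalid lines (V = 0) cost nothing to evict.
    for i in candidates:
        if not lines[i]["v"]:
            return i
    # 2nd choice: prefetched lines that were never hit (P = 1, H = 0).
    for i in candidates:
        if lines[i]["p"] and not lines[i]["h"]:
            return i
    # Fallback: any non-allocated candidate.
    return candidates[0]


ways = [
    {"v": 1, "h": 1, "a": 0, "p": 0},  # demand-fetched, hit
    {"v": 1, "h": 0, "a": 0, "p": 1},  # prefetched, never hit
    {"v": 1, "h": 1, "a": 0, "p": 1},  # prefetched, already hit
]
assert choose_victim(ways) == 1  # the unused prefetch is evicted first
```

Note how the A flag serves exactly the role paragraph [0045] describes: lines with A set are excluded from the candidate set so the algorithm cannot allocate the same line twice.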



Abstract

The invention discloses a management method for a computer cache system. The cache is composed of a plurality of cache lines, each of which comprises a plurality of data words. Each cache line is further divided into a plurality of subsets according to address, each subset corresponding to one or more data words, and each subset is provided with one or more local Sub-block identification bits. When operations such as cache query and cache fill are performed at the address granularity corresponding to the subsets of a cache line, the state and history information of the corresponding subset can be recorded at that granularity and stored in the subset's local Sub-block identification bits. The management method preserves the CPU system's ability to prefetch instructions and data: before instructions and data are actually used, they are fetched back to the CPU from the cache or other storage mechanisms according to requests issued in advance, and the operating speed is markedly improved.
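The subset scheme in the abstract can be illustrated with a small model. The layout below (8 words per line, 2 words per subset, one valid bit and one hit bit per subset) is an assumed example configuration, not dimensions taken from the claims.

```python
# Illustrative sketch of the abstract's sub-block scheme: a cache line
# holds several data words, grouped by address into subsets, and each
# subset carries its own local Sub-block identification bits recording
# state and history at subset granularity.

WORDS_PER_LINE = 8
WORDS_PER_SUBSET = 2
SUBSETS = WORDS_PER_LINE // WORDS_PER_SUBSET  # 4 subsets per line


class SubBlockedLine:
    def __init__(self):
        self.words = [None] * WORDS_PER_LINE
        # One valid bit and one hit bit per subset: the "local
        # Sub-block identification bits" of the abstract.
        self.sub_valid = [0] * SUBSETS
        self.sub_hit = [0] * SUBSETS

    @staticmethod
    def subset_of(word_index):
        return word_index // WORDS_PER_SUBSET

    def fill_subset(self, s, words):
        """Cache fill at subset granularity."""
        base = s * WORDS_PER_SUBSET
        self.words[base:base + WORDS_PER_SUBSET] = words
        self.sub_valid[s] = 1
        self.sub_hit[s] = 0

    def query(self, word_index):
        """Cache query at subset granularity: a hit requires the word's
        subset to be valid; the hit is recorded in that subset's
        history bit."""
        s = self.subset_of(word_index)
        if not self.sub_valid[s]:
            return None
        self.sub_hit[s] = 1
        return self.words[word_index]


line = SubBlockedLine()
line.fill_subset(1, ["w2", "w3"])
assert line.query(2) == "w2"         # word 2 lives in subset 1
assert line.query(0) is None         # subset 0 not filled: a miss
assert line.sub_hit == [0, 1, 0, 0]  # history kept per subset
```

The point of the finer granularity is that a fill or a miss affects only one subset's bits, so the rest of the line's state and history survive untouched.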

Description

[0001] This application is a divisional application of the invention patent with original application number 201210464057.8, filed on 2012.11.16, entitled "A method for managing a computer cache system".

Technical field

[0002] The invention relates to management algorithms for computer cache systems, and in particular to a management method for a CPU cache system.

Background technique

[0003] At present, computer systems incur large delays when accessing memory and other lower-level storage devices (such as hard disks and network devices). Taking memory access as an example, after the CPU issues a data or instruction access command, it takes about 100 nanoseconds to get the data back, which is equivalent to the time the CPU core needs to execute hundreds of instructions. Because the CPU system follows certain regularities in its use of instructions and data, we can design methods that exploit these regularities to guess the instructions and data t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08
Inventor: 邹阳, 王去非
Owner: GUANGZHOU YOUBEIDA INFORMATION TECH