Management method for a computer cache system
A cache system and its management method, applied in the field of computer cache-system management. The invention addresses problems such as increased cache misses and reduced performance, and achieves the effects of reducing cache misses, ensuring stability, and being applicable to a wide range of systems.
Examples
Embodiment 1
[0038] As shown in Figures 1-3, the independent CPU chip 100 in the CPU system of the present invention integrates a CPU core 110, a second-level cache 130, a memory access controller (MMU) 140, and four memory channels. The CPU core 110 contains a CPU execution unit 116, a first-level instruction cache 112 (i.e., L1-I Cache), and a first-level data cache 114 (i.e., L1-D Cache). The second-level cache 130 exchanges data directly with the CPU core 110, and the four memory channels (memory channel one 152, memory channel two 154, memory channel three 156, and memory channel four 158) communicate with the memory access controller MMU 140 and accept its management instructions.
[0039] The memory access controller MMU 140 exchanges data with the instruction- and data-fill mechanisms of the CPU core 110. As shown in Figure 1, the first-level cache of the independent CPU chip 100 adopts a split storage structure for instructions and data: instructions are stored in the first-level...
Embodiment 2
[0048] As shown in Figures 4A and 5A, each cache line in this embodiment has a TAG storage area 450, a Data storage area 460, and four flag bits: a V flag 410, an H flag 420, an A flag 430, and a D flag 440. The V flag 410 indicates that the cache line is valid (Valid). The H flag 420 indicates that the cache line has been hit (Hit): when the cache line is initially loaded, the H flag 420 is set to zero, and it is set to 1 if the cache line is hit. The A flag 430 marks the cache line as having been allocated (Allocated) by the replacement algorithm; this flag reminds the replacement algorithm not to repeatedly allocate the same cache line for replacement. The D flag 440 indicates that the content of the cache line has been changed (Dirty); after the line is evicted from the cache, the changed content must be written back to memory.
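The flag transitions described in paragraph [0048] can be sketched as a C bitfield with a few helper routines. This is a minimal illustration, not the patent's implementation: the field names, the bitfield layout, and the 64-byte line size are assumptions.

```c
#include <stdint.h>

/* Sketch of a cache line with the four flag bits of Embodiment 2.
   Layout and sizes are illustrative assumptions. */
struct cache_line {
    uint64_t tag;        /* TAG storage area */
    uint8_t  data[64];   /* Data storage area (size assumed) */
    unsigned v : 1;      /* Valid: line holds usable content */
    unsigned h : 1;      /* Hit: zero on initial load, set to 1 on a hit */
    unsigned a : 1;      /* Allocated: chosen by the replacement algorithm,
                            so the same line is not allocated again */
    unsigned d : 1;      /* Dirty: changed content must be written back
                            to memory after eviction */
};

/* A newly loaded line is valid but not yet hit, allocated, or dirty. */
void on_load(struct cache_line *l, uint64_t tag) {
    l->tag = tag;
    l->v = 1;
    l->h = 0;
    l->a = 0;
    l->d = 0;
}

void on_hit(struct cache_line *l)   { l->h = 1; }   /* line was hit */
void on_write(struct cache_line *l) { l->d = 1; }   /* content changed */

/* Eviction requires a write-back only when the line is valid and dirty. */
int needs_writeback(const struct cache_line *l) {
    return l->v && l->d;
}
```

The bitfield keeps the four flags in a single machine word alongside the tag, mirroring how the identification bits sit next to the TAG and Data areas in the figure.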
[0049] Compared with Embodiment 1, this embodiment diffe...
Embodiment 3
[0060] As shown in Figures 6 and 7, the cache line in this embodiment has a TAG storage area 670, a Data storage area 680, and six flag bits: a V flag 610, an H flag 620, an A flag 630, a D flag 640, a P flag 650, and a U flag 660.
[0061] The V flag 610 indicates that the cache line is valid (Valid). The H flag 620 indicates that the cache line has been hit (Hit): when the cache line is initially loaded, the H flag 620 is set to zero, and it is set to 1 if the cache line is hit. The A flag 630 indicates that the cache line has been allocated (Allocated) by the replacement algorithm. The D flag 640 indicates that the content of the cache line has been changed (Dirty); after the line is evicted from the cache, the changed content must be written back to memory. If the P flag 650 is 1, the cache line holds prefetched (Prefetch) content; if it is zero, the cache line ...
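Embodiment 3 extends the four-flag line with the P and U bits. A sketch under the same assumptions as before follows; the excerpt is cut off before describing the U flag, so its semantics are left undocumented here, and only the prefetch-fill behavior stated above is modeled.

```c
#include <stdint.h>

/* Sketch of the six-flag cache line of Embodiment 3. V/H/A/D behave as
   in Embodiment 2; P marks prefetched content. The U flag appears in
   the excerpt without its description. Layout and sizes are assumed. */
struct cache_line6 {
    uint64_t tag;        /* TAG storage area */
    uint8_t  data[64];   /* Data storage area (size assumed) */
    unsigned v : 1;      /* Valid */
    unsigned h : 1;      /* Hit: zero on initial load, 1 after a hit */
    unsigned a : 1;      /* Allocated by the replacement algorithm */
    unsigned d : 1;      /* Dirty: write back to memory on eviction */
    unsigned p : 1;      /* Prefetch: 1 = prefetched content,
                            0 = demand-fetched content */
    unsigned u : 1;      /* U flag (semantics truncated in the source) */
};

/* A prefetched line arrives with P = 1 and, like any newly loaded
   line, starts with H = 0 because it has not yet been hit. */
void on_prefetch_fill(struct cache_line6 *l, uint64_t tag) {
    l->tag = tag;
    l->v = 1;
    l->h = 0;
    l->a = 0;
    l->d = 0;
    l->p = 1;
    l->u = 0;
}

/* A demand fill differs only in leaving P clear. */
void on_demand_fill(struct cache_line6 *l, uint64_t tag) {
    on_prefetch_fill(l, tag);
    l->p = 0;
}
```

Distinguishing prefetched lines from demand-fetched ones lets a replacement policy treat never-hit prefetches (P = 1, H = 0) differently from lines the program actually requested.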