
Cache storage architecture

A cache storage architecture, applied in the field of memory technology, which solves problems such as cache lines being shuttled back and forth between cache memories and reduced cache working efficiency, achieving the effects of reducing communication blocking and improving working efficiency.

Active Publication Date: 2018-11-27
ZHUHAI JIELI TECH

AI Technical Summary

Problems solved by technology

[0005] Based on this, it is necessary to provide a cache storage architecture that addresses the following technical problem: the information access and synchronization required to maintain consistency between cache memories, when data addresses conflict during reading and writing, cause cache lines to be moved back and forth between the cache memories, reducing the working efficiency of the cache memory.
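The back-and-forth movement described above can be sketched in a few lines. The model below is illustrative only (the class and names are not from the patent): each core has a private cache, and a write by one core must invalidate the copies held by the others, so a shared line ping-pongs between caches.

```python
# Sketch (not from the patent text) of the "back and forth" problem the
# patent targets: with one private cache per core, every write must
# invalidate the other cores' copies, so a shared line ping-pongs.

class PrivateCache:
    def __init__(self):
        self.lines = {}                      # addr -> cached value

    def read(self, addr, memory):
        if addr not in self.lines:           # miss: fetch from memory
            self.lines[addr] = memory[addr]
        return self.lines[addr]

    def write(self, addr, value, memory, others):
        memory[addr] = value
        self.lines[addr] = value
        for other in others:                 # coherence traffic:
            other.lines.pop(addr, None)      # invalidate remote copies

memory = {0x100: 1}
c0, c1 = PrivateCache(), PrivateCache()
c0.read(0x100, memory)                       # both cores cache the line
c1.read(0x100, memory)
c1.write(0x100, 7, memory, [c0])             # core 1 writes...
assert 0x100 not in c0.lines                 # ...core 0's copy is gone
assert c0.read(0x100, memory) == 7           # core 0 re-fetches: ping-pong
```

With a single cache shared by all cores, as in this application, the invalidation step (and the traffic it generates) disappears, because there is only one copy of each line.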




Embodiment Construction

[0046] In order to make the purpose, technical solution and advantages of the present application clearer, the present application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present application, and are not intended to limit the present application.

[0047] Referring to figure 1, a structural schematic diagram of the cache storage architecture in an embodiment of the present invention: in this embodiment, the cache storage architecture includes multiple cores, a cache memory, and a physical memory; each core is connected to the cache memory, and the cache memory is connected to the physical memory in a set-associative mapping relationship.

[0048] In this embodiment, the cache memory generally consists of three parts: a content buffer, a tag buffer, and a management circuit. The content buffe...
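A minimal Python sketch of such an architecture, assuming an illustrative geometry (4 sets, 2 ways, 16-byte lines) that the patent does not specify: the tag buffer and content buffer are parallel arrays, the management circuit is the address-decomposition and lookup logic, and every core calls the same method on the same shared cache object, so no per-core copies exist and no coherence traffic is needed.

```python
# Minimal model of one cache shared by all cores: a lookup depends only
# on the physical address, not on which core asks.

NUM_SETS = 4        # illustrative geometry, not from the patent
WAYS = 2            # 2-way set-associative mapping
LINE_SIZE = 16      # bytes per cache line

class SharedCache:
    def __init__(self):
        # tag buffer: one tag slot per (set, way); None = invalid
        self.tags = [[None] * WAYS for _ in range(NUM_SETS)]
        # content buffer: the cached line data, parallel to the tags
        self.data = [[None] * WAYS for _ in range(NUM_SETS)]

    def _decompose(self, addr):
        # management circuit: split address into tag / set index / offset
        offset = addr % LINE_SIZE
        index = (addr // LINE_SIZE) % NUM_SETS
        tag = addr // (LINE_SIZE * NUM_SETS)
        return tag, index, offset

    def access(self, core_id, addr, memory):
        """Every core calls the same method on the same cache object."""
        tag, index, offset = self._decompose(addr)
        for way in range(WAYS):
            if self.tags[index][way] == tag:
                return self.data[index][way][offset]      # hit
        # miss: fill a free way (or way 0) with the line from memory
        way = self.tags[index].index(None) if None in self.tags[index] else 0
        base = addr - offset
        self.tags[index][way] = tag
        self.data[index][way] = memory[base:base + LINE_SIZE]
        return self.data[index][way][offset]

memory = bytes(range(256))          # stand-in physical memory
cache = SharedCache()
# Cores 0 and 1 read the same address through the one shared cache:
assert cache.access(0, 0x42, memory) == memory[0x42]   # miss, line filled
assert cache.access(1, 0x42, memory) == memory[0x42]   # hit on same line
```

Because both cores see the same tags and data, the second access hits the line the first one filled; there is nothing to synchronize.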



Abstract

This application relates to a cache storage architecture comprising multiple cores, a cache memory, and a physical memory. Each core is connected to the cache memory, and the cache memory is connected to the physical memory in a set-associative mapping. In this scheme, the multiple cores access the physical memory through the same cache memory, forming a unified cache: no core needs to distinguish its own cache from the caches of other cores, and no consistency protocol is needed to keep separate caches synchronized. This reduces the information access and synchronization communication associated with data consistency, thereby reducing communication blocking and improving the working efficiency of the cache memory.

Description

Technical Field

[0001] The present application relates to the technical field of integrated circuits, and in particular to a cache storage architecture.

Background

[0002] To bridge the mismatch between the operating speed of the central processing unit and the read/write speed of the large-capacity physical main memory, a cache memory (cache) is usually placed between the central processing unit and the physical main memory.

[0003] A cache memory generally consists of three parts: a content buffer (a random-access device that stores instruction or data content in the cache), a tag buffer (a random-access device that stores instruction or data tags in the cache), and a management circuit. The content buffer caches the instruction or data content of the physical memory and can be divided into a data content buffer and an instruction content buffer; the tag buffer records the main-memory address and other status information of ...
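The division of labor between the tag buffer and the management circuit rests on splitting a physical address into tag, set index, and byte offset fields. The bit-level split can be sketched as follows, using an illustrative geometry (64-byte lines, 128 sets) that the text does not specify:

```python
# How a management circuit splits a physical address for a
# set-associative lookup (illustrative: 64-byte lines and 128 sets
# give 6 offset bits, 7 index bits, and the remainder as the tag).

OFFSET_BITS = 6     # log2(line size)
INDEX_BITS = 7      # log2(number of sets)

def split_address(addr):
    offset = addr & ((1 << OFFSET_BITS) - 1)            # byte within line
    index = (addr >> OFFSET_BITS) & ((1 << INDEX_BITS) - 1)  # which set
    tag = addr >> (OFFSET_BITS + INDEX_BITS)            # line identity
    return tag, index, offset

tag, index, offset = split_address(0x1A2B3C)
# index selects the set to search, tag is compared against the tag
# buffer entries in that set, offset selects the byte within the line
assert offset == 0x3C
assert index == 0x2C
assert tag == 0xD1
```

The tag buffer stores only the tag field per line; the index is implicit in the line's position, which is what keeps the tag buffer small relative to the content buffer.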

Claims


Application Information

IPC(8): G06F12/1045
CPC: G06F12/1054
Inventor: 龙树生
Owner: ZHUHAI JIELI TECH