
Multi-level cache security

A cache and cache-line technology, applied to memory systems, multi-programming devices, climate sustainability, etc., which can solve problems such as extended delays.

Pending Publication Date: 2022-01-04
TEXAS INSTR INC

AI Technical Summary

Problems solved by technology

Such delays can be lengthened when the cache is configured to protect certain areas of the cache memory from being read or altered by at least one CPU that would otherwise be allowed to access that cache line.



Examples


[0075] Example interfaces include CPU-DMC, CPU-PMC, DMC-UMC, PMC-UMC, SE-UMC, UMC-MSMC, MMU-UMC, and PMC-MMU interfaces. The CPU-DMC interface includes 512-bit vector reads, 512-bit vector writes, and 64-bit scalar writes. The CPU-PMC interface includes 512-bit reads. The DMC-UMC interface includes 512-bit read and 512-bit write interfaces for performing cache transactions, snoop transactions, L1 DSRAM DMA, and external MMR accesses (e.g., where each such interface can handle 2 data-phase transactions). The PMC-UMC interface includes 512-bit reads (which support 1 or 2 data-phase reads). The SE-UMC interface includes 512-bit reads (which support 1 or 2 data-phase reads). The UMC-MSMC interface includes 512-bit reads and 512-bit writes (with overlapping snoop and DMA transactions). The MMU-UMC interface includes page table lookups from the L2. The PMC-MMU interface includes μTLB misses to the MMU.
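For illustration only, the interface widths listed above can be collected into a small table. The C identifiers below (bus_if_t, ifs, the width fields) are hypothetical and not part of the described design; a width of 0 simply means no such datapath is listed in the paragraph.

```c
#include <stdio.h>

/* Hypothetical summary of the example interfaces in [0075].
 * Names and the table layout are illustrative, not from the source. */
typedef struct {
    const char *name;     /* interface, e.g. "CPU-DMC" */
    int read_bits;        /* read datapath width in bits (0 = none listed) */
    int write_bits;       /* write datapath width in bits (0 = none listed) */
} bus_if_t;

static const bus_if_t ifs[] = {
    { "CPU-DMC (vector)",       512, 512 },
    { "CPU-DMC (scalar write)",   0,  64 },
    { "CPU-PMC",                512,   0 },
    { "DMC-UMC",                512, 512 },
    { "PMC-UMC",                512,   0 },
    { "SE-UMC",                 512,   0 },
    { "UMC-MSMC",               512, 512 },
};

int main(void) {
    for (unsigned i = 0; i < sizeof ifs / sizeof ifs[0]; i++)
        printf("%-24s read: %3d  write: %3d\n",
               ifs[i].name, ifs[i].read_bits, ifs[i].write_bits);
    return 0;
}
```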

[0076] L1P 311 contains a 32KB L1P cache organized as 4-way set associative with a cache line size of 64 bytes...
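As a quick arithmetic check of that geometry, the sketch below (not from the patent; the address value is arbitrary) derives the set count and the offset/index/tag split for a 32KB, 4-way set-associative cache with 64-byte lines:

```c
#include <stdio.h>

/* Derive the address split implied by a 32 KB, 4-way set-associative
 * cache with 64-byte lines (the L1P geometry described in [0076]). */
int main(void) {
    const unsigned cache_bytes = 32u * 1024u;
    const unsigned line_bytes  = 64u;
    const unsigned ways        = 4u;

    unsigned lines = cache_bytes / line_bytes;   /* 512 lines */
    unsigned sets  = lines / ways;               /* 128 sets  */
    unsigned offset_bits = 0, index_bits = 0;
    for (unsigned v = line_bytes; v > 1; v >>= 1) offset_bits++;  /* 6 bits */
    for (unsigned v = sets;       v > 1; v >>= 1) index_bits++;   /* 7 bits */

    printf("lines=%u sets=%u offset_bits=%u index_bits=%u\n",
           lines, sets, offset_bits, index_bits);

    /* Example: split a 32-bit address into tag / set index / line offset. */
    unsigned addr   = 0x8001ABCDu;
    unsigned offset = addr & (line_bytes - 1);
    unsigned index  = (addr >> offset_bits) & (sets - 1);
    unsigned tag    = addr >> (offset_bits + index_bits);
    printf("addr=0x%08X -> tag=0x%X set=%u offset=%u\n", addr, tag, index, offset);
    return 0;
}
```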



Abstract

In described examples, a coherent memory system includes a central processing unit (CPU) and first and second level caches. The CPU is arranged to execute program instructions (1000) to manipulate data in at least a first or second secure context. Each of the first and second level caches stores (e.g., 1050) a secure code indicating which of the at least first or second secure contexts the data for a respective cache line was received under. The first and second level caches maintain coherency by comparing (1020) the secure codes of respective cache lines and executing (1030) a cache coherency operation in response.
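The C sketch below is one illustrative reading of the abstract, under stated assumptions: the field name secure_code, the per-line layout, and the choice to invalidate on a secure-code mismatch are guesses made for the example, not the claimed mechanism. It shows the general shape of the idea: each cached line carries the secure code of the context that filled it, and the coherency action taken depends on comparing those codes.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical model: a cache line tagged with the secure context
 * ("secure code") under which its data was received. */
typedef struct {
    uint32_t tag;
    uint8_t  secure_code;   /* e.g. 0 = non-secure, 1..N = secure contexts */
    bool     valid;
    bool     dirty;
    uint8_t  data[64];
} cache_line_t;

/* Illustrative coherency step (cf. 1020/1030 in the abstract): compare
 * the secure codes of corresponding first- and second-level lines and
 * run a coherency operation in response. The specific actions chosen
 * here (drop on mismatch, allow writeback on match) are assumptions. */
void maintain_coherency(cache_line_t *l1_line, cache_line_t *l2_line)
{
    if (!l1_line->valid || !l2_line->valid)
        return;

    if (l1_line->secure_code != l2_line->secure_code) {
        /* Secure-context mismatch: do not expose the data; invalidate it. */
        l1_line->valid = false;
        l1_line->dirty = false;
    } else if (l1_line->dirty) {
        /* Same secure context: the normal writeback path may proceed. */
        /* writeback(l1_line); */
        l1_line->dirty = false;
    }
}

int main(void) {
    cache_line_t l1 = { .tag = 0x40, .secure_code = 1, .valid = true, .dirty = true };
    cache_line_t l2 = { .tag = 0x40, .secure_code = 2, .valid = true, .dirty = false };
    maintain_coherency(&l1, &l2);   /* mismatch: the L1 copy is invalidated */
    return l1.valid ? 1 : 0;
}
```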

Description

Background

[0001] A processing device may be formed as part of an integrated circuit, such as part of a system on chip (SoC). In some examples, the SoC includes at least one central processing unit (CPU), where each CPU of the SoC is coupled to an integrated (e.g., shared) memory system. The memory system may include, for example, multi-level cache memory (e.g., static RAM (SRAM) formed on an integrated circuit of the SoC) and at least one main memory (e.g., dynamic RAM (DRAM) and/or DDR, which may be memory outside the integrated circuit of the SoC).

[0002] Increasingly complex memory architectures continue to present scalability challenges as more and more powerful CPUs are added (or coupled) to processing devices. When multiple CPUs share a common address space of a memory system, the scalability challenges persist and can become even greater. Portions of the common address space of the shared memory may include various levels of a coherent cache (e.g., where eac...


Application Information

IPC(8): G06F12/0897
CPC: G06F12/0811; G06F12/0828; G06F12/1081; G06F21/79; G06F12/128; G06F12/0864; G06F12/0831; G06F2212/1028; G06F2212/1024; G06F2212/1052; G06F12/1441; G06F12/1483; Y02D10/00; G06F9/467
Inventors: Abhijeet A. Chachad, D. M. Thompson, N. Bhoria
Owner: TEXAS INSTR INC