
System, method, and apparatus for a cache flush of a range of pages and TLB invalidation of a range of entries

A cache-line and TLB-entry technology, applied to memory systems, program control design, and memory architecture access/allocation, addressing problems such as cache misses

Status: Inactive · Publication Date: 2011-07-06
INTEL CORP
Cites: 3 · Cited by: 11

AI Technical Summary

Problems solved by technology

If the cache does not contain the requested data, a cache "miss" occurs.

Detailed Description of Embodiments

[0021] A technique for flushing a cache line operates on an operand that can be associated with a linear or virtual memory address. At runtime, the technique flushes the cache line associated with the operand from all caches in the coherency domain. For example, in a multiprocessor system, a given cache line is flushed from every level of the cache hierarchy (i.e., the coherency domain) in all microprocessors of the system, depending on the state of the line in each processor. The MESI (Modified, Exclusive, Shared, Invalid) protocol, a write-invalidate protocol, gives each cache line one of four states, managed by two MESI bits. If the cache line is in the "exclusive" or "shared" state, flushing is equivalent to invalidating the line. The "modified" state is handled differently: if the cache controller implements a write-back policy and writes data from the processor to its cache only on cache hits, the modified cache line contents must be written back to memory before the line is invalidated.
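
For orientation only, and not part of the patent, the following is a minimal C sketch of how a range of pages is flushed today with the existing per-line CLFLUSH instruction (via the _mm_clflush intrinsic), assuming 64-byte cache lines and 4 KiB pages. The single range-flush instruction described here would replace this per-line loop.

    /* Sketch: flush every cache line backing a range of pages using the
     * per-line CLFLUSH instruction. Assumes 64-byte lines and 4 KiB pages;
     * the range-flush instruction described above would do this in one instruction. */
    #include <stddef.h>
    #include <stdint.h>
    #include <emmintrin.h>      /* _mm_clflush, _mm_mfence */

    #define CACHE_LINE_SIZE 64
    #define PAGE_SIZE       4096

    static void flush_page_range(void *base, size_t num_pages)
    {
        uint8_t *p   = (uint8_t *)base;
        uint8_t *end = p + num_pages * PAGE_SIZE;

        for (; p < end; p += CACHE_LINE_SIZE)
            _mm_clflush(p);     /* evict/write back one line from the coherency domain */

        _mm_mfence();           /* order the flushes with subsequent loads and stores */
    }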


Abstract

Systems, methods, and apparatus for performing the flushing of a plurality of cache lines and/or the invalidation of a plurality of translation lookaside buffer (TLB) entries are described. In one such method for flushing a plurality of cache lines of a processor, a single instruction includes a first field indicating that the plurality of cache lines are to be flushed, and, in response to the single instruction, the plurality of cache lines of the processor are flushed.
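
For contrast with the range TLB invalidation described in the abstract, here is a minimal sketch, not taken from the patent, of how a range of TLB entries is invalidated on x86 today: one privileged INVLPG instruction per page. It assumes ring-0 (kernel) execution and GCC/Clang inline assembly; a range-invalidation instruction would collapse this loop into a single instruction.

    /* Sketch: invalidate the TLB entry for each page in a range with INVLPG.
     * Privileged; only valid in kernel (ring-0) code. */
    #include <stddef.h>
    #include <stdint.h>

    #define PAGE_SIZE 4096

    static inline void invlpg(void *addr)
    {
        __asm__ volatile("invlpg (%0)" : : "r"(addr) : "memory");
    }

    static void invalidate_tlb_range(void *base, size_t num_pages)
    {
        uint8_t *p = (uint8_t *)base;

        for (size_t i = 0; i < num_pages; i++, p += PAGE_SIZE)
            invlpg(p);          /* drop the translation for one page */
    }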

Description

Technical Field

[0001] Embodiments of the present invention relate generally to the field of information processing and, more specifically, to the field of cache and translation lookaside buffer (TLB) maintenance.

Background

[0002] A cache memory device is a small, fast memory that can be used to contain the most frequently accessed data from a larger, slower memory. Random access memory (RAM) provides large amounts of storage capacity at relatively low cost. Unfortunately, access to RAM is slow relative to the processing speed of modern microprocessors. Although the storage capacity of a cache memory may be relatively small, it provides high-speed access to the data stored in it.

[0003] A cache is managed in such a way that it stores the instructions, translations, or data that are most likely to be needed at a given time. A cache "hit" occurs when the cache is accessed and contains the requested data. Otherwise, if the cache does not contain the requested data, a cache "miss" occurs.
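
To make the "hit"/"miss" terminology concrete, here is a toy sketch, not from the patent, of the lookup a cache performs, assuming a direct-mapped cache of 256 lines of 64 bytes each; real caches implement this check in hardware and typically have more lines and multiple ways.

    /* Toy model of a direct-mapped cache lookup: the address selects a line
     * by its index bits, and the stored tag must match for a hit. */
    #include <stdbool.h>
    #include <stdint.h>

    #define NUM_LINES 256
    #define LINE_SIZE 64

    struct cache_line {
        bool     valid;
        uint64_t tag;
        uint8_t  data[LINE_SIZE];
    };

    static struct cache_line cache[NUM_LINES];

    static bool cache_lookup(uint64_t addr)
    {
        uint64_t index = (addr / LINE_SIZE) % NUM_LINES;            /* which line the address maps to */
        uint64_t tag   = addr / ((uint64_t)LINE_SIZE * NUM_LINES);  /* identifies the memory block */

        return cache[index].valid && cache[index].tag == tag;       /* hit iff valid and tags match */
    }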


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08
CPC: G06F12/0891; G06F12/1009; G06F12/08; G06F12/1027; G06F2212/1016; G06F2212/683; G06F9/30043; G06F12/10
Inventor: M. G. Dixon, S. D. Rodgers
Owner: INTEL CORP