
Memory management using dynamically allocated dirty mask space

A memory and memory-system technology, applied to memory-system architecture, memory access/allocation, and memory address allocation/relocation, addressing the problems of excessive memory-space consumption and degraded system performance.

Active Publication Date: 2017-10-31
QUALCOMM INC

AI Technical Summary

Problems solved by technology

Some of these write schemes may impair system performance or consume excessive memory space.




Detailed Description of Embodiments

[0018] One scheme for maintaining data coherency is sometimes referred to as a "read-assign-write" scheme. Upon receiving a write request, the processor first reads the target cache line from system memory, and then writes the selected data units (e.g., bytes) into the cache. Unwritten data units retain the same values as system memory. When a cache line is evicted from the cache, the entire cache line is sent to system memory; any unaffected data units are simply written back with their unchanged values. Under this scheme, every write to a cache line triggers a read from system memory, producing extra traffic to system memory and undesirable latency for write requests. In modern digital systems, memory bandwidth is often the bottleneck of system performance, and this may be especially true for graphics processing units (GPUs). This approach may therefore be undesirable due to the increased system-memory traffic.
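The read-assign-write scheme described above can be sketched in C as follows. This is a minimal illustrative model, not the patent's implementation: `sysmem`, `write_bytes`, and `evict` are hypothetical names, and a flat byte array stands in for system memory. The key point is that a partial write first fetches the whole line, so eviction can write back the entire line unconditionally.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

#define LINE_SIZE 16
#define MEM_SIZE  256

static uint8_t sysmem[MEM_SIZE];   /* toy backing store standing in for system memory */

typedef struct {
    uint32_t addr;                 /* line-aligned address this line caches */
    int      valid;
    uint8_t  data[LINE_SIZE];
} cache_line_t;

/* Read-assign-write: a partial write first reads the whole line from
 * system memory, so afterwards every byte of the line is coherent.
 * The cost is one system-memory read on every write miss. */
static void write_bytes(cache_line_t *line, uint32_t addr,
                        uint32_t offset, const uint8_t *src, uint32_t len)
{
    if (!line->valid || line->addr != addr) {
        memcpy(line->data, &sysmem[addr], LINE_SIZE);  /* the extra read traffic */
        line->addr  = addr;
        line->valid = 1;
    }
    memcpy(&line->data[offset], src, len);  /* overwrite only the written bytes */
}

/* Eviction writes the entire line back unconditionally; untouched bytes
 * carry the values just read from system memory, so coherency holds. */
static void evict(cache_line_t *line)
{
    if (line->valid)
        memcpy(&sysmem[line->addr], line->data, LINE_SIZE);
    line->valid = 0;
}
```

Note that the coherency guarantee comes entirely from the read on the write path; this is exactly the traffic the dirty-mask approach of the invention avoids.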

[0019] Another scheme to maintain data coherency involves...



Abstract

The present invention discloses systems and methods related to memory systems that include cache memories. The cache system includes a cache comprising a plurality of cache lines and a dirty buffer comprising a plurality of dirty masks. A cache controller is configured to assign one of the dirty masks to a cache line when a write to that cache line is not a full write of the cache line. Each dirty mask indicates the dirty state of the data units in the cache line to which it is assigned. The cache controller stores identification (ID) information that associates each dirty mask with the cache line to which it is assigned.
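The dynamic mask assignment and the ID association from the abstract can be sketched as a small C model. All names and sizes here are illustrative assumptions (the patent does not specify them): the dirty buffer holds fewer masks than there are cache lines, masks are handed out only on partial writes, and two small tables store the association in both directions.

```c
#include <assert.h>
#include <stdint.h>

#define NUM_LINES 8     /* cache lines (illustrative size) */
#define NUM_MASKS 4     /* the dirty buffer holds fewer masks than lines */
#define NO_MASK   0xFF

typedef struct {
    uint64_t mask[NUM_MASKS];     /* one dirty bit per byte of a 64-byte line */
    uint8_t  owner[NUM_MASKS];    /* ID info: which line owns each mask */
    uint8_t  mask_of[NUM_LINES];  /* reverse lookup; NO_MASK if unassigned */
} dirty_buffer_t;

static void db_init(dirty_buffer_t *db)
{
    for (int i = 0; i < NUM_MASKS; i++) db->owner[i] = NO_MASK;
    for (int i = 0; i < NUM_LINES; i++) db->mask_of[i] = NO_MASK;
}

/* Assign a dirty mask to a line only when a write is partial; a full-line
 * write needs no mask because every byte is known to be dirty.  Returns
 * the mask index, or -1 when the dirty buffer is exhausted (a real
 * controller would then evict a masked line or fall back to another
 * scheme). */
static int db_assign(dirty_buffer_t *db, uint8_t line)
{
    if (db->mask_of[line] != NO_MASK)
        return db->mask_of[line];         /* line already holds a mask */
    for (int i = 0; i < NUM_MASKS; i++) {
        if (db->owner[i] == NO_MASK) {
            db->owner[i]      = line;     /* record the association ID */
            db->mask_of[line] = (uint8_t)i;
            db->mask[i]       = 0;        /* all bytes start clean */
            return i;
        }
    }
    return -1;
}
```

Because masks are allocated on demand rather than statically per line, the dirty buffer can be much smaller than one mask per cache line, which is the memory-space saving the abstract claims.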

Description

Technical field

[0001] The present invention relates to memory management, and more particularly to the management of cache memory.

Background

[0002] Cache memory, also called a cache, is used in a variety of data processing systems to speed up access to data. A byte-writable cache allows a client to write some bytes of a cache line while leaving other bytes unaffected. When writing to a byte-writable cache, it is important to maintain data coherency, and various byte-writable cache write schemes are available for doing so. Some of these schemes may impair system performance or consume excessive memory space.

Summary of the invention

[0003] In an example, the dirty buffer may include a plurality of dirty masks, each assigned to a corresponding cache line when a write to that cache line is not a complete write of the cache line. In an example, the dirty buffer may be part of a cache memory. In other examples, it may be separate from the c...
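The byte-writable write path that the dirty mask enables can be sketched as follows. This is a minimal model under assumed names and sizes (`store`, `writeback`, a 16-byte line with a 16-bit mask): the write touches no system memory at all, and eviction writes back only the bytes whose dirty bits are set, so clean bytes in system memory are never clobbered.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

#define LINE_SIZE 16

static uint8_t sysmem[256];    /* toy system memory (illustrative) */

typedef struct {
    uint32_t addr;             /* line-aligned backing address */
    uint8_t  data[LINE_SIZE];
    uint16_t dirty;            /* one bit per byte; 0xFFFF == fully dirty */
} line_t;

/* Byte-writable store: update only the written bytes and set their dirty
 * bits, with no read of system memory on the write path. */
static void store(line_t *l, uint32_t off, const uint8_t *src, uint32_t n)
{
    memcpy(&l->data[off], src, n);
    for (uint32_t i = 0; i < n; i++)
        l->dirty |= (uint16_t)(1u << (off + i));
}

/* On eviction, write back only the bytes the mask marks dirty; bytes the
 * client never wrote are left untouched in system memory, preserving
 * coherency without the read-assign-write fetch. */
static void writeback(const line_t *l)
{
    for (uint32_t i = 0; i < LINE_SIZE; i++)
        if (l->dirty & (1u << i))
            sysmem[l->addr + i] = l->data[i];
}
```

In hardware the per-byte dirty bits typically drive byte-enable signals on the memory bus rather than a byte loop, but the coherency argument is the same.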

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/08
CPC: G06F12/0804; G06F12/0886; G06F12/0895; G06F12/0891; G06F2212/604
Inventors: 梁坚, 于春, 徐飞
Owner: QUALCOMM INC