
Memory prefetch system and method requiring latency and data value correlation

A technology involving memory and data values, applied in the field of computing systems, that can address problems such as limited operational efficiency

Active Publication Date: 2022-08-02
MICRON TECH INC

AI Technical Summary

Problems solved by technology

However, at least in some instances, the operational efficiency of a computing system may be limited by its architecture, which, for example, governs the sequence of operations performed in the computing system.

Embodiment Construction

[0021] The present invention provides techniques that facilitate improving the operational efficiency of computing systems, for example, by mitigating architectural features that may otherwise limit operational efficiency. In general, a computing system may include various subsystems, such as a processing subsystem and/or a memory subsystem. In particular, a processing subsystem may include processing circuitry, e.g., implemented in one or more processors and/or one or more processor cores. A memory subsystem may include one or more memory devices (e.g., chips or integrated circuits), e.g., implemented on a memory module, such as a dual in-line memory module (DIMM), and/or organized to implement one or more memory arrays (e.g., arrays of memory cells).

[0022] Generally, during operation of a computing system, processing circuitry implemented in its processing subsystem may perform various operations by executing corresponding instructions, e.g., to determine output data by perform...
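As a purely illustrative sketch (not taken from the patent text), the subsystem interaction described in paragraphs [0021]-[0022] can be modeled as processing circuitry that retrieves input data from a memory subsystem, performs an operation, and stores the output data back for later retrieval; all names and sizes below are hypothetical.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical memory subsystem: one memory array of 64-bit data blocks,
 * e.g., backed by memory devices on a DIMM. */
#define ARRAY_BLOCKS 256
static uint64_t memory_array[ARRAY_BLOCKS];

static uint64_t mem_read(uint32_t addr)              { return memory_array[addr % ARRAY_BLOCKS]; }
static void     mem_write(uint32_t addr, uint64_t v) { memory_array[addr % ARRAY_BLOCKS] = v; }

/* Hypothetical instruction executed by processing circuitry: determine output
 * data by performing an operation on input data retrieved from the memory
 * subsystem, then store the output data to enable subsequent retrieval. */
static void execute_add(uint32_t src_a, uint32_t src_b, uint32_t dst)
{
    uint64_t out = mem_read(src_a) + mem_read(src_b); /* operation on input data */
    mem_write(dst, out);                              /* output data stored back  */
}

int main(void)
{
    mem_write(0, 40);
    mem_write(1, 2);
    execute_add(0, 1, 2);
    printf("block 2 = %llu\n", (unsigned long long)mem_read(2));
    return 0;
}
```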

Abstract

This application relates to memory prefetch systems and methods that use latency and data value correlation. Memory control circuitry: instructs the memory array to read a block of data from, or write the block of data to, the location specified by a memory access request; determines memory access information, including a data value correlation parameter determined based on data bits indicative of an original data value in the data block and/or an inter-request latency correlation parameter determined based on a request time of the memory access request; predicts, based on the data value correlation parameter and/or the inter-request latency correlation parameter, that a read access to another location in the memory array will subsequently be requested by another memory access request; and instructs the memory array to output another block of data stored at the other location, before that other memory access request is received, to a different memory tier that provides faster data access speeds.
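To make the mechanism in the abstract concrete, the following is a minimal, hypothetical sketch rather than the patent's actual implementation: the memory control circuitry hashes the data bits of each accessed block into a data value signature, buckets the delay since the previous request, learns which address tends to follow each (signature, delay-bucket) pair, and on a match predicts the next read address so that block can be staged in a faster memory tier. All type, function, and parameter names (prefetch_ctl_t, on_memory_access, DELAY_BUCKET, etc.) are assumptions made for illustration.

```c
#include <stdint.h>
#include <stdbool.h>
#include <stddef.h>

#define TABLE_SIZE   1024
#define DELAY_BUCKET 64   /* cycles per inter-request-delay bucket (assumed granularity) */

/* One learned correlation: (data value signature, delay bucket) -> next address. */
typedef struct {
    uint32_t value_sig;     /* data value correlation parameter            */
    uint32_t delay_bucket;  /* inter-request latency correlation parameter */
    uint64_t next_addr;     /* address observed to be requested next       */
    bool     valid;
} prefetch_entry_t;

typedef struct {
    prefetch_entry_t table[TABLE_SIZE];
    uint32_t prev_sig;
    uint32_t prev_bucket;
    uint64_t prev_time;
    bool     have_prev;
} prefetch_ctl_t;

/* Hash the data bits of a block down to a small signature (assumed FNV-1a scheme). */
static uint32_t value_signature(const uint8_t *block, size_t len)
{
    uint32_t sig = 2166136261u;
    for (size_t i = 0; i < len; i++)
        sig = (sig ^ block[i]) * 16777619u;
    return sig;
}

static size_t table_slot(uint32_t sig, uint32_t bucket)
{
    return (sig ^ (bucket * 0x9E3779B1u)) % TABLE_SIZE;
}

/* Called by the memory control circuitry for every memory access request.
 * Returns true and sets *prefetch_addr when a subsequent read is predicted,
 * so the caller can stage that block in a faster memory tier ahead of demand. */
bool on_memory_access(prefetch_ctl_t *ctl, uint64_t addr, uint64_t request_time,
                      const uint8_t *block, size_t block_len, uint64_t *prefetch_addr)
{
    uint32_t sig    = value_signature(block, block_len);
    uint32_t bucket = ctl->have_prev
                      ? (uint32_t)((request_time - ctl->prev_time) / DELAY_BUCKET)
                      : 0;

    /* Learn: the previous (signature, delay bucket) pair was followed by this address. */
    if (ctl->have_prev) {
        prefetch_entry_t *e = &ctl->table[table_slot(ctl->prev_sig, ctl->prev_bucket)];
        e->value_sig    = ctl->prev_sig;
        e->delay_bucket = ctl->prev_bucket;
        e->next_addr    = addr;
        e->valid        = true;
    }

    /* Predict: if the current pair has been seen before, prefetch its successor. */
    bool predicted = false;
    const prefetch_entry_t *p = &ctl->table[table_slot(sig, bucket)];
    if (p->valid && p->value_sig == sig && p->delay_bucket == bucket) {
        *prefetch_addr = p->next_addr;
        predicted = true;
    }

    /* Remember this request for correlating the next one. */
    ctl->prev_sig    = sig;
    ctl->prev_bucket = bucket;
    ctl->prev_time   = request_time;
    ctl->have_prev   = true;
    return predicted;
}
```

In this sketch, the memory controller would call on_memory_access on a zero-initialized prefetch_ctl_t for every demand request and, when it returns true, copy the block at *prefetch_addr into a cache or other lower-latency tier before the predicted request arrives.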

Description

Technical Field

[0001] The present invention relates generally to computing systems, and more particularly, to memory interfaces implemented in computing systems.

Background

[0002] In general, a computing system includes a processing subsystem and a memory subsystem, which can store data accessible to processing circuitry of the processing subsystem. For example, in order to perform operations, processing circuitry may execute corresponding instructions retrieved from memory devices implemented in the memory subsystem. In some examples, data input to an operation may also be retrieved from a memory device. Additionally or alternatively, data output from an operation (e.g., produced by the operation) may be stored in a memory device, e.g., to enable subsequent retrieval. However, in at least some examples, the operational efficiency of a computing system may be limited by its architecture, which, for example, governs the sequence of operations performed in the comput...

Claims

Application Information

Patent Type & Authority: Patent (China)
IPC(8): G11C 7/10; G06F 12/0893
CPC: G11C 7/1057; G11C 7/1084; G06F 12/0893; G06F 3/061; G06F 3/0679; G06F 3/0653; G06F 3/0658; G06F 12/0862; G06F 2212/6026; G06F 2212/6024; G06F 3/0622; G06F 3/0688; G06F 3/064; G06F 3/0649; G06F 3/0611
Inventor: D. A. Roberts
Owner: MICRON TECH INC