
Memory data prefetch method and device

A memory data prefetching method and device in the field of memory technology, which solves the problems of increased memory-data return latency and increased memory power consumption, with the effects of reducing cross conflicts, shortening return latency, and lowering power consumption.

Active Publication Date: 2017-06-23
LOONGSON TECH CORP

AI Technical Summary

Problems solved by technology

[0004] With multiple sets of stream buffer structures, when the memory access requests of different data streams conflict with one another, the interleaved accesses of these conflicting requests to memory cause a large number of page-open and page-close operations. This increases the latency with which memory data is returned and also increases memory power consumption.



Embodiment Construction

[0056] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention, without creative effort, fall within the protection scope of the present invention.

[0057] Figure 2 is a flow chart of Embodiment 1 of the memory data prefetching method of the present invention. As shown in Figure 2, the method provided in this embodiment may be executed by a memory data prefetching device, and that device may be placed at a position close to the cache, for example, m...


Abstract

The invention provides a method and a device for prefetching memory data. The method includes: sending a prefetch instruction to a memory; writing N prefetch addresses, one into each entry of an N-entry first-in first-out (FIFO) queue; acquiring the N data items stored at the N prefetch addresses from the memory in a single access; and storing the N data items in the N entries of the FIFO queue, where N is greater than or equal to 1. Because a single prefetch instruction fills the entire N-entry FIFO queue, the cross conflicts among multiple fetch requests from multiple FIFO queues in the prior art are greatly reduced. Compared with the prior art, up to N-1 page conflicts are avoided, so the return latency of memory data is greatly shortened and memory power consumption is reduced.

Description

Technical field

[0001] The invention relates to computer storage technology, and in particular to a memory data prefetching method and device.

Background technique

[0002] Prefetching exploits the temporal and spatial locality of the processor's memory access behavior to predict the data that will be needed before a cache miss occurs, and to fetch it in advance to a location closer to the cache. On a miss, the prefetched addresses are checked first, and the data is returned directly on a hit. This greatly reduces memory access latency, reduces pipeline stalls caused by cache misses, and improves system performance.

[0003] To reduce the impact on processor hardware design, a stream prefetch module (stream buffer) is usually added to the memory controller. Its main component is a first-in first-out queue (FIFO, First In, First Out). A FIFO generally contains N entries, each holding an address, a valid bit, and the data of one ca...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/02; G06F12/0862
CPC: Y02D10/00
Inventors: 黄帅, 王焕东, 陈新科, 陈厦
Owner: LOONGSON TECH CORP