
A method, device and system for cross-page prefetching

A prefetching and cross-page technology, applied in the field of computers, which can solve the problems of a low prefetch hit rate of prefetching devices and low memory-access efficiency.

Active Publication Date: 2017-04-05
HUAWEI TECH CO LTD +1

AI Technical Summary

Problems solved by technology

In practice, however, 50%-70% of the physical pages mapped by consecutive virtual pages are themselves contiguous, which means the determined prefetch address would in fact be valid.
With the prior-art method, preventing invalid cross-page prefetches also prevents the cross-page prefetches that would have been valid, resulting in a low prefetch hit rate of the prefetching device, which in turn reduces the efficiency of accessing memory.

Method used


Examples


Embodiment 1

[0095] An embodiment of the present invention provides a cross-page prefetching method. As shown in Figure 2, the method includes:

[0096] 201. Receive a physical address miss indication message sent by the cache, where the indication message carries the mapped first physical address and contiguity information of the first physical page to which the first physical address belongs. The contiguity information of the first physical page is used to indicate whether the physical page mapped by the first virtual page and the physical page mapped by the next virtual page contiguous with the first virtual page are contiguous, and the first virtual page contains the first virtual address that maps to the first physical address.

[0097] It should be noted that, usually, when the processor fails to hit instructions or data in the cache, in order to improve the speed of reading instructions or data, the cache usually reads i...
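Although the excerpt does not give a concrete message layout, a minimal C sketch of the indication message of step 201 might look like the structure below; every type and field name here is an assumption for illustration, not taken from the patent.

#include <stdint.h>
#include <stdbool.h>

/* Hypothetical model of the physical address miss indication message (step 201). */
typedef struct {
    uint64_t first_phys_addr;        /* mapped first physical address that missed in the cache */
    bool     first_page_contiguous;  /* contiguity information of the first physical page:
                                        true if the physical page mapped by the first virtual
                                        page and the physical page mapped by the next virtual
                                        page are physically contiguous */
} phys_addr_miss_msg_t;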

Embodiment 2

[0166] An embodiment of the present invention provides a cross-page prefetching method, described below with reference to the cross-page prefetch system shown in Figure 7. The cross-page prefetch system includes a processor, a memory, a cache, a translation lookaside buffer (TLB), and a prefetching device.

[0167] To achieve cross-page prefetching, first, when the application program is loaded, the processor allocates storage space in memory for the program's instructions and data, as shown in Figure 8, including:

[0168] 801. The processor receives a first indication message requesting physical memory space, where the first indication message carries capacity information of the memory space.

[0169] 802. The processor allocates physical memory space and virtual memory space according to the capacity information of the memory space.

[0170] Specifically, when the application program is loaded, the processor first rece...
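As a rough sketch of steps 801-802, once the physical pages for a virtual range have been assigned, contiguity information could be recorded per virtual page. The array-based page map and the function name below are assumptions for illustration, not the patent's own structures.

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

typedef struct {
    uint64_t phys_page_num;   /* physical page number mapped by this virtual page */
    bool     contiguous;      /* true if the next virtual page maps to phys_page_num + 1 */
} page_map_entry_t;

/* Mark each virtual page whose successor maps to the immediately following physical page. */
static void record_contiguity(page_map_entry_t *map, size_t num_pages) {
    for (size_t i = 0; i + 1 < num_pages; i++) {
        map[i].contiguous = (map[i + 1].phys_page_num == map[i].phys_page_num + 1);
    }
    if (num_pages > 0) {
        map[num_pages - 1].contiguous = false;   /* no next virtual page in this range */
    }
}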

Embodiment 3

[0210] An embodiment of the present invention provides a prefetching device 1200. As shown in Figure 12, the prefetching device 1200 includes a receiving unit 1201, an obtaining unit 1202, and a prefetching unit 1203.

[0211] The receiving unit 1201 is configured to receive a physical address miss indication message sent by the cache, where the indication message carries the mapped first physical address and contiguity information of the first physical page to which the first physical address belongs. The contiguity information of the first physical page is used to indicate whether the physical page mapped by the first virtual page and the physical page mapped by the next virtual page contiguous with the first virtual page are contiguous, and the first virtual page includes the first virtual address that maps to the first physical address.

[0212] The obtaining unit 1202 is configured to obtain a prefetch address according to the first phy...
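A minimal sketch of how the three units of prefetching device 1200 could cooperate, assuming the same miss-message fields as in the sketch above, a 4 KiB page size, and a byte-granular step size; the function names and the printf placeholder are illustrative only.

#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

#define PAGE_SHIFT 12                      /* assume 4 KiB pages */

typedef struct {
    uint64_t first_phys_addr;              /* mapped first physical address */
    bool     first_page_contiguous;        /* contiguity info of the first physical page */
} phys_addr_miss_msg_t;

typedef struct {
    int64_t step_size;                     /* step size stored in the prefetching device */
} prefetch_device_t;

/* Obtaining unit 1202: prefetch address = first physical address + step size. */
static uint64_t obtain_prefetch_addr(const prefetch_device_t *dev,
                                     const phys_addr_miss_msg_t *msg) {
    return msg->first_phys_addr + dev->step_size;
}

/* Prefetching unit 1203: prefetch within the same page, or across the page boundary
   only when the first physical page is known to be contiguous with the next one. */
static void prefetch_unit(const phys_addr_miss_msg_t *msg, uint64_t addr) {
    bool same_page = (addr >> PAGE_SHIFT) == (msg->first_phys_addr >> PAGE_SHIFT);
    if (same_page || msg->first_page_contiguous) {
        printf("prefetch data at 0x%llx\n", (unsigned long long)addr);
    }
}

/* Receiving unit 1201: entry point invoked when the cache reports the miss. */
static void receiving_unit(prefetch_device_t *dev, const phys_addr_miss_msg_t *msg) {
    prefetch_unit(msg, obtain_prefetch_addr(dev, msg));
}

int main(void) {
    prefetch_device_t dev = { .step_size = 64 };
    phys_addr_miss_msg_t msg = { .first_phys_addr = 0x1FF0,
                                 .first_page_contiguous = true };
    receiving_unit(&dev, &msg);            /* crosses into the next page, but contiguous */
    return 0;
}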



Abstract

Embodiments of the present invention provide a cross-page prefetching method, apparatus, and system, which can improve a prefetching hit ratio of a prefetching device, and further improve efficiency of memory access. The method includes: receiving an indication message, sent by a cache, that a physical address is missing, where the indication message carries a mapped-to first physical address and contiguity information of a first physical page to which the first physical address belongs; acquiring a prefetching address according to the first physical address and a step size that is stored in a prefetching device; and if a page number of a physical page to which the prefetching address belongs is different from a page number of the first physical page, and it is determined, according to the contiguity information of the first physical page, that the first physical page is contiguous, prefetching data at the prefetching address.
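For a concrete illustration with assumed numbers (4 KiB pages, step size of 64 bytes): if the first physical address is 0x1FC0, it lies on physical page 0x1, and the prefetching address 0x1FC0 + 64 = 0x2000 lies on page 0x2. Since the page numbers differ, the data at 0x2000 is prefetched only if the contiguity information indicates that the first physical page and the physical page mapped by the next virtual page are contiguous; otherwise the prefetch is dropped.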

Description

Technical field

[0001] The present invention relates to the field of computers, and in particular to a method, device and system for cross-page prefetching.

Background technique

[0002] As technology develops, the performance gap between the CPU (Central Processing Unit) and memory keeps growing. Many techniques have been developed to reduce the latency caused by accessing memory, such as pipelining and multithreading. Data prefetching is one of these techniques.

[0003] Data prefetching is a method of fetching subsequent data A from memory and storing it in the Cache in advance, before the processor accesses memory, so that when the CPU later accesses A, A is already in the Cache and can be accessed directly, reducing the delay the CPU would otherwise incur by going to memory to find A.

[0004] The addresses used in the instructions or data that the application calls during execut...
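Paragraph [0003] describes data prefetching in general terms. As a loose software analogy (using the GCC/Clang __builtin_prefetch intrinsic, with an arbitrary prefetch distance of 16 elements), a loop can request data ahead of use so that it is already resident in the cache when it is actually read; a hardware prefetcher does the equivalent transparently.

#include <stddef.h>

long sum_with_prefetch(const long *a, size_t n) {
    long sum = 0;
    for (size_t i = 0; i < n; i++) {
        if (i + 16 < n) {
            /* start fetching a[i + 16] into the cache now, so it is
               (hopefully) already there when the loop reaches it */
            __builtin_prefetch(&a[i + 16]);
        }
        sum += a[i];
    }
    return sum;
}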


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F12/0862
CPC: G06F12/0862; G06F2212/1016; G06F2212/6026; G06F2212/1021; G06F2212/602; G06F2212/608
Inventor: 张立新, 张柳航, 侯锐, 张科
Owner: HUAWEI TECH CO LTD