
Method, system and storage medium for managing data migration

A technique for managing data migration, applied to the input/output processes of data processing, memory systems, and electrical digital data processing, that addresses problems such as hindered computational operations and an increased number of cache misses.

Active Publication Date: 2020-12-04
GOOGLE LLC

AI Technical Summary

Problems solved by technology

This overfill and flush traffic can lead to a condition known as "thrashing", in which the number of cache misses increases dramatically and the time spent performing cache fills and flushes due to misses can outweigh the time spent performing the originally requested computational operations on the data set. Traditional cache replacement algorithms therefore have flaws that hinder computational operations.




Embodiment Construction

[0019] Overview

[0020] This document describes techniques for profiling cache replacement and devices that support profiling cache replacement. These techniques and devices can improve system performance beyond what traditional cache replacement techniques, such as the least recently used (LRU) algorithm, the most recently used (MRU) algorithm, the least frequently used (LFU) algorithm, and random replacement algorithms, achieve when managing data migration between main memory and cache memory. Performance is improved, at least in part, by reducing the number of times "thrashing" occurs in connection with migrating data between main memory and cache memory. The term "thrashing" refers to a condition, caused by overfill and flush traffic, that can occur when the size of the working set of data exceeds the size of the cache memory. Thrashing can cause a dramatic increase in the number of cache misses, such that the system takes longer to perform the cache fills and flushes resulting from those misses than to perform the originally requested computational operations on the data set.
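The thrashing condition described above can be illustrated with a minimal sketch (class and variable names are illustrative, not from the patent): when a working set is even one page larger than an LRU cache and is accessed cyclically, every access misses.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache used only to illustrate thrashing."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()
        self.misses = 0

    def access(self, page):
        if page in self.entries:
            self.entries.move_to_end(page)   # hit: mark most recently used
        else:
            self.misses += 1                 # miss: fill, evicting LRU if full
            if len(self.entries) >= self.capacity:
                self.entries.popitem(last=False)
            self.entries[page] = True

# A 5-page working set accessed cyclically against a 4-page cache:
# LRU evicts exactly the page that is needed next, so every access misses.
cache = LRUCache(capacity=4)
for _ in range(10):            # 10 passes over the working set
    for page in range(5):
        cache.access(page)
print(cache.misses)            # 50 accesses, all misses
```

This worst-case pattern is what the profiling approach aims to avoid by reducing unnecessary fills and flushes.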



Abstract

This document describes profiling cache replacement, a technique for managing data migration between main memory and cache memory to improve system performance. Unlike conventional cache replacement techniques, profiling cache replacement maintains, with a profiler, counters that count memory requests accessing pages maintained in cache memory and pages maintained in main memory. Based on the information gathered by the profiler (e.g., about memory access requests), a mover moves pages between main memory and cache memory. For example, the mover may swap highly requested pages of main memory (e.g., the most requested pages of main memory) with lightly requested pages of cache memory (e.g., the least requested pages of cache memory). The mover may do so, for example, when a counter indicates that the number of access requests for a highly requested page of main memory is greater than that for a lightly requested page of cache memory. So as not to block memory-user operations (e.g., client applications), the mover performs page swapping in the background. To this end, the mover is limited to exchanging pages at predetermined time intervals, such as every microsecond.
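The profiler-and-mover interaction described in the abstract can be sketched as follows. This is a hypothetical illustration under assumptions: `ProfilingCache`, `record_access`, and `maybe_swap` are invented names, and a real implementation would operate on hardware pages and run the mover on a timer rather than by direct call.

```python
from collections import Counter

class ProfilingCache:
    """Illustrative sketch: profiler counts per-page accesses; a mover
    swaps hot main-memory pages with cold cache pages."""
    def __init__(self, cache_pages, main_pages):
        self.cache = set(cache_pages)
        self.main = set(main_pages)
        self.counts = Counter()      # profiler: one counter per page

    def record_access(self, page):
        self.counts[page] += 1       # counted whether the page is cached or not

    def maybe_swap(self):
        """Mover: invoked at a predetermined interval, in the background."""
        hot = max(self.main, key=lambda p: self.counts[p])    # most requested in main
        cold = min(self.cache, key=lambda p: self.counts[p])  # least requested in cache
        if self.counts[hot] > self.counts[cold]:
            self.main.remove(hot); self.cache.remove(cold)
            self.cache.add(hot); self.main.add(cold)

mem = ProfilingCache(cache_pages={"A", "B"}, main_pages={"C", "D"})
for page in ["C", "C", "C", "A", "D"]:
    mem.record_access(page)
mem.maybe_swap()                     # "C" (3 accesses) displaces "B" (0 accesses)
print(sorted(mem.cache))             # ['A', 'C']
```

The comparison in `maybe_swap` mirrors the abstract's condition: a swap occurs only when a main-memory page's counter exceeds that of the least-requested cache page.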

Description

[0001] Related application

[0002] This application claims priority under 35 U.S.C. 119 to Provisional Application No. 62/293,688, filed February 10, 2016, entitled "Profiling Cache Replacement," the disclosure of which is incorporated herein by reference in its entirety.

Background technique

[0003] In computing, a cache is a block of memory used to temporarily store frequently accessed data so that future requests for cached data can be serviced faster than requests for uncached data. If the requested data is contained in the cache (a scenario known as a "cache hit"), the request can be serviced by simply reading from the cache, which is relatively faster than accessing the data from main memory. Conversely, if the requested data is not contained in the cache (a scenario known as a "cache miss"), the data is recomputed or, in traditional techniques, the cache is populated with the data from its original storage location, which is slower than simply reading the data from the cache.
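The hit/miss read path from the background section can be sketched minimally (function and variable names are illustrative, not from the patent): a hit is served from the cache, while a miss populates the cache from the slower original storage before returning the data.

```python
def read(address, cache, main_memory, stats):
    """Illustrative cache read path: serve hits from cache, fill on misses."""
    if address in cache:
        stats["hits"] += 1           # cache hit: fast path
        return cache[address]
    stats["misses"] += 1             # cache miss: fetch from main memory and fill
    value = main_memory[address]
    cache[address] = value
    return value

main_memory = {0x10: "alpha", 0x20: "beta"}
cache, stats = {}, {"hits": 0, "misses": 0}
read(0x10, cache, main_memory, stats)   # miss: fills the cache
read(0x10, cache, main_memory, stats)   # hit: served from the cache
print(stats)                            # {'hits': 1, 'misses': 1}
```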


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/122
CPC: G06F12/122; G06F12/0284; G06F12/08; G06F12/0859; G06F12/0868; G06F12/0897; G06F12/121; G06F12/123; G06F2212/1021; G06F2212/251; G06F2212/253; G06F2212/601; G06F12/12; G06F3/0605; G06F3/061; G06F3/064; G06F3/0647; G06F3/0665; G06F12/0804; G06F3/0673; G06F2212/1016; G06F2212/152
Inventor 张治中
Owner GOOGLE LLC