Distributed cache live migration

A cache and cache-element technology, applied to memory systems, memory-architecture access/allocation, instruments, and the like, which addresses the problem of high latency of data operations and reduces bandwidth overhead.

Active Publication Date: 2018-06-08
HUAWEI TECH CO LTD

AI Technical Summary

Problems solved by technology

At that point, retrieving all data from memory results in high latency for data operations




Embodiment Construction

[0046] Figure 1 shows a first host 110A, also referred to as the original host, and a second host 110N, also referred to as the destination host.

[0047] The first host 110A is shown with VM 120A, but it may include one or more other VMs (not depicted). In addition, the first host 110A includes a first cache 130A and a first memory 140A. The first cache 130A may be distributed among multiple hosts, and it may hold not only data related to VM 120A but also data related to one or more other VMs. Data from the first memory 140A may be cached in the first cache 130A; the first cache 130A may also contain data cached by a different cache and/or data associated with a VM other than VM 120A. Figure 1 also shows global memory 150, which may be local or remote memory of the first host 110A.
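As an illustration only, the arrangement described above can be modelled roughly as follows. This is a minimal single-process sketch: the class and field names (Host, CacheEntry, read_block) are hypothetical and not taken from the patent, and plain dictionaries stand in for the cache and memory of a host.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Set

@dataclass
class CacheEntry:
    vm_id: str   # cache data is tagged with the VM it relates to
    data: bytes

@dataclass
class Host:
    name: str
    vms: Set[str] = field(default_factory=set)                  # e.g. {"VM 120A"}
    cache: Dict[int, CacheEntry] = field(default_factory=dict)  # first cache 130A
    memory: Dict[int, bytes] = field(default_factory=dict)      # first memory 140A

    def read_block(self, block_id: int) -> Optional[bytes]:
        # Serve from the local cache when possible, otherwise fall back to memory.
        entry = self.cache.get(block_id)
        if entry is not None:
            return entry.data
        return self.memory.get(block_id)

host_110a = Host(name="110A", vms={"VM 120A"})   # original host
host_110n = Host(name="110N")                    # destination host
```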

[0048] Caching media used for the caching discussed herein include, but are not limited to, SSD, RAM, and flash memory.

[0049] The global storage 150 may be a cluster of storage arrays, such as...



Abstract

A system, comprising: a first host comprising a first cache and associated with a virtual machine, VM; and a second host comprising a second cache; wherein the first host is adapted to send cache data of the first cache to the second host in response to a notification, said cache data associated with the VM and said notification indicating that the VM is to be migrated from the first host to the second host, and wherein the first host is adapted to send write operations associated with the VM to the second host in response to receiving the notification; and wherein the second host is adapted to apply, in response to receiving the notification, read operations associated with cache data of the VM to the first cache if said cache data is not present in the second cache, and to apply write operations associated with cache data of the VM to the second cache.
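The behaviour summarised in the abstract can be sketched in a few lines of Python. This is a minimal sketch under simplified assumptions: the class and method names (CacheHost, on_migration_notification, and so on) are hypothetical, and direct method calls stand in for the messages exchanged between the two hosts.

```python
class CacheHost:
    """Toy model of a host holding cache data for a migrating VM."""

    def __init__(self, name):
        self.name = name
        self.cache = {}                # block_id -> data for the VM
        self.forward_writes_to = None  # set on the source host once notified

    # Source-host behaviour: push cache data and redirect subsequent writes.
    def on_migration_notification(self, destination):
        for block_id, data in list(self.cache.items()):
            destination.receive_cache_data(block_id, data)
        self.forward_writes_to = destination

    def write(self, block_id, data):
        # After the notification, write operations for the VM are sent to the
        # destination host; otherwise they are applied to the local cache.
        if self.forward_writes_to is not None:
            self.forward_writes_to.write(block_id, data)
        else:
            self.cache[block_id] = data

    # Destination-host behaviour: absorb pushed data, serve reads with a
    # fallback to the first cache, and apply writes to the second cache.
    def receive_cache_data(self, block_id, data):
        self.cache[block_id] = data

    def read(self, block_id, source):
        if block_id in self.cache:
            return self.cache[block_id]
        return source.cache.get(block_id)

# Usage: warm a block on the source, migrate, then read it on the destination.
src, dst = CacheHost("110A"), CacheHost("110N")
src.cache[1] = b"warm block"
src.on_migration_notification(dst)
assert dst.read(1, src) == b"warm block"
```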

Description

Technical field

[0001] The present application relates to the field of distributed caching, and in particular to a method, device, and system for migrating a cache.

Background

[0002] Virtualization refers to the concept of abstracting the resources provided by computing devices. Applications, including operating systems, can run within virtual machines (VMs) that appear to those applications as a single physical host. Multiple VMs can exist on the same physical host. Virtualization therefore requires maintaining, for each VM, the illusion that it has exclusive access to the resources associated with it. Such resources are called virtual resources. A VM can be managed by a virtual machine manager (VMM) on the same host as the VM.

[0003] Resources existing in a virtualization environment may include virtual CPUs, virtual memory, virtual I/O devices, and the like. One aspect of virtualization is storage virtualization, where resources correspond to ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/0842; G06F9/455; G06F12/0866; G06F12/0868
CPC: G06F12/0842; G06F12/0866; G06F12/0868; G06F9/45558; G06F2212/151; G06F2009/4557; H04L67/5682; G06F12/084; G06F12/0891; G06F2009/45583; G06F2009/45595; H04L67/1097
Inventors: 伊戈尔·维亚切斯拉维奇·德鲁日宁; 米哈伊尔·叶夫根耶维奇·梅索夫; 米哈伊尔·弗勒尔维奇·泽恩科维奇
Owner HUAWEI TECH CO LTD