Share cache perception-based virtual machine scheduling method and device

A shared-cache-aware virtual machine scheduling technology, applied in the computer field, that can solve problems such as degraded system operating efficiency and mutual performance interference between virtual machines, achieving the effects of improving operating efficiency, reducing mutual performance interference, and facilitating the attainment of QoS goals.

Inactive Publication Date: 2014-09-17
HUAWEI TECH CO LTD


Problems solved by technology

[0009] Embodiments of the present invention provide a virtual machine scheduling method and device based on shared cache awareness, so as to mitigate, as far as possible, the problem that contention for shared caches causes virtual machines to interfere with one another's performance and thereby degrades the operating efficiency of the system.


Examples


Embodiment 1

[0052] An embodiment of the present invention provides a virtual machine scheduling method based on shared cache awareness.

[0053] The method is used in a computer system that includes a hardware layer, a virtual machine management layer running on the hardware layer, and a plurality of virtual machines running on the virtual machine management layer. The hardware layer includes a plurality of nodes, the virtual machines run on the plurality of nodes, each node includes at least one processor core (core), and each node has a shared cache. Generally, the shared cache is the last-level cache; that is, the last-level cache of each node serves as a shared cache and can be accessed by other nodes.

[0054] A cache is a small temporary store located between the CPU and main memory. Its capacity is much smaller than that of memory, but its access speed is much faster. The data in the cache is a small...
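The embodiment relies on knowing how much each virtual machine uses the shared (last-level) caches, but this excerpt does not specify a measurement mechanism. The following is a minimal illustrative sketch, not from the patent text: it assumes per-vCPU hardware counter samples (LLC references and misses) have already been collected, and reduces them to a single usage score per VM. The function name and the sampling format are assumptions.

```python
# Hypothetical sketch: condensing per-vCPU last-level-cache counter samples
# into one shared-cache usage score per virtual machine. The sampling
# interface and the scoring formula are illustrative assumptions, not the
# patent's specified method.

def vm_cache_usage(samples):
    """Aggregate per-vCPU LLC counter samples into a per-VM usage score.

    `samples` maps a VM id to a list of (llc_references, llc_misses)
    tuples, one tuple per vCPU sampling window.
    """
    usage = {}
    for vm, counters in samples.items():
        refs = sum(r for r, _ in counters)
        misses = sum(m for _, m in counters)
        # A simple usage metric: total references weighted by miss rate,
        # so cache-hungry VMs with poor locality score highest.
        usage[vm] = refs * (misses / refs) if refs else 0.0
    return usage
```

In practice such counters might come from hardware performance-monitoring events sampled by the hypervisor; the score above is only one plausible way to summarize them.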

Embodiment 2

[0106] In order to better implement the above solutions of the embodiments of the present invention, a related device for implementing the above solutions is also provided below.

[0107] Referring to Figure 7, an embodiment of the present invention provides a virtual machine scheduling device based on shared cache awareness, which is used in a computer system. The computer system includes a plurality of nodes, each node includes at least one processor core, the last-level cache of each node serves as a shared cache, and multiple virtual machines run on the multiple nodes. The device may include:

[0108] A determining module 740, configured to determine a plurality of virtual machines that need to be scheduled;

[0109] An acquisition module 710, configured to acquire usage of all shared caches by each virtual machine among the multiple virtual machines that need to be scheduled;

[0110] A clustering module 720, configured to cluster the multiple ...
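The excerpt is cut off before the clustering module is fully described, and the patent text visible here does not fix a clustering algorithm. As a hedged illustration only, the clustering step ([0110]) could look like the sketch below, which assumes each VM's shared-cache usage has already been reduced to a single number and groups VMs into categories of similar usage by simple rank-based bucketing:

```python
# A minimal sketch of the clustering step, assuming one shared-cache usage
# score per VM. Quantile-style bucketing by rank is an illustrative choice;
# the patent may use a different clustering method.

def cluster_vms(usage, k):
    """Group VM ids into k categories of similar shared-cache usage.

    `usage` maps a VM id to its usage score; returns a list of k lists,
    ordered from lightest to heaviest cache users.
    """
    ranked = sorted(usage, key=usage.get)            # light users first
    size, extra = divmod(len(ranked), k)
    clusters, start = [], 0
    for i in range(k):
        end = start + size + (1 if i < extra else 0)  # spread remainder
        clusters.append(ranked[start:end])
        start = end
    return clusters
```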

Embodiment 3

[0138] An embodiment of the present invention also provides a computer-readable medium containing computer-executable instructions, such that when a processor of a computer executes the computer-executable instructions, the computer carries out the method flow of the shared-cache-aware virtual machine scheduling method of Embodiment 1.



Abstract

The invention discloses a shared-cache-aware virtual machine scheduling method and device, aiming to solve the problem that resource competition for shared caches causes mutual performance interference between virtual machines and thereby degrades the operating efficiency of the system. In some feasible embodiments of the invention, the method comprises the steps of: determining multiple virtual machines that need to be scheduled; obtaining the usage of all the shared caches by each of the multiple virtual machines that need to be scheduled; clustering the multiple virtual machines that need to be scheduled into multiple categories according to each virtual machine's usage of all the shared caches; assigning the virtual machines in each category uniformly to multiple nodes in sequence, so that the shared-cache usage of the one or more virtual machines assigned to each node is matched; and scheduling the multiple virtual machines that need to be scheduled onto their respectively assigned nodes.
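The abstract's steps can be condensed into a short end-to-end sketch. This is an illustration under stated assumptions, not the patent's definitive implementation: per-VM shared-cache usage is taken as a single number, and "assigning virtual machines in each category uniformly to multiple nodes in sequence" is approximated by taking VMs in descending usage order and dealing them out round-robin, so every node receives a similar mix of heavy and light cache users:

```python
# A sketch of the full scheduling flow from the abstract. The input format
# and the round-robin distribution order are illustrative assumptions.

def schedule(usage, num_nodes):
    """Assign VMs to nodes so each node gets a similar mix of cache usage.

    `usage` maps a VM id to its shared-cache usage score; returns a dict
    mapping each VM id to a node index in [0, num_nodes).
    """
    ranked = sorted(usage, key=usage.get, reverse=True)  # heavy users first
    assignment = {}
    # Dealing VMs out in usage order means each node receives one VM from
    # each usage "category" in turn, matching per-node shared-cache load.
    for i, vm in enumerate(ranked):
        assignment[vm] = i % num_nodes
    return assignment
```

With this ordering, the heaviest and lightest cache users end up spread across nodes rather than piled onto one, which is the balancing effect the abstract describes.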

Description

Technical Field

[0001] The present invention relates to the field of computer technology, and in particular to a virtual machine scheduling method and device based on shared cache awareness.

Background

[0002] With the development of multi-core technology, virtualization technology, and cloud computing technology, multi-core processors are widely used in cloud computing environments. In a cloud computing scenario, virtualization allows multiple virtual machines to run simultaneously on a multi-core server, making full and flexible use of physical resources. However, when multiple virtual machines run on the same multi-core physical server at the same time, competition among the virtual machines for shared resources reduces the overall operating efficiency of the system, and the performance indexes of individual virtual machines suffer accordingly, so that the expected Quality of Service (QoS) targets cannot be met.

[0003] In a mu...


Application Information

IPC(8): G06F9/50; G06F9/455
Inventor 藏洪永
Owner HUAWEI TECH CO LTD