Cache partitioning method and device

Applied in the field of Cache partitioning, the disclosed method and device solve problems such as memory-access delay, reduced memory-access performance, and interference between execution entities in Cache usage, achieving the effect of reducing interference and improving memory-access performance.

Active Publication Date: 2015-11-25
HUAWEI TECH CO LTD +1


Problems solved by technology

[0003] However, in some scenarios, such as system startup or device direct memory access (DMA), the operating system must allocate contiguous physical memory to an execution entity. Contiguous physical memory often spans multiple different Cache Sets, so an execution entity that requires contiguous physical memory is very likely to share one or more Cache Sets with other execution entities. In this case, the execution entities still interfere with each other's use of the Cache, causing memory-access delays and thereby reducing memory-access performance.
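To see why contiguous physical memory spans many Cache Sets, note that in a set-indexed cache the set index is derived directly from the physical address. The sketch below uses a hypothetical cache geometry (64-byte lines, 1024 sets); real values depend on the CPU.

```python
# Hypothetical geometry for illustration; not taken from the patent.
LINE_SIZE = 64    # bytes per cache line
NUM_SETS = 1024   # sets in the cache

def cache_set(phys_addr: int) -> int:
    """Set index = (address / line size) mod number of sets."""
    return (phys_addr // LINE_SIZE) % NUM_SETS

# A contiguous 1 MiB physical region touches every set, so any entity
# holding it would overlap with every other entity's Cache Sets:
base = 0x10000000
sets_touched = {cache_set(base + off) for off in range(0, 1 << 20, LINE_SIZE)}
print(len(sets_touched))  # 1024: the region spans all 1024 sets
```

This is exactly the situation the patent describes: a contiguous allocation larger than one set's span necessarily overlaps the sets used by other entities.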



Embodiment Construction

[0064] The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0065] The technical solution provided by the embodiments of the present invention is applied to a host whose operating system can flexibly allocate physical memory and Cache Sets (cache sets) to execution entities. The host may include a multi-virtual-machine operating system under a virtual machine system, a multi-core operating system on a many-core hardware platform, or a multi-process scheduling s...



Abstract

Disclosed are a Cache partitioning method and device, which relate to the field of electronic information technology, and can flexibly allocate physical memories and Cache Sets for executive entities, thereby reducing interference generated by a plurality of executive entities in the usage of a Cache and improving the performance of memory access. The method comprises: allocating, by an operating system, a physical memory to an executive entity (101); and selecting one or more unoccupied Cache Sets from Cache Sets included in a host, caching data from the physical memory allocated to the executive entity into the selected Cache Set, and establishing a correlation between the physical memory allocated to the executive entity and the selected Cache Set (102). The method is suitable for a scenario where an appropriate Cache Set is allocated to an executive entity.
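The two steps of the abstract (101: allocate physical memory; 102: select unoccupied Cache Sets and record the correlation) can be sketched as follows. This is a minimal illustration with hypothetical names (`CachePartitioner`, `allocate`), not the patent's actual implementation.

```python
# Sketch of the claimed two-step flow; all names are hypothetical.
class CachePartitioner:
    def __init__(self, num_sets: int):
        self.free_sets = set(range(num_sets))   # unoccupied Cache Sets
        self.assoc = {}  # executive entity -> (physical memory, Cache Sets)

    def allocate(self, entity: str, mem_pages: list[int], sets_needed: int):
        # Step 101: physical memory has been allocated by the OS (mem_pages).
        # Step 102: select unoccupied Cache Sets and establish the correlation.
        if sets_needed > len(self.free_sets):
            raise RuntimeError("no unoccupied Cache Sets left")
        chosen = {self.free_sets.pop() for _ in range(sets_needed)}
        self.assoc[entity] = (mem_pages, chosen)
        return chosen

p = CachePartitioner(num_sets=8)
vm1 = p.allocate("vm1", mem_pages=[0, 1, 2], sets_needed=2)
vm2 = p.allocate("vm2", mem_pages=[3, 4], sets_needed=2)
print(vm1.isdisjoint(vm2))  # True: the two entities never share a Cache Set
```

Because each entity is given sets drawn only from the unoccupied pool, no two entities ever map to the same set, which is the source of the claimed interference reduction.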

Description

Technical field

[0001] The present invention relates to the field of electronic information technology, and in particular to a Cache partitioning method and device.

Background technique

[0002] In cloud computing and data center applications, a single node often hosts multiple execution entities, such as processes, virtual machines, or the kernels of a multi-core operating system, in order to make full use of resources. The operating system needs to allocate physical memory to each execution entity. In the prior art, the Cache index method determines the correspondence between physical memory and the Cache (high-speed buffer memory), so once physical memory is allocated, the Cache is effectively allocated to the execution entity as well. When the execution entities run, they interfere with each other's use of the Cache. For example, a process that frequently refreshes the Cache reduces the hit rate of other processes accessing the Cache, resulting i...
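The interference described in [0002] can be illustrated by simulating one shared cache set: when two processes with small working sets are forced onto the same set, they evict each other's lines and the hit rate collapses. The geometry (4-way associativity, LRU replacement) is a hypothetical example, not taken from the patent.

```python
from collections import OrderedDict

class CacheSet:
    """One cache set with LRU replacement (hypothetical 4-way associativity)."""
    def __init__(self, ways: int = 4):
        self.ways, self.lines = ways, OrderedDict()

    def access(self, tag: int) -> bool:
        hit = tag in self.lines
        if hit:
            self.lines.move_to_end(tag)          # refresh LRU position
        else:
            if len(self.lines) >= self.ways:
                self.lines.popitem(last=False)   # evict least recently used
            self.lines[tag] = True
        return hit

def hit_rate(trace):
    s, hits = CacheSet(), 0
    for tag in trace:
        hits += s.access(tag)
    return hits / len(trace)

a = [0, 1, 2, 3] * 100        # process A: working set fits in 4 ways
b = [10, 11, 12, 13] * 100    # process B: working set also fits in 4 ways
shared = [t for pair in zip(a, b) for t in pair]  # both forced onto one set
print(hit_rate(a), hit_rate(shared))  # 0.99 0.0 -- interleaving thrashes the set
```

Alone, each process hits almost always; interleaved on one set, the combined working set exceeds the set's associativity and every access misses, which is the performance loss the invention aims to avoid.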

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F12/08
CPC: G06F12/08
Inventors: 郑晨, 高云伟, 詹剑锋, 张立新
Owner HUAWEI TECH CO LTD