
Memory management method and device capable of realizing memory level parallelism

A memory-management and memory-level-parallelism technology, applied in the field of memory management methods and devices that exploit memory-level parallelism; it addresses problems such as data clustering in a few banks limiting the performance of large-capacity main memory.

Inactive Publication Date: 2012-09-12
Applicant: 北京北大众志微系统科技有限责任公司


Problems solved by technology

Once data is clustered in a few banks of main memory, only those banks are active while the application executes, which limits the potential performance of large-capacity main memory.
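To make the clustering problem concrete, the sketch below (illustrative only; the bit widths and addresses are assumptions, not taken from the patent) shows that when the bank index is taken from the high-order address bits, a contiguously allocated buffer falls entirely into a single bank, so its accesses cannot proceed in parallel across banks:

```python
# Illustrative sketch (assumed parameters): with high-order bank selection,
# a contiguous buffer maps entirely into one bank.

BANK_BITS = 3                        # assume 8 banks
ADDR_BITS = 32                       # assume 32-bit physical addresses
BANK_SHIFT = ADDR_BITS - BANK_BITS   # bank index comes from the top bits

def bank_of(addr: int) -> int:
    """Return the bank index selected by the high-order address bits."""
    return (addr >> BANK_SHIFT) & ((1 << BANK_BITS) - 1)

# A 1 MiB buffer allocated contiguously at some base address: every 4 KiB
# page lands in the same bank, so none of its accesses overlap across banks.
base = 0x2000_0000
banks_touched = {bank_of(base + off) for off in range(0, 1 << 20, 4096)}
print(banks_touched)  # a single bank index
```

With these assumed parameters one bank covers 512 MiB of address space, so any buffer smaller than that which does not cross a group boundary is confined to one bank.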




Detailed Description of the Embodiments

[0027] The present invention is described in further detail below with reference to the accompanying drawings and preferred embodiments. The drawings are simplified schematic diagrams that illustrate only the basic structure of the invention, and therefore show only the configurations relevant to it.

[0028] The invention exploits the structural feature that large-capacity main memory contains multiple banks, dividing the main memory into groups according to the high-order chip-select bits of the address. Data with high conflict overhead is then mapped to different banks, reducing memory-access conflicts during application execution.
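The grouping-and-scattering idea in [0028] can be sketched as follows. This is a hedged illustration, not the patent's implementation: the class name, the bump-allocation scheme, and the round-robin placement policy are all assumptions introduced for clarity.

```python
# Hypothetical sketch of the idea in [0028]: partition the address space into
# bank groups by the high-order chip-select bits, then place successive data
# units in different groups so high-conflict data does not share a bank.

BANK_BITS = 3
ADDR_BITS = 32
GROUP_SHIFT = ADDR_BITS - BANK_BITS
GROUP_SIZE = 1 << GROUP_SHIFT        # bytes of address space per bank group

class BankAwareAllocator:
    """Toy allocator: one bump pointer per bank group, round-robin placement."""

    def __init__(self, num_groups: int = 1 << BANK_BITS):
        self.num_groups = num_groups
        self.offsets = [0] * num_groups  # next free offset inside each group
        self.next_group = 0

    def alloc(self, size: int) -> int:
        """Place the next data unit in the next bank group, round-robin."""
        g = self.next_group
        self.next_group = (g + 1) % self.num_groups
        addr = g * GROUP_SIZE + self.offsets[g]
        self.offsets[g] += size
        return addr

alloc = BankAwareAllocator()
units = [alloc.alloc(4096) for _ in range(4)]
print([u >> GROUP_SHIFT for u in units])  # four distinct group indices
```

A real allocator would consult the conflict-overhead analysis mentioned in the abstract rather than blind round-robin, but the address arithmetic (group index in the top bits) is the same.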

[0029] The flow of the memory management method for large-capacity main memory provided by the present invention is shown in Figure 2 and includes the following steps:

[0030] 110: Divide the address space according to t...



Abstract

The invention relates to a memory management method and device capable of realizing memory-level parallelism. The concept of a bank is introduced into the memory allocator: by building bank groups from the address and the affinity between banks, the allocator can identify different banks by their address ranges. Data is divided into multiple data units, which are then scattered across all bank groups of the main memory, raising memory-access parallelism and reducing row-buffer conflicts. The device works entirely in the operating-system layer: it analyzes the conflict overhead between data units using information provided by the compiler and the operating system, and extends the memory allocator according to the actual layout of the main memory, so application programs need not be modified and the device does not depend on special underlying hardware.

Description

Technical field

[0001] The invention relates to a memory management method for a computer system, in particular to a memory management method and device for realizing memory-level parallelism.

Background technique

[0002] There is a large performance gap between the processor and main memory. A cache can reduce the number of processor accesses to main memory, but a large-capacity cache is difficult to integrate on-chip, and a single cache replacement strategy can hardly match the diversity of application memory-access behaviors. The processor still spends much of its time accessing main memory, so main memory needs to respond to the processor's memory-access requests more quickly.

[0003] A memory-access address contains a row address and a column address, which locate the memory cells in the memory array. Current main memory is constructed in a high-order interleaved manner, so that row selection and column selection multiplex address lines and ...
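The row/column addressing described in [0003] can be illustrated with a small decoder. The field widths and layout below are assumptions for illustration, not the patent's actual mapping; the point is that two accesses hitting the same bank but different rows force a row-buffer conflict:

```python
# Background illustration (assumed field layout, not the patent's): a DRAM
# address splits into row, bank, and column fields; row and column selection
# reuse the same address lines in two phases (RAS, then CAS).

COL_BITS, BANK_BITS, ROW_BITS = 10, 3, 14  # assumed widths

def decode(addr: int):
    """Split an address into (row, bank, column) under the assumed layout."""
    col = addr & ((1 << COL_BITS) - 1)
    bank = (addr >> COL_BITS) & ((1 << BANK_BITS) - 1)
    row = (addr >> (COL_BITS + BANK_BITS)) & ((1 << ROW_BITS) - 1)
    return row, bank, col

# Two accesses to the same bank but different rows conflict: the open row
# must be precharged and the new row activated before the second proceeds.
a, b = decode(0x0001_2345), decode(0x0041_2345)
print(a, b)  # same bank and column fields, different row fields
```

Mappings that spread consecutive addresses across banks (or allocators that do so in software, as this patent proposes) reduce how often such same-bank, different-row pairs occur back to back.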

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/06
Inventors: 程旭, 钟祺, 管雪涛, 王晶