
Internal memory distribution, cleaning and releasing method, and internal memory management apparatus

A memory allocation and release method and memory management technology, applied in the field of memory management, addressing the problems that finer-grained memory block sizes increase the number of buddy splits and merges and thus the time overhead; the effects are avoiding memory oscillation, reducing splitting and merging overhead, and reducing internal fragmentation.

Inactive Publication Date: 2008-12-10
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

Although these algorithms provide more memory block sizes and thus appear to reduce internal fragmentation, closer analysis shows that offering more block sizes requires the system to maintain additional table entries for the new sizes, increasing the space overhead. At the same time, the finer granularity of block sizes increases the number of buddy splits and merges, increasing the time overhead. Experiments have shown that the total internal fragmentation in the system stays within a roughly constant range of 25%-40%, so it is not effectively reduced.




Embodiment Construction

[0025] The embodiment of the present invention applies an on-demand allocation strategy to memory requests, providing flexible memory block sizes and thereby reducing internal fragmentation. At the same time, it proposes delaying the merging of released memory blocks to avoid the memory oscillation caused by frequent splitting and merging, which reduces that overhead and improves system performance.
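The following is a minimal C sketch (not from the patent) of the deferred-merge idea in [0025], assuming a simple array of per-order free lists; the names lazy_free, lazy_alloc and coalesce_all, the stubbed coalescing pass and the tiny demo pool are all illustrative. Freeing is a constant-time list push, and buddy merging is postponed until an allocation cannot be served from the existing queues.

#include <stdio.h>
#include <stddef.h>

#define MAX_ORDER 4

struct block {
    struct block *next;
    int order;                      /* block covers (1 << order) pages */
};

static struct block *free_list[MAX_ORDER + 1];

/* Deferred free: push onto the per-order free queue; no buddy merge here. */
static void lazy_free(struct block *b)
{
    b->next = free_list[b->order];
    free_list[b->order] = b;
}

/* Illustrative stub: a real pass would pair up buddies and promote them
 * to the next order. */
static void coalesce_all(void)
{
    puts("running deferred coalescing pass");
}

/* Allocation consults the free queues first and only triggers a merge
 * pass when no block of sufficient order is available. */
static struct block *lazy_alloc(int order)
{
    for (int o = order; o <= MAX_ORDER; o++) {
        if (free_list[o] != NULL) {
            struct block *b = free_list[o];
            free_list[o] = b->next;
            return b;               /* splitting of larger blocks omitted */
        }
    }
    coalesce_all();                 /* merge only when allocation would fail */
    return NULL;                    /* caller would retry after the pass */
}

int main(void)
{
    static struct block pool[3] = { { NULL, 0 }, { NULL, 1 }, { NULL, 1 } };

    for (int i = 0; i < 3; i++)
        lazy_free(&pool[i]);        /* O(1) frees, no immediate merging */

    printf("got order-1 block at %p\n", (void *)lazy_alloc(1));
    printf("order-3 request returns %p\n", (void *)lazy_alloc(3));
    return 0;
}

The point of the deferral is that a block released and immediately re-requested never pays the split-and-merge round trip, which is the memory oscillation the text refers to.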

[0026] To make the objectives, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings.

[0027] When performing memory allocation, the core idea is to serve memory requests on demand, split off the internal fragments produced by the allocation, and insert them into the free queues of memory blocks of the corresponding sizes. The procedure is shown in figure 1:

[0028] Step 101, a...
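Because the step-by-step description is truncated here, the following C sketch only illustrates the core idea stated in [0027]: serve a request from the best-fitting free queue, keep exactly the (rounded) requested size, and recycle the leftover tail into the free queue matching its size. The 16-byte granularity, the identifiers and the simplified header bookkeeping are assumptions made for the example, not details from the patent.

#include <stdio.h>
#include <stddef.h>

#define HEAP_SIZE   4096
#define GRANULE     16                         /* assumed size-class granularity */
#define NUM_CLASSES (HEAP_SIZE / GRANULE + 1)

/* Free chunks are threaded through the managed memory itself. */
struct chunk {
    struct chunk *next;
    size_t size;                               /* chunk size in bytes */
};

static _Alignas(16) unsigned char heap[HEAP_SIZE];
static struct chunk *free_queue[NUM_CLASSES];  /* indexed by size / GRANULE */

static void queue_push(struct chunk *c)
{
    size_t cls = c->size / GRANULE;
    c->next = free_queue[cls];
    free_queue[cls] = c;
}

/* On-demand allocation: take a block that fits, keep exactly the rounded
 * request, and recycle the leftover tail into the free queue that matches
 * its size instead of leaving it as an internal fragment. */
static void *alloc_on_demand(size_t req)
{
    size_t need = (req + GRANULE - 1) / GRANULE * GRANULE;

    for (size_t cls = need / GRANULE; cls < NUM_CLASSES; cls++) {
        struct chunk *c = free_queue[cls];
        if (c == NULL)
            continue;
        free_queue[cls] = c->next;             /* best-fitting queue first */

        size_t leftover = c->size - need;
        if (leftover >= sizeof(struct chunk)) {
            struct chunk *frag = (struct chunk *)((unsigned char *)c + need);
            frag->size = leftover;
            queue_push(frag);                  /* fragment goes to its own queue */
            c->size = need;
        }
        return c;       /* header bookkeeping simplified for the sketch */
    }
    return NULL;
}

int main(void)
{
    struct chunk *all = (struct chunk *)heap;
    all->size = HEAP_SIZE;
    queue_push(all);

    void *p = alloc_on_demand(100);            /* rounds to 112 bytes */
    printf("allocated %p from heap base %p\n", p, (void *)heap);
    return 0;
}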



Abstract

The present invention discloses a memory allocation, cleaning and releasing method, and also discloses a memory management device comprising: a memory allocation module that serves a memory request by allocating a block from the free queue of the most suitable size; a fragment recovery module that recovers the internal fragments produced during allocation and inserts them into the corresponding free queues; and a memory merging module that merges free memory blocks when the largest free block cannot satisfy a memory request. The invention allocates memory on demand, which reduces internal fragmentation, and defers the merging of released memory blocks, which avoids the memory oscillation caused by frequent splitting and merging, thereby reducing the system's splitting and merging overhead and improving system performance.
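One way to picture the three claimed modules is the following C sketch; the struct layout, function names and stubbed behaviour are hypothetical, intended only to show how an allocation module, a fragment recovery module and a deferred merging module might be wired together.

#include <stdio.h>
#include <stddef.h>

/* Illustrative mapping of the three claimed modules onto function pointers;
 * the names and the free_queues placeholder are ours, not the patent's. */
struct free_queues;                              /* per-size free lists, omitted */

struct memory_manager {
    struct free_queues *queues;
    /* memory allocation module: serve a request from the best-fitting queue */
    void *(*allocate)(struct free_queues *q, size_t bytes);
    /* fragment recovery module: put an internal fragment on its size's queue */
    void  (*recover_fragment)(struct free_queues *q, void *frag, size_t bytes);
    /* memory merging module: coalesce only when the largest block is too small */
    void  (*merge)(struct free_queues *q);
};

/* Minimal stubs so the structure can be exercised. */
static void *stub_alloc(struct free_queues *q, size_t n)
{ (void)q; printf("allocate %zu bytes\n", n); return NULL; }
static void stub_recover(struct free_queues *q, void *f, size_t n)
{ (void)q; (void)f; printf("recover %zu-byte fragment\n", n); }
static void stub_merge(struct free_queues *q)
{ (void)q; puts("merge free blocks"); }

int main(void)
{
    struct memory_manager mm = { NULL, stub_alloc, stub_recover, stub_merge };
    if (mm.allocate(mm.queues, 200) == NULL)
        mm.merge(mm.queues);                     /* merge, then a retry would follow */
    mm.recover_fragment(mm.queues, NULL, 56);
    return 0;
}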

Description

Technical Field

[0001] The invention relates to memory management technology, and in particular to a method for allocating, cleaning and releasing memory and to a memory management device.

Background Art

[0002] With the widespread use of Linux systems, managing dynamic memory has become increasingly important: if dynamic memory runs short or is managed improperly, the entire system responds slowly or may even crash.

[0003] To keep the Linux memory management mechanism running efficiently, modern operating systems employ many techniques, among which the buddy algorithm is commonly used. The algorithm was first proposed by Donald E. Knuth in 1968 and is a fast, classic algorithm for dynamic memory management.

[0004] In this algorithm there are multiple free queues; free blocks with a length of 2^k pages all sit in the same queue. When you want to allocate a mem...
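For background, the buddy bookkeeping referred to in [0004] can be illustrated with a short, self-contained C sketch of our own (not taken from the patent): free blocks of 2^k pages live on the free list of order k, and a block's buddy is found by flipping bit k of its starting page frame number.

#include <stdio.h>

#define MAX_ORDER 10

/* Smallest order whose block of (1 << order) pages covers `pages`. */
static int order_for(unsigned long pages)
{
    int order = 0;
    while ((1UL << order) < pages && order < MAX_ORDER)
        order++;
    return order;
}

/* Buddy of the block starting at page frame `pfn` at a given order. */
static unsigned long buddy_pfn(unsigned long pfn, int order)
{
    return pfn ^ (1UL << order);
}

int main(void)
{
    printf("3 pages -> order %d block\n", order_for(3));        /* order 2 */
    printf("buddy of pfn 8 at order 2: %lu\n", buddy_pfn(8, 2)); /* 12 */
    return 0;
}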

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F12/02
Inventor 余鑫李江雄
Owner HUAZHONG UNIV OF SCI & TECH