
A memory management method and device

A memory management method and device, applied in the field of memory management, addressing the problems of small requests occupying whole memory blocks, low memory utilization, and an increased operating burden on system resources.

Inactive Publication Date: 2016-08-03
ZTE CORP

AI Technical Summary

Problems solved by technology

[0004] The static memory management method divides the memory into several areas in advance and then divides each area into multiple memory blocks of equal size. When a memory application arrives, if the requested size is equal or close to the size of a memory block or area, memory utilization is high; if the requested memory is small, however, the small request still occupies a whole large memory block, and because of this inflexibility the memory utilization rate is low (see the sketch that follows).
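
As an illustration only (not code from the patent), a static scheme of this kind can be sketched in C as below: one region is pre-divided into equal-size blocks, and every application consumes a whole block regardless of the requested size, which is exactly where the utilization loss comes from. The names, block size and block count are assumptions.

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical static pool: one region pre-divided into BLOCK_COUNT
 * blocks of BLOCK_SIZE bytes each (sizes chosen for illustration). */
#define BLOCK_SIZE   256
#define BLOCK_COUNT  64

static uint8_t pool[BLOCK_COUNT][BLOCK_SIZE];
static uint8_t used[BLOCK_COUNT];           /* 0 = free, 1 = occupied */

/* A request always occupies one whole block, even if size is small,
 * which is why utilization drops for small requests. */
void *static_alloc(size_t size)
{
    if (size == 0 || size > BLOCK_SIZE)
        return NULL;                        /* does not fit this region */
    for (int i = 0; i < BLOCK_COUNT; i++) {
        if (!used[i]) {
            used[i] = 1;
            return pool[i];
        }
    }
    return NULL;                            /* region exhausted */
}

void static_free(void *p)
{
    for (int i = 0; i < BLOCK_COUNT; i++) {
        if (p == pool[i]) {
            used[i] = 0;
            return;
        }
    }
}
```
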
[0005] The dynamic memory management method is usually implemented with a linked list or a stack. In the linked-list method, free memory blocks are arranged in a linked list; when a process needs to apply for memory, the list is searched from its head until a memory block of suitable size is found, and that block is allocated to the applying process. When a memory block is released, the linked-list method puts the released block, now a free memory block, at the tail of the list. The stack method is similar: free memory blocks are kept on a stack, a memory application searches from the top of the stack until a block of suitable size is found, and a released block is pushed back onto the top of the stack. In embedded systems, dynamic memory management is widely used; however, because an embedded system must continuously apply for and release memory, that is, continuously allocate memory dynamically, this increases the operating burden on the system's own resources.
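
For the linked-list variant just described, a minimal first-fit free-list sketch might look as follows; the structure and function names are illustrative, and the sketch only shows the search-from-head / append-to-tail behaviour, not a full allocator.

```c
#include <stddef.h>

/* Illustrative free-list node; not the patent's own structures. */
struct free_block {
    size_t             size;   /* usable size of this block */
    struct free_block *next;
};

static struct free_block *head;  /* search starts here */
static struct free_block *tail;  /* released blocks are appended here */

/* Scan from the head until a block of suitable size is found
 * (first fit); unlink and return it, or NULL if none fits. */
struct free_block *list_alloc(size_t size)
{
    struct free_block *prev = NULL, *cur = head;
    while (cur && cur->size < size) {
        prev = cur;
        cur = cur->next;
    }
    if (!cur)
        return NULL;
    if (prev)
        prev->next = cur->next;
    else
        head = cur->next;
    if (cur == tail)
        tail = prev;
    cur->next = NULL;
    return cur;
}

/* A released block, i.e. a free block, goes to the end of the list. */
void list_free(struct free_block *blk)
{
    blk->next = NULL;
    if (tail)
        tail->next = blk;
    else
        head = blk;
    tail = blk;
}
```
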




Embodiment Construction

[0067] Before describing the memory management method of the present invention, the memory pool management list of the present invention is described first. As shown in Figure 1, taking a 12th-order memory pool as the highest order for illustration, the memory pool management list divides the memory into order-0 to order-12 memory pools, 13 orders of memory pools in total. Each order of memory pool has at least one management node, and each management node manages 256 memory blocks. In essence, applying for memory means applying for at least one of the 256 memory blocks managed by a management node; after the application succeeds, the occupied memory block is called an applied memory block. Releasing memory means releasing at least one of the 256 memory blocks managed by the management node; the released memory block is called a free memory block and can be used for the next memory application.
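For concreteness, the layout described in paragraph [0067] could be modelled roughly as follows. This is a sketch under assumptions (the field names, the bitmap representation and the per-order chaining are not specified in the source); it only shows the 13 orders, the per-order management nodes and the 256 blocks tracked by each node.

```c
#include <stdint.h>

#define ORDER_COUNT      13    /* order-0 ... order-12 memory pools */
#define BLOCKS_PER_NODE  256   /* each management node manages 256 memory blocks */

/* Hypothetical management node: 256 blocks of the same order, with a
 * bitmap marking which of them are currently applied for. */
struct mgmt_node {
    void             *base;                          /* first block managed by this node */
    uint32_t          bitmap[BLOCKS_PER_NODE / 32];  /* 1 = applied block, 0 = free block */
    uint16_t          free_count;                    /* free blocks left in this node */
    struct mgmt_node *next;                          /* next management node of the same order */
};

/* Memory pool management list: one entry per order, each order holding
 * at least one management node. The block size per order is not fixed
 * by this paragraph and is left to the allocation logic. */
struct pool_mgmt_list {
    struct mgmt_node *pools[ORDER_COUNT];
};
```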

[0068] ...



Abstract

Disclosed is a memory management method. A memory is divided into more than one order of memory pools, and each order of memory pool has at least one management node. In the case of a memory application, an order of memory pool is determined according to the size of the memory applied for, and the memory is then allocated to the application according to the occupation condition of that order of memory pool in a memory pool management list; in the case of memory release, the memory is released according to the label information of the memory to be released. Also disclosed is a memory management device. With the technical solution of the present invention, the fundamental length of a memory block is variable, flexibility is high, and memory resources are utilized efficiently; the solution can be applied to embedded systems or operating systems with differing demands on memory resources.
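
Reading the abstract operationally, the mapping from requested size to order and the label information used at release time might be sketched as follows. The doubling block-size rule, the MIN_BLOCK value and the block_label fields are assumptions for illustration; the abstract itself only states that an order is chosen from the requested size and that release is driven by label information.

```c
#include <stddef.h>
#include <stdint.h>

#define ORDER_COUNT 13          /* order-0 ... order-12, as in the description */
#define MIN_BLOCK   64          /* assumed order-0 block size; not given in the source */

/* Label information assumed to be recorded at allocation time so that
 * release needs no search: it names the order, the managing node and
 * the block index inside that node. */
struct block_label {
    uint8_t  order;
    uint16_t node_id;
    uint16_t block_index;
};

/* Map a requested size to an order, assuming block sizes double per
 * order (an illustrative rule; the source only fixes the 13 orders). */
uint8_t order_for_size(size_t size)
{
    uint8_t order = 0;
    size_t  block = MIN_BLOCK;
    while (block < size && order < ORDER_COUNT - 1) {
        block <<= 1;
        order++;
    }
    return order;
}
```

With such a rule, a 1 KB request maps to order 4 (64 << 4 = 1024 bytes), and the label recorded at allocation time is all the release path needs in order to find the managing node and mark the corresponding block free again.
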

Description

technical field

[0001] The invention relates to memory management technology, and in particular to a memory management method and device.

Background technique

[0002] Due to the wide application of embedded systems, how to efficiently manage the memory of embedded systems has become a research hotspot.

[0003] At present, there are mainly two memory management methods in embedded systems: one is the static memory management method, and the other is the dynamic memory management method. Here, memory management mainly involves memory application and memory release; among them,

[0004] The static memory management method divides the memory into several areas in advance and then divides each area into multiple memory blocks of equal size. When a memory application arrives, if the requested size is equal or close to the size of a memory block or area, memory utilization is high; however, if the requested memory is small, the small request oc...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/02; G06F17/30
CPC: G06F12/0223
Inventor: 刘强 (Liu Qiang)
Owner: ZTE CORP