
Method for managing embedded system memory

A memory management technology for embedded systems that addresses allocation failures occurring when the statistical characteristics of memory blocks change, improving the probability and overall success rate of allocation.

Active Publication Date: 2010-12-08
DATANG MOBILE COMM EQUIP CO LTD

AI Technical Summary

Problems solved by technology

[0008] In view of this, the present invention provides an embedded system memory management method built on the memory-pool-based fixed partition method, so as to solve the allocation failures that occur in existing solutions when the statistical characteristics of memory blocks change.

Method used



Examples


Embodiment 1

[0035] The invention merges memory blocks with contiguous addresses, and the resulting recycling area can be re-divided just like the free area of the memory pool. In other words, the recycling area serves as an effective extension of the free area, increasing the probability of successful allocation. When the statistical characteristics of memory blocks change, sufficient recycled area is available to meet the new memory allocation requirements.

[0036] Embodiment 1 comprises the following steps:

[0037] 1. Traverse the memory pool and merge free memory blocks with contiguous addresses into the recycling area;

[0038] 2. After receiving a memory allocation request, allocate memory from the recycling area.

[0039] In the present invention, each divided SIZE memory block has a memory header structure that records status information. The header structure includes a status bit identifying whether the memory block is in use or free, and also include...

Embodiment 2

[0051] As described above, when a memory block cannot be allocated from either the free chain or the free area, background sorting can be started again, which preserves the real-time performance of memory allocation. This embodiment elaborates further on this method.

[0052] See Figure 3, a flow chart of memory allocation in Embodiment 2, which includes:

[0053] Step 301: Receive a request to allocate a memory block of a certain size; for example, a request for a SIZE6 memory block;

[0054] Step 302: Search the free chain: is there a free memory block of a suitable size (SIZE6)? If yes, execute step 303; otherwise, execute step 304;

[0055] Step 303: Allocate memory using a free memory block of suitable size;

[0056] Step 304: Is the free area of the memory pool sufficient to allocate a memory block of this size? If so, execute step 305; otherwise, execute step 306;

[0057] Step 305: Allocate a memory block from the free area of the me...

Embodiment 3

[0077] To further improve the success rate of memory allocation, this embodiment proposes a backup-pool solution.

[0078] Those skilled in the art will understand that embedded real-time operating systems generally provision multiple memory pools, so one of them can be designated as the backup pool for the remaining pools. The backup pool is not drawn on during normal allocation from the other pools; only when a memory pool cannot satisfy an allocation request is the backup pool used for allocation. The backup pool can be pre-divided into memory blocks of the various sizes preset by the system; as described in the background art, memory blocks of SIZE0-SIZE7 are pre-divided.

[0079] Referring again to Figure 3: when memory cannot be allocated from the free area, the free chain, or the recycled area, the allocation in step 310 will fail. Therefore, when it is determined in step 308 that the recycled area cannot be used to allocate memory, the backu...



Abstract

The invention discloses a memory management method for an embedded system, comprising the following steps: the memory pool is traversed to locate free memory blocks with contiguous addresses; the free memory blocks with contiguous addresses are merged to form a recovery area; and memory is allocated from the recovery area after a memory allocation request is received. The recovery area established by the method effectively extends the free area, thus increasing the probability of successful memory allocation.

Description

Technical field

[0001] The invention relates to the technical field of embedded systems, and in particular to a method for memory management in an embedded system.

Background technique

[0002] The memory management of an embedded operating system must meet requirements of real-time performance and efficiency. From the real-time perspective, the memory allocation process must be as fast as possible. Embedded systems generally lack a segmented virtual-memory management mechanism, so the complex and sophisticated memory allocation strategies of general-purpose operating systems cannot be adopted; what is needed is a simple and fast memory allocation scheme. Efficiency means that memory allocation should minimize waste. On the one hand, the cost requirements of embedded systems make memory a very limited resource. On the other hand, the limited space of the system hardware environment and the limited board area also determine T...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/06
Inventor Guo Jiyan, Guo Changwang
Owner DATANG MOBILE COMM EQUIP CO LTD