
Memory pool and memory allocation method

A memory pool and memory allocation technology in the field of data processing. It addresses problems such as CPU cache misses and reduced effective utilization of the CPU cache, and achieves the effect of improving the CPU cache hit rate.

Active Publication Date: 2015-12-09
NEUSOFT CORP
Cites: 3 · Cited by: 1

AI Technical Summary

Problems solved by technology

If the capacity of the memory pool is large, the system frequently accesses a wide range of memory. This not only causes CPU cache misses when the system accesses memory objects, but also occupies a large amount of CPU cache, so that valid data is swapped out, reducing the effective utilization of the CPU cache.




Detailed Description of the Embodiments

[0024] The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.

[0025] When an existing memory pool performs high-throughput network data forwarding, a large number of memory-object allocation and release operations are carried out. If the capacity of the memory pool is large, the system frequently accesses a wide range of memory, which not only causes CPU cache misses when the system accesses memory objects but also occupies a large amount of CPU cache, so that valid data is swapped out, reducing the effective utilization of the CPU cache.



Abstract

The invention discloses a memory pool. The memory pool comprises at least two levels of queues, which are associated with the memory objects in the memory pool and are used to allocate and release those memory objects. The at least two levels of queues include a first-level queue: when an external module requests a memory object from the memory pool, the first-level queue is used preferentially to allocate the object. The memory pool thus conforms to the principle of locality of access; even under high-throughput network data forwarding, the system does not frequently access a wide range of memory. This avoids CPU cache misses when the system accesses memory objects, prevents valid data from being swapped out of the CPU cache, and substantially increases the CPU cache hit rate.
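The abstract describes only the high-level structure, so the following is a hypothetical C sketch of how a two-level pool of this shape might look: a small first-level queue is consulted first, so the same few recently used objects keep being recycled while they are still hot in the CPU cache, while a large second-level queue serves as backing store. All names, sizes, and the spill policy are illustrative assumptions, not taken from the patent text.

```c
#include <stddef.h>

#define L1_CAP 32     /* small first level: fits comfortably in cache */
#define L2_CAP 4096   /* large second level: backing store */

typedef struct {
    void  *l1[L1_CAP];   /* first-level queue (used LIFO here) */
    size_t l1_len;
    void  *l2[L2_CAP];   /* second-level queue */
    size_t l2_len;
} two_level_pool;

/* Allocation prefers the first-level queue, as the abstract states;
 * only on a first-level miss does it fall back to the second level. */
void *pool_alloc(two_level_pool *p) {
    if (p->l1_len > 0)
        return p->l1[--p->l1_len];
    if (p->l2_len > 0)
        return p->l2[--p->l2_len];
    return NULL;  /* pool exhausted */
}

/* Release returns the object to the first level; when the first level
 * is full, the object spills into the second level. */
void pool_free(two_level_pool *p, void *obj) {
    if (p->l1_len < L1_CAP)
        p->l1[p->l1_len++] = obj;
    else if (p->l2_len < L2_CAP)
        p->l2[p->l2_len++] = obj;
    /* else: this sketch silently drops the pointer; a real pool
     * would guarantee capacity for every outstanding object */
}
```

Because a freed object lands in the small first-level queue and is handed out again on the very next allocation, a burst of alloc/free pairs touches only a cache-sized working set instead of sweeping the whole pool.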

Description

technical field

[0001] The invention relates to the field of data processing, and in particular to a memory pool and a memory allocation method.

Background technique

[0002] Currently, the memory objects in a traditional memory pool are allocated and released through a single queue. When the memory pool is initialized, a pointer to each memory object is stored in this queue. When an external module requests a memory object from the memory pool, a pointer is dequeued and returned to the external module, which uses the pointer to access the memory object. When an external module releases a memory object back to the memory pool, the queue enqueues the pointer to that object.

[0003] Since memory objects are allocated and released on the first-in-first-out principle of the queue, a large number of allocation and release operations are performed during high-throughput network data forwarding. If the capacity of the memory pool is large, the system frequently accesses a wide range of memory, which not only causes CPU cache misses when the system accesses memory objects but also occupies a large amount of CPU cache, so that valid data is swapped out, reducing the effective utilization of the CPU cache.
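The traditional single-queue pool of paragraph [0002] can be sketched as a FIFO ring of object pointers. This is an illustrative reconstruction, not code from the patent: the point it demonstrates is that, under FIFO order, a just-released object goes to the back of the queue and is only reused after every other pointer ahead of it, so a large pool sweeps a large address range and defeats temporal locality.

```c
#include <stddef.h>

#define POOL_CAP 4096

/* Single FIFO ring buffer of memory-object pointers, as described
 * in [0002].  Names and the capacity are illustrative. */
typedef struct {
    void  *ring[POOL_CAP];
    size_t head, tail, len;
} fifo_pool;

/* Allocation dequeues the oldest pointer from the head. */
void *fifo_alloc(fifo_pool *p) {
    if (p->len == 0)
        return NULL;
    void *obj = p->ring[p->head];
    p->head = (p->head + 1) % POOL_CAP;
    p->len--;
    return obj;
}

/* Release enqueues the pointer at the tail, behind every pointer
 * already waiting, which is exactly what hurts cache locality. */
void fifo_release(fifo_pool *p, void *obj) {
    if (p->len == POOL_CAP)
        return;  /* full; this sketch drops the pointer */
    p->ring[p->tail] = obj;
    p->tail = (p->tail + 1) % POOL_CAP;
    p->len++;
}
```

With N objects enqueued, an alloc/free workload cycles through all N before reusing any one of them, which is the "large range of memory" access pattern paragraph [0003] identifies as the cause of CPU cache misses.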

Claims


Application Information

IPC(8): G06F12/08
Inventor 金健
Owner NEUSOFT CORP