
Virtual memory management method and virtual memory management device for mass data processing

A virtual-memory technology for big data processing, applied to program control devices, memory address allocation/relocation, and software simulation/interpretation/emulation. It addresses problems such as inefficient virtual memory scheduling, frequent swapping in and out, and inability to complete processing, achieving higher accuracy and reduced thrashing.

Active Publication Date: 2014-10-22
YUNNAN UNIV

AI Technical Summary

Problems solved by technology

However, operating-system virtual memory management cannot handle this situation optimally. Inefficient virtual memory scheduling, frequent swapping in and out, and excessive data movement make this type of big data processing extremely slow when the data volume is large, and in some cases impossible to complete.




Detailed Description of the Embodiments

[0047] The technical solutions of the embodiments of the present invention are described in detail below in conjunction with the accompanying drawings.

[0048] The invention provides a virtual memory scheduling management method, which comprises at least: an allocation-unit management method; a virtual memory scheduling, matching, and replacement method; and a memory-allocation-unit access-correlation composite index. The main workflow is described below.
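The excerpt names an access-correlation composite index used to choose which resident unit to replace, but does not give its formula. The sketch below is a hypothetical illustration of the idea under assumed terms: recency, access frequency, and correlation with other resident units combine into one score, and the lowest-scoring unit is swapped out. The class name `MemBlock`, its fields, and the scoring formula are all assumptions, not the patent's definition.

```python
from dataclasses import dataclass, field
import time


@dataclass
class MemBlock:
    """One managed memory allocation unit (hypothetical fields)."""
    size: int                      # bytes
    access_count: int = 0         # how often the unit has been touched
    last_access: float = field(default_factory=time.monotonic)
    neighbors_resident: int = 0   # resident units accessed together with this one


def composite_index(unit: MemBlock, now: float) -> float:
    """Hypothetical access-correlation composite index: a higher score means
    'hotter' and more correlated with resident data, so a LOWER score marks
    a better candidate to swap out to disk."""
    recency = 1.0 / (1.0 + (now - unit.last_access))  # decays with idle time
    frequency = unit.access_count
    association = 1 + unit.neighbors_resident          # access-correlation term
    return recency * frequency * association


def pick_victim(units: list[MemBlock]) -> MemBlock:
    """Replace the unit with the lowest composite index."""
    now = time.monotonic()
    return min(units, key=lambda u: composite_index(u, now))
```

Weighting recency, frequency, and correlation multiplicatively is only one plausible design; the patent's actual index may combine these (or other) terms differently.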

[0049] As shown in Figure 1, the memory allocation process of the virtual memory management method of the present invention, including virtual scheduling management, is as follows:

[0050] S100: when a data processing application requests memory of size RequestSize, enter the process, which includes:

[0051] S101: judge whether the requested memory size falls into the large-memory category; if it is large memory, execute S102, otherwise execute S110. In the present invention, large memory refers...
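Steps S100 and S101 describe an entry point that branches on request size. The text defining "large memory" is cut off in this excerpt, so the sketch below assumes a 100 KB threshold (the low end of the 100 KB–10 MB range mentioned in the abstract); the threshold constant and the two handler functions are illustrative placeholders, not the patent's actual definitions.

```python
# Assumed threshold (bytes); the patent's exact definition of "large
# memory" is truncated in this excerpt, so 100 KB is a guess based on
# the 100 KB-10 MB range given in the abstract.
LARGE_MEMORY_THRESHOLD = 100 * 1024


def allocate(request_size: int) -> str:
    """S100: entry point when an application requests request_size bytes.
    S101: branch on whether the request counts as large memory."""
    if request_size >= LARGE_MEMORY_THRESHOLD:
        return handle_large_request(request_size)   # S102: large-memory path
    return handle_small_request(request_size)       # S110: ordinary path


def handle_large_request(size: int) -> str:
    # Placeholder for the size-matching allocation path (S102 onward).
    return f"large:{size}"


def handle_small_request(size: int) -> str:
    # Placeholder for the default allocator path (S110).
    return f"small:{size}"
```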



Abstract

The invention discloses a virtual memory management method and a virtual memory management device for mass data processing. It belongs to the technical field of computer system optimization and is mainly applied to single-pass, real-time processing of data volumes larger than the available physical memory. When memory allocation reaches a designated critical value, the most suitable physical memory is selected for replacement into disk-backed virtual memory according to a memory-block access-correlation composite index and the size of the allocation request. With a memory allocation method based on size matching, the front segment of an allocation unit is allocated to objects with high memory demand while the rear segment is allocated to objects with low memory demand; thus the matching degree of virtual memory replacement is increased while the amount of data moved is reduced. With this method, when allocating many memory blocks of 100 KB to 10 MB whose total size exceeds the available physical memory, thrashing in virtual memory scheduling is reduced, scheduling becomes accurate and predictable to some degree, and system performance improves markedly.
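The abstract's size-matching idea, carving one allocation unit from both ends, can be sketched as a two-pointer allocator. This is a minimal illustration of that scheme, assuming byte offsets within a fixed-capacity unit; the class and method names are invented for this sketch and do not appear in the patent.

```python
class AllocationUnit:
    """A contiguous allocation unit carved from both ends (sketch).
    High-demand objects take space from the front; low-demand objects
    from the rear, so the remaining free space stays contiguous in
    the middle of the unit."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.front = 0          # next free offset counting from the start
        self.back = capacity    # next free offset counting from the end

    def free_space(self) -> int:
        return self.back - self.front

    def alloc_front(self, size: int) -> int:
        """Allocate for a high-memory-demand object; returns its offset."""
        if size > self.free_space():
            raise MemoryError("allocation unit exhausted")
        offset = self.front
        self.front += size
        return offset

    def alloc_rear(self, size: int) -> int:
        """Allocate for a low-memory-demand object; returns its offset."""
        if size > self.free_space():
            raise MemoryError("allocation unit exhausted")
        self.back -= size
        return self.back
```

Keeping the free gap contiguous in the middle is one way to read the abstract's claim that size matching raises the replacement matching degree while reducing data movement; the patent's full description may place segments differently.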

Description

Technical field

[0001] The invention belongs to the technical field of computer system optimization, and in particular relates to a virtual memory management method in big data processing and a device thereof.

Background technique

[0002] Although computer hardware develops rapidly and memory capacity has grown quickly, from the KB level to the MB level and then to the GB level, each level a thousandfold increase, demand has kept growing just as fast. Now that we have entered the era of big data processing, no memory configuration can meet all application requirements. On the other hand, a large-capacity memory configuration also means higher hardware investment, and one always hopes that a computer with an ordinary configuration can also process large amounts of data.

[0003] Compared with memory in capacity and cost, disk is characterized by low cost and large capacity. Against this background, virtual memory technology emerged. Virtual memory r...

Claims


Application Information

IPC (8): G06F12/02; G06F9/455
Inventors: 郑家亮, 雷晓凌
Owner: YUNNAN UNIV