Virtual Memory Management System with Reduced Latency

Active Publication Date: 2014-07-24
WISCONSIN ALUMNI RES FOUND
AI Technical Summary

Benefits of technology

The present invention provides a computer architecture that combines the benefits of virtual memory with the speed of direct memory access. It allows faster data access by bypassing the conventional TLB and/or page table, provides flexibility in the amount of memory allocated for bypass access, and permits extremely rapid discrimination between virtual memory portions. Overall, it reduces latency and improves performance for data-intensive applications.

Problems solved by technology

In the latter case, when the necessary information is unavailable in the TLB, significant delay is possible. Thus, the indirection of accessing memory by looking up the TLB and/or the page table can significantly slow memory access.
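The indirection described above can be illustrated with a minimal sketch of a conventional translation path. All names, sizes, and table contents here are illustrative assumptions, not details from the patent: every access first consults a small TLB cache, and on a miss a slow page-table lookup (modeled here as a single dictionary access) must be performed.

```python
# Sketch of the conventional translation path the patent identifies as a
# bottleneck: a TLB hit is fast, but a miss forces a page-table walk.
# Page size, table contents, and class names are illustrative only.

PAGE_SIZE = 4096

class ConventionalMMU:
    def __init__(self, page_table):
        self.page_table = page_table   # virtual page number -> physical frame
        self.tlb = {}                  # small cache of recent translations
        self.walks = 0                 # count of slow page-table walks taken

    def translate(self, vaddr):
        vpn, offset = divmod(vaddr, PAGE_SIZE)
        if vpn not in self.tlb:        # TLB miss: take the slow path
            self.walks += 1
            self.tlb[vpn] = self.page_table[vpn]
        return self.tlb[vpn] * PAGE_SIZE + offset

mmu = ConventionalMMU({0: 7, 1: 3})
paddr = mmu.translate(4100)            # vpn 1, offset 4 -> frame 3
```

Repeated accesses to the same page hit the TLB and avoid further walks, but the first access to any page, and any access after a TLB eviction, pays the indirection cost the patent seeks to eliminate.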

Method used


Embodiment Construction

[0036]Referring now to FIG. 1, a multiprocessor computer system 10 suitable for practice of the present invention may include one or more processors 12a and 12b communicating with a memory system 14, the latter including, for example, physical memory and one or both of solid-state and disk storage devices.

[0037]Each processor 12 may include a processor unit 16 for executing program instructions (for example, read from the memory system 14) that operate on data read from the memory system 14, which provides arguments to those instructions. Executing the instructions then produces data values that may be written to the memory system 14. The different processors 12a and 12b may execute the same or different programs, and each program may include multiple processes, one or more of which may be executed by a given processor 12.

[0038]Access to the memory system 14 by the processor unit 16 is mediated through a memory access circuit 18 in turn communicating with a c...


Abstract

A computer system using virtual memory provides hybrid memory access either through a conventional translation between virtual memory and physical memory using a page table, possibly with a translation lookaside buffer, or through a high-speed translation using a fixed offset value between virtual memory and physical memory. Selection between these modes of access may be encoded into the virtual memory address space, eliminating the need for a separate tagging operation on specific memory addresses.
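The hybrid access described in the abstract can be sketched as follows. This is a minimal illustration, not the patent's implementation: the region bounds (`DS_BASE`, `DS_LIMIT`), the fixed offset (`DS_OFFSET`), and the page-table contents are all assumed values. The key idea shown is that the mode of access is selected purely by where an address falls in the virtual address space, with no per-address tag.

```python
# Hedged sketch of hybrid translation: virtual addresses inside a
# designated bypass region translate with a single fixed-offset add,
# while all other addresses take the conventional paged path.
# All constants below are illustrative assumptions.

PAGE_SIZE = 4096
DS_BASE, DS_LIMIT, DS_OFFSET = 0x10000, 0x30000, 0x90000

def translate(vaddr, page_table):
    if DS_BASE <= vaddr < DS_LIMIT:
        # Fast path: no TLB or page-table lookup, just one addition.
        return vaddr + DS_OFFSET
    # Conventional path: paged translation via the page table.
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    return page_table[vpn] * PAGE_SIZE + offset

pt = {0: 5}
fast = translate(0x10004, pt)   # inside the bypass region
slow = translate(8, pt)         # outside: paged translation
```

Because the check is a simple range comparison on the virtual address itself, discrimination between the two modes can be performed in hardware in parallel with, or ahead of, a TLB lookup.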

Description

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

[0001]This invention was made with government support under 1117280 and 0916725 awarded by the National Science Foundation. The government has certain rights in the invention.

CROSS REFERENCE TO RELATED APPLICATION

[0002]Not Applicable

BACKGROUND OF THE INVENTION

[0003]The present invention relates to computer architectures and, in particular, to a method and apparatus for managing virtual memory in a computer system in order to reduce memory access latency.

[0004]During the execution of a computer program, a computer processor may access memory to read data needed as arguments for executing instructions or to write data produced by the execution of those instructions. The accessed data may be identified by a unique physical address of the memory location holding that data.

[0005]Modern computer systems may hide the physical addresses of data accessed by a program by mapping those physical addresses to a virtual address ...

Claims


Application Information

IPC(8): G06F12/10
CPC: G06F12/1027; G06F12/10
Inventors: BASU, ARKAPRAVA; HILL, MARK DONALD; SWIFT, MICHAEL MANSFIELD
Owner WISCONSIN ALUMNI RES FOUND