
GPU memory buffer prefetch and pre-backup signaling to avoid page faults

A buffer and memory technology, applied in instrumentation, data transformation, image memory management, etc., can solve problems such as GPU processing inefficiency, lack of technology to stop and resume highly parallel jobs, etc.

Inactive Publication Date: 2017-03-08
QUALCOMM INC

AI Technical Summary

Problems solved by technology

GPU processing inefficiencies often occur during memory accesses because there is no technique for stopping and resuming the highly parallel jobs executing on the GPU.



Embodiment Construction

[0017] The present invention relates to techniques for graphics processing, and more particularly to techniques for prefetch and prebackup signaling from a graphics processing unit for avoiding page faults in a virtual memory system.

[0018] A modern operating system (OS) running on a central processing unit (CPU) typically uses a virtual memory scheme to allocate memory among the multiple programs running on the CPU. Virtual memory is a memory management technique that virtualizes a computer system's physical memory (e.g., RAM, disk storage, etc.) so that an application need refer to only one set of addresses (i.e., virtual memory). Virtual memory consists of contiguous address spaces that map to locations in physical memory. In this way, segments of physical memory are "hidden" from application programs, which instead interact with contiguous blocks of virtual memory. Contiguous blocks in virtual memory are usually arranged into "pages." Each page is some fixed-length ...
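The page-based translation described above can be sketched as follows. The 4 KiB page size and the tiny page table are illustrative assumptions for the sketch, not values from the patent.

```python
# Minimal sketch of virtual-to-physical address translation, assuming
# 4 KiB fixed-length pages and a toy page table (both illustrative).
PAGE_SIZE = 4096

# Maps virtual page number -> physical frame number; None means "not resident".
page_table = {0: 7, 1: 3, 2: None}

def translate(virtual_addr):
    """Split a virtual address into (page, offset) and look up the frame."""
    page = virtual_addr // PAGE_SIZE
    offset = virtual_addr % PAGE_SIZE
    frame = page_table.get(page)
    if frame is None:
        # In a real system this traps to the OS, which pages the data in.
        raise RuntimeError(f"page fault: virtual page {page} is not mapped")
    return frame * PAGE_SIZE + offset

print(translate(4100))  # virtual page 1, offset 4 -> frame 3 -> 12292
```

The `RuntimeError` stands in for the page-fault trap that the patent's prefetch signaling is designed to avoid on the device's critical path.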



Abstract

The present invention proposes techniques for demand paging for an I/O device (e.g., a GPU) that use prefetch and pre-backup notification event signals to reduce demand-paging latency. Page faults are limited by performing demand-paging operations before the I/O device actually requests unmapped memory.

Description

technical field [0001] The present invention relates to techniques for graphics processing, and more particularly to techniques for prefetch and pre-backup signaling from a graphics processing unit (GPU) for avoiding page faults in a virtual memory system. Background technique [0002] Visual content for display (e.g., graphical user interfaces and content for video games) may be generated by a graphics processing unit (GPU). A GPU can convert two-dimensional or three-dimensional (3D) objects into a displayable two-dimensional (2D) pixel representation. Additionally, GPUs are increasingly being used to perform certain types of computations that are handled efficiently by the highly parallel nature of GPU cores. Such applications are sometimes referred to as general-purpose GPU (GPGPU) applications. Converting information about 3D objects into displayable bitmaps, as well as large GPGPU applications, requires considerable memory and processing power. GPU processing inefficiencies can ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T1/60
CPC: G06T1/60; G06F5/14
Inventor: Colin Christopher Sharp, David Rigel Garcia Garcia, Eduardus A. Metz
Owner QUALCOMM INC