
Method for supervisory unit driving cache

A technology for device driving and device management, applicable to electrical digital data processing, memory address allocation/relocation, instruments, and similar fields. It addresses the problems of wasted memory resources, full-load operation, and low cache utilization, improving data throughput and efficiency.

Inactive Publication Date: 2012-05-09
ZTE CORP

AI Technical Summary

Problems solved by technology

If the number of driver cache blocks is forcibly limited, the processing capacity of system devices is constrained in extreme situations and the "space for time" effect cannot be achieved, so device performance cannot be improved further.
Moreover, the device IO load a system must handle is in practice bounded, and it is rare for all devices to run at full capacity at the same time.
Giving each device driver as much cache as possible to improve performance therefore often results in low cache utilization in practice, per-driver cache configurations that are unreasonable under certain conditions, and serious waste of memory resources.

Method used




Embodiment Construction

[0023] The device driver cache management method proposed by the present invention manages all device driver caches in the system in a unified manner and supports dynamic expansion of device caches as well as multiplexing among the caches of different devices. Based on scalability and priority management, and combined with the operating system's own dynamic memory management, the method ensures that free memory in the system can be used by a device driver when its device is busy, and that relatively important, real-time device drivers can claim as much system memory as possible. In this way, when the system is at full load, all memory resources in the system can be fully used, and important, real-time devices are prioritized so that they respond quickly to system device IO requests, while secondary, low-real-time devices ...



Abstract

The invention discloses a method for managing device driver caches. Initial configuration of the device driver cache is performed; when the cache is insufficient while the system is running, memory is requested from system memory to extend it. The method manages all device driver caches in the system in a unified way and supports dynamic extension of device caches and their reuse among different devices, so that memory resources in the system are fully used and the data throughput of system devices is improved.

Description

technical field [0001] The invention relates to device management and memory management in an operating system, in particular to a method for managing device driver caches. Background technique [0002] The device driver is the most important part of an operating system's device management; it performs the functions of input and output, data exchange, data storage, data communication, and debugging in the system. [0003] As an in-memory backup of system device input/output (IO) data, the driver cache reduces the frequency of device I/O reads and writes during system operation, thereby improving device throughput and achieving the effect of "exchanging space for time". In essence, a cache is nothing more than memory organized according to certain rules and managed in a unified manner, used mainly in device driver design. As an accelerator for system device data exchange, the cache play...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F13/10, G06F12/02
Inventors: 刘忱, 周元庆, 吴应祥, 李忠雷
Owner ZTE CORP