
Method and apparatus for scheduling memory

A memory scheduling method and technology, applied in the field of computers, addressing problems such as redundant operations, redundant data in memory, and limits on Cache space utilization and processing speed, with the effect of optimizing Cache scheduling and improving Cache utilization and the speed of data processing.

Active Publication Date: 2011-10-26
UNITED INFORMATION TECH H K COMPANY
Cites: 5 | Cited by: 0

AI Technical Summary

Problems solved by technology

Different services suit different unit space scheduling strategies and unit space management strategies; applying a single unit space scheduling strategy and unit space management strategy to all services may leave more redundant data in memory and incur more redundant operations.
[0007] In summary, the prior art applies the same memory scheduling method to the entire Cache, which limits how the Cache can be used and affects both Cache space utilization and the speed of data processing.




Embodiment Construction

[0026] The embodiment of the present invention establishes a Cache instance for each service, so that different memory scheduling methods can be applied to different services, in particular a memory scheduling strategy suited to each service. This memory scheduling method improves memory utilization and the speed of data processing. The embodiment is mainly applicable to scheduling the data area of the memory.

[0027] Referring to Figure 1, the main flow of the memory scheduling method in this embodiment is as follows:

[0028] Step 101: Run the Cache instance corresponding to the service type requested by the user.

[0029] Step 102: Using the Cache instance, schedule the Cache according to the value of the scheduling parameter configured in advance for that type of service.

[0030] In this embodiment, Cache scheduling includes a unit space division policy, a unit space scheduling policy, a unit space management policy, and a data reading policy.
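Steps 101 and 102 can be illustrated with a minimal sketch: each service type gets its own pre-established Cache instance, whose scheduling parameters are configured in advance. All class names, parameter names, and the LRU eviction choice below are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch of the per-service Cache-instance scheme (steps 101-102).
# Names and the concrete policies (LRU eviction, fixed capacity) are
# illustrative assumptions, not specifics from the patent text.
from collections import OrderedDict
from dataclasses import dataclass


@dataclass
class SchedulingParams:
    """Scheduling parameters pre-configured per service type."""
    capacity: int = 4        # unit space division: entries this instance may hold
    eviction: str = "lru"    # unit space management policy
    read_ahead: bool = False # data reading policy


class CacheInstance:
    def __init__(self, params: SchedulingParams):
        self.params = params
        self.store = OrderedDict()  # access order backs the LRU policy

    def get(self, key, load_fn):
        if key in self.store:
            self.store.move_to_end(key)   # refresh LRU position on a hit
            return self.store[key]
        value = load_fn(key)              # miss: load from main memory
        self.put(key, value)
        return value

    def put(self, key, value):
        self.store[key] = value
        self.store.move_to_end(key)
        while len(self.store) > self.params.capacity:
            self.store.popitem(last=False)  # evict least recently used entry


# Step 101: one pre-established instance per service type (hypothetical types).
INSTANCES = {
    "streaming": CacheInstance(SchedulingParams(capacity=8, read_ahead=True)),
    "database":  CacheInstance(SchedulingParams(capacity=2, read_ahead=False)),
}


def handle_request(service_type, key, load_fn):
    # Step 102: schedule the Cache through the instance configured for
    # the requested service type.
    return INSTANCES[service_type].get(key, load_fn)
```

The point of the sketch is the dispatch in `handle_request`: because each service owns its instance, a capacity or eviction choice tuned for one workload never disturbs another service's cached data.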



Abstract

The present invention discloses a memory (Cache) scheduling method for optimizing the Cache scheduling strategy and improving Cache resource utilization and the speed of data processing. The method comprises: running the Cache instance corresponding to the type of service requested by the user, different Cache instances having been established in advance for different services; and scheduling the Cache, through that Cache instance, according to the value of a scheduling parameter configured in advance for that type of service. The present invention further discloses an apparatus and a system for realizing the method.

Description

Technical Field

[0001] The invention relates to the field of computers, and in particular to a memory scheduling method and apparatus.

Background

[0002] Memory (Cache) has become an indispensable part of storage devices.

[0003] In the prior art, the main flow of reading service data is: a user requests service data; the requested service data is searched for in the Cache; if it is not found, the service data is read from the main memory (disk, etc.) and loaded into the memory for access by the CPU. The entire memory is scheduled through a single Cache instance.

[0004] When reading service data from the main memory and loading it into the memory, one data reading strategy is applied to the service data of all services. If a read-ahead strategy is adopted for all services, it may cause data redundancy; if a no-read-ahead strategy is adopted for all services, the main memory may be accessed frequently, affecting the speed of data processing.

[0005] The Cache is generally divided into multiple contig...
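The read-ahead trade-off in paragraph [0004] can be made concrete with a small sketch. The function name, the `window` parameter, and the block-index model are assumptions for illustration only.

```python
# Illustrative sketch of the trade-off in [0004]: a read-ahead policy fetches
# neighbouring blocks along with the requested one, while a no-read-ahead
# policy fetches only what was asked for. All names are hypothetical.
def read_blocks(requested: int, read_ahead: bool, window: int = 3):
    """Return the block indices fetched from main memory for one request."""
    if read_ahead:
        # Good for sequential workloads; random workloads receive
        # redundant blocks that waste Cache space.
        return list(range(requested, requested + window))
    # Only the requested block: no redundant data, but a sequential
    # workload pays one main-memory access per block.
    return [requested]
```

A sequential reader served by the no-read-ahead branch touches main memory once per block, while the read-ahead branch amortises one access over `window` blocks; for a random reader the extra `window - 1` blocks are pure redundancy. This is exactly why the patent argues for choosing the strategy per service rather than globally.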

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/50; G06F12/02
Inventor 薛国良
Owner UNITED INFORMATION TECH H K COMPANY