
Access method based on data locality

A memory access technology based on data locality, applied in the field of data-locality-based memory access. It can solve problems such as a reduced cache hit rate, reduced cache efficiency, and increased access time between the cache and main memory, with the effect of improving the cache hit rate and the CPU's memory access performance.

Active Publication Date: 2018-02-09
深圳商雀科技有限公司
Cites: 11 · Cited by: 0

AI Technical Summary

Problems solved by technology

Take two data types A and B as an example: type A data has strong access locality and a small footprint, while type B data has weak access locality and a large footprint. The weakly local type B data comes to occupy most of the cache, so type A data is likely to be evicted by accesses to type B data. The next time type A data is accessed, it must be reloaded from main memory. The cache hit rate therefore drops sharply, the effective access time between the cache and main memory increases, cache efficiency falls, and overall system performance is reduced.
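The pollution effect described above can be sketched with a toy simulation (my own illustration, not part of the patent; the workload shape, cache capacities, and LRU policy are all assumptions):

```python
from collections import OrderedDict

class LRUCache:
    """A simple fully associative cache with LRU replacement."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()   # block id -> None, ordered oldest-first
        self.hits = 0
        self.accesses = 0

    def access(self, block):
        self.accesses += 1
        if block in self.blocks:
            self.hits += 1
            self.blocks.move_to_end(block)        # refresh LRU position
        else:
            if len(self.blocks) >= self.capacity:
                self.blocks.popitem(last=False)   # evict least recently used
            self.blocks[block] = None

    def hit_rate(self):
        return self.hits / self.accesses

# Type A: 4 blocks reused constantly (strong locality, small footprint).
# Type B: a long stream of blocks touched once each (weak locality, large footprint).
def workload(n_rounds=1000):
    for i in range(n_rounds):
        yield f"A{i % 4}"
        yield f"B{2 * i}"
        yield f"B{2 * i + 1}"

# Shared cache: B's stream evicts A's blocks before they are reused.
shared = LRUCache(capacity=8)
for blk in workload():
    shared.access(blk)

# Partitioned cache (the general idea behind the patent): reserve part
# of the capacity for strong-locality data so B can never evict A.
part_a, part_b = LRUCache(capacity=4), LRUCache(capacity=4)
for blk in workload():
    (part_a if blk.startswith("A") else part_b).access(blk)

print(f"shared hit rate:      {shared.hit_rate():.3f}")
total_hits = part_a.hits + part_b.hits
total_acc = part_a.accesses + part_b.accesses
print(f"partitioned hit rate: {total_hits / total_acc:.3f}")
```

In the shared cache the reuse distance of every A block (11 other distinct blocks) exceeds the capacity of 8, so even the strongly local data never hits; once the weakly local stream is confined to its own partition, A hits on nearly every access.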



Examples


Embodiment Construction

[0020] To make the object, technical solution, and advantages of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings and embodiments.

[0021] According to the principle of locality, if the impact of accesses to weakly local data on accesses to strongly local data can be reduced, the cache hit rate can be improved, thereby reducing the average cache-to-main-memory access time and improving the CPU's memory access performance. The data-locality-based memory access method of the present invention is proposed on the basis of this principle.
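The "average access time" referred to here is commonly modeled by the standard average memory access time relation (a textbook formula, not something stated in the patent):

```latex
t_{avg} = t_{hit} + m \cdot t_{penalty}
```

where $m$ is the cache miss rate and $t_{penalty}$ is the cache-to-main-memory miss penalty. For example, with $t_{hit} = 2$ cycles and $t_{penalty} = 100$ cycles, lowering the miss rate from 0.20 to 0.05 cuts the average access time from $2 + 0.20 \times 100 = 22$ cycles to $2 + 0.05 \times 100 = 7$ cycles, which is the leverage the proposed method aims at.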

[0022] Figure 1 shows a schematic flow chart of the data-locality-based memory access method of the present invention. Before the method is executed, corresponding preprocessing must first be done to adapt the system to it. The preprocessing mainly in...



Abstract

The invention relates to the technical fields of computers and electronic products, and in particular to a data-locality-based memory access method. The method assigns each datum a locality strength level according to the strength of its locality: the higher the level, the stronger the locality, and the lower the level, the weaker the locality. A locality strength level is likewise assigned to each cache block, with the number of cache-block levels set equal to the number of data levels, under the rule that the locality strength level of the data stored in a cache block may not be lower than the level of that block. Each cache set contains cache blocks of at least two locality strength levels. This restricts the freedom of data mapping, so that during cache fill and replacement, data residing in higher-level cache blocks cannot be replaced by data of a lower locality level, thereby improving the cache hit rate and the CPU's memory access performance.
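The placement rule in the abstract can be sketched roughly as follows (my own illustrative code: the two-level split, the class names, and the LRU victim choice among eligible ways are assumptions, not the patent's specification):

```python
# Sketch of a cache set with leveled ways: each way carries a
# locality-strength level, and a way may only hold data whose level
# is >= the way's own level. Weak-locality data is thus confined to
# the low-level ways and can never evict data in high-level ways.

class Way:
    def __init__(self, level):
        self.level = level        # locality-strength level of this way
        self.tag = None           # tag of the data currently stored
        self.data_level = None    # locality level of that data
        self.stamp = 0            # recency stamp for LRU among eligible ways

class LeveledSet:
    def __init__(self, way_levels):
        self.ways = [Way(lv) for lv in way_levels]
        self.clock = 0

    def access(self, tag, data_level):
        """Return True on hit. On a miss, fill/replace only a way whose
        level does not exceed the data's level (assumes data_level is at
        least the lowest way level, so some way is always eligible)."""
        self.clock += 1
        for w in self.ways:
            if w.tag == tag:
                w.stamp = self.clock
                return True
        eligible = [w for w in self.ways if w.level <= data_level]
        victim = next((w for w in eligible if w.tag is None),
                      min(eligible, key=lambda w: w.stamp))
        victim.tag, victim.data_level, victim.stamp = tag, data_level, self.clock
        return False

# A 4-way set: two level-2 (strong-locality) ways, two level-1 ways.
s = LeveledSet(way_levels=[2, 2, 1, 1])
s.access("hot1", data_level=2)   # strong-locality data fills level-2 ways
s.access("hot2", data_level=2)
for i in range(100):             # a level-1 stream churns only the level-1 ways
    s.access(f"stream{i}", data_level=1)
print(s.access("hot1", data_level=2))  # True: still cached
```

Because the level-1 stream is only ever eligible for the level-1 ways, the two strongly local lines survive an arbitrarily long streaming phase, which is exactly the restriction on replacement freedom that the abstract describes.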

Description

Technical field

[0001] The invention relates to the fields of computers and electronic products, and in particular to a method for accessing memory based on data locality.

Background technique

[0002] A cache is a small, fast temporary store located between the CPU and main memory. Its capacity is smaller than that of main memory, but its access speed is much higher. Because the CPU is much faster than main memory, the cache acts as a bridge between the two: the data and instructions most likely to be needed are kept in the cache, and when the instructions and data required by the CPU are found there, the CPU can read them quickly, greatly improving its effective performance. The cache thus alleviates, to a certain extent, the speed mismatch between the CPU and main memory, and it works on the basis of the principle of program locality. The principle of program locality means that when the CPU accesses memory, whether it is a...

Claims


Application Information

Patent Timeline
No application data available
Patent Type & Authority: Patent (China)
IPC(8): G06F12/0877
Inventor: 罗秋明, 刘国强, 毛睿, 刘刚
Owner: 深圳商雀科技有限公司