
Method and device for reading length of data block of cache memory in self-adaption mode

A technology for cache data reading, applied in the field of electronics. It addresses the problems of underused data spatial locality, an increased number of CPU stalls, and poor overall computer performance, with the effects of improving overall performance, reducing the number of data reads from memory, and reducing the number of CPU stalls.

Active Publication Date: 2014-11-19
INSPUR BEIJING ELECTRONICS INFORMATION IND

AI Technical Summary

Problems solved by technology

With the development of semiconductor technology, processors can integrate an increasingly large last-level Cache, and many applications access contiguous or dense ranges of memory addresses. In such cases the capacity of a single Cache line cannot fully exploit the spatial locality of the data being accessed, so data must frequently be read from memory into the Cache. This increases the number of CPU stalls, reduces the processing speed of the CPU, and degrades overall computer performance.



Examples


Embodiment 1

[0039] Embodiment 1: a method for adaptively reading the data block length of a Cache, comprising:

[0040] when the processor's last-level Cache misses, obtaining the cached data information of the Cache;

[0041] judging, according to the cached data information, whether the missed memory access address and the addresses of the cached data in the Cache are concentrated;

[0042] if they are concentrated, determining a data read length that matches the degree of concentration of the data distribution, measured in a number of Cache lines;

[0043] the processor reads data from memory into the Cache according to the determined data read length, centered on the missed memory access address.

[0044] In this embodiment, if the missed memory access address is relatively concentrated with the addresses of the cached data in the Cache, it indicates that the missed memory address falls in a hotspot address region currently being accessed by the processor; if this happens, then...
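
As a rough illustration of the flow in paragraphs [0039] to [0043], the sketch below models the cached data information, the concentration judgment, and the adaptive length decision in C. The line size, the "nearby" window, the concentration threshold, and the mapping from concentration to read length are hypothetical choices for illustration, not values specified by the patent.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stddef.h>

#define CACHE_LINE_BYTES   64u   /* assumed line size                        */
#define NEAR_WINDOW_LINES  32u   /* assumed "nearby" radius around the miss  */
#define CONCENTRATION_MIN   8u   /* assumed threshold of nearby cached lines */

/* Cached data information kept per tracked line: its address and how often
 * it has been accessed (paragraph [0040]). */
typedef struct {
    uint64_t line_addr;     /* line-aligned address of the cached data */
    uint32_t access_count;
    bool     valid;
} cached_line_info_t;

/* Paragraph [0041]: count how many valid cached lines lie close to the
 * missed address, as a simple measure of address concentration. */
unsigned count_nearby_lines(const cached_line_info_t *info, size_t n,
                            uint64_t miss_addr)
{
    uint64_t miss_line = miss_addr / CACHE_LINE_BYTES;
    unsigned nearby = 0;
    for (size_t i = 0; i < n; i++) {
        if (!info[i].valid)
            continue;
        uint64_t line = info[i].line_addr / CACHE_LINE_BYTES;
        uint64_t dist = line > miss_line ? line - miss_line : miss_line - line;
        if (dist <= NEAR_WINDOW_LINES)
            nearby++;
    }
    return nearby;
}

/* Paragraphs [0041]-[0042]: judge concentration and pick a read length in
 * Cache lines; a result of 1 line is an ordinary fill for scattered accesses. */
unsigned choose_read_length(const cached_line_info_t *info, size_t n,
                            uint64_t miss_addr)
{
    unsigned nearby = count_nearby_lines(info, n, miss_addr);
    if (nearby < CONCENTRATION_MIN)
        return 1;                              /* not concentrated */
    /* Hypothetical mapping: more nearby cached lines -> longer read,
     * capped at the tracking window. */
    unsigned len = 2u * nearby;
    return len > NEAR_WINDOW_LINES ? NEAR_WINDOW_LINES : len;
}
```

In this sketch a returned length of 1 corresponds to a normal single-line fill when the miss does not fall inside a hotspot region.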

Embodiment 2

[0076] Embodiment 2: a device for adaptively reading the data block length of a Cache, comprising:

[0077] a cached data information obtaining unit, configured to obtain the cached data information of the Cache when the processor's last-level Cache misses;

[0078] a judging unit, configured to judge, according to the cached data information, whether the missed memory access address and the addresses of the cached data in the Cache are concentrated;

[0079] a length determination unit, configured to determine, when the addresses are concentrated, a data read length that matches the degree of concentration of the data distribution, measured in a number of Cache lines;

[0080] a reading unit, configured to read data from memory into the Cache according to the determined data read length, centered on the missed memory access address.

[0081] In one implementation of this embodiment, the cached data information may include the address information of the cached data and a count of access times. ...
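
Purely as a hypothetical view of how the units of Embodiment 2 fit together, the C sketch below groups the four units as function pointers in a struct and shows a driver that invokes them in order on a last-level miss. The names, signatures, and buffer handling are assumptions, not interfaces defined by the patent.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical decomposition of the device in Embodiment 2: each "unit"
 * from paragraphs [0077]-[0080] becomes one hook in the struct. */
typedef struct adaptive_read_device {
    /* [0077] obtain the cached data information on a last-level Cache miss;
     * returns the number of bytes written into info_buf */
    size_t   (*get_cached_info)(struct adaptive_read_device *dev,
                                void *info_buf, size_t buf_size);
    /* [0078] judge whether the missed address and the cached addresses
     * are concentrated */
    bool     (*is_concentrated)(struct adaptive_read_device *dev,
                                uint64_t miss_addr,
                                const void *info, size_t info_size);
    /* [0079] determine the read length (in Cache lines) that matches the
     * degree of concentration */
    unsigned (*determine_length)(struct adaptive_read_device *dev,
                                 uint64_t miss_addr,
                                 const void *info, size_t info_size);
    /* [0080] read data from memory into the Cache, centered on the miss */
    void     (*read_into_cache)(struct adaptive_read_device *dev,
                                uint64_t miss_addr, unsigned length_lines);
    void *ctx;  /* implementation-specific state */
} adaptive_read_device_t;

/* Driver showing how the four units cooperate on one miss. */
void on_last_level_miss(adaptive_read_device_t *dev, uint64_t miss_addr)
{
    unsigned char info_buf[4096];  /* opaque storage for cached data info */
    size_t n = dev->get_cached_info(dev, info_buf, sizeof info_buf);

    unsigned length_lines = 1;     /* default: a single Cache line */
    if (dev->is_concentrated(dev, miss_addr, info_buf, n))
        length_lines = dev->determine_length(dev, miss_addr, info_buf, n);

    dev->read_into_cache(dev, miss_addr, length_lines);
}
```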



Abstract

The invention provides a method and device for adaptively reading the data block length of a cache memory (Cache). The method includes the following steps: when the processor's last-level Cache misses, the cached data information of the Cache is obtained; according to the cached data information, it is judged whether the missed memory access address and the addresses of the cached data in the Cache are concentrated; if so, a data read length matching the degree of concentration of the data distribution is determined, measured in a number of Cache lines; and the processor reads data from memory into the Cache according to the determined read length, centered on the missed memory access address. The technical problem solved by the method and device is increasing the processing speed of the CPU.
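
To make "centered on the missed memory access address" concrete, the small sketch below computes a line-aligned start address for a read window of a given length around the miss. The 64-byte line size and the even split of the window around the miss address are assumptions for illustration only.

```c
#include <stdint.h>
#include <stdio.h>

#define CACHE_LINE_BYTES 64u   /* assumed line size */

/* Line-aligned start of a read window of length_lines Cache lines placed
 * roughly symmetrically around the missed address. */
uint64_t centered_window_start(uint64_t miss_addr, unsigned length_lines)
{
    uint64_t miss_line  = miss_addr / CACHE_LINE_BYTES;
    uint64_t half       = length_lines / 2;
    uint64_t start_line = miss_line > half ? miss_line - half : 0;
    return start_line * CACHE_LINE_BYTES;
}

int main(void)
{
    uint64_t miss = 0x12345;  /* example missed memory access address */
    unsigned len  = 8;        /* example read length: 8 Cache lines   */
    uint64_t start = centered_window_start(miss, len);
    /* The read then covers [start, start + len * 64): 8 lines around the miss. */
    printf("read %u lines starting at 0x%llx\n",
           len, (unsigned long long)start);
    return 0;
}
```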

Description

Technical field

[0001] The invention relates to the field of electronics, and in particular to a method and device for adaptively reading the data block length of a cache memory.

Background technique

[0002] With the rapid development of semiconductor technology and computer technology, the gap between the processor's clock frequency and the memory frequency keeps widening, resulting in an increasingly serious memory wall problem, which has become one of the major obstacles to performance improvement in computer systems. The memory wall refers to the phenomenon in which the read and write performance of memory severely restricts the processor from reaching its full performance.

[0003] To reduce the adverse effects of the speed gap between the processor (CPU) and memory, modern computer architectures widely place a cache memory (Cache) between the CPU and memory, which stores instruction code or data that may be reused within a short period of time...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F12/08G06F12/0862G06F12/0877
Inventor 陈继承王洪伟倪璠
Owner INSPUR BEIJING ELECTRONICS INFORMATION IND