Computer micro system structure comprising explicit high-speed buffer storage

A cache and computer-architecture technology applied in the field of computer systems. It addresses the problems that the Cache is invisible to the compiler, that program locality can be poor, and that memory accesses miss the Cache, and achieves fast access that is easy for the hardware to recognize and implement.

Active Publication Date: 2004-09-15
INST OF COMPUTING TECH CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

Memory accesses that miss the Cache are the main cause of delay;
[0012] 3) The Cache structure is invisible to the compiler: the compiler knows that the Cache exists, but it cannot purposefully place data with good reusability into the Cache.

Examples

Example Embodiment

[0031] The present invention is described below with reference to the drawings. As shown in Figure 2, the present invention adds an Ecache to the CPU chip. The Ecache is essentially a high-speed memory inside the chip. It is connected in the same way as the cache in an existing computer system: one end is connected to the registers and the other end to the memory of the computer system. In addition, data transfers between the Ecache and the memory and registers share the existing mechanisms of the cache.

[0032] Figure 4 is a schematic diagram of the unified addressing and address-space division of the Ecache and the memory. As Figure 4 shows, the Ecache inside the CPU chip and the memory outside the CPU chip are addressed uniformly, starting from the low (small) addresses. When an access is made, the hardware first determines whether the memory-access address falls within the Ecache address range; if it does, the on-chip Ecache is accessed, otherwise the memory is accessed.
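A minimal C sketch of the arrangement in paragraphs [0031] and [0032], assuming a hypothetical ECACHE_SIZE boundary and simple array-backed models of the two storages (none of the names or sizes below appear in the patent), might route every access as follows:

```c
#include <stdint.h>

/* Hypothetical sizes; the patent does not specify them. */
#define ECACHE_SIZE 0x00010000u            /* Ecache occupies the low addresses  */
#define MEMORY_SIZE 0x00100000u            /* off-chip memory follows the Ecache */

static uint8_t ecache[ECACHE_SIZE];        /* on-chip explicit cache (model)     */
static uint8_t memory[MEMORY_SIZE];        /* off-chip main memory (model)       */

/* Unified addressing: addresses below ECACHE_SIZE select the on-chip Ecache,
 * all higher addresses select main memory, mirroring the range check that
 * [0032] describes for every memory-access instruction.                        */
static uint8_t load_byte(uint32_t addr)
{
    if (addr < ECACHE_SIZE)
        return ecache[addr];               /* fast on-chip access                */
    return memory[addr - ECACHE_SIZE];     /* ordinary (slower) memory access    */
}

static void store_byte(uint32_t addr, uint8_t value)
{
    if (addr < ECACHE_SIZE)
        ecache[addr] = value;              /* data placed on chip explicitly     */
    else
        memory[addr - ECACHE_SIZE] = value;
}
```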

Abstract

The architecture comprises a memory, a cache, registers and an arithmetic unit, and in addition an Ecache inside the CPU chip. The Ecache and the memory are addressed uniformly. Because the Ecache is inside the CPU chip, the hardware is guaranteed fast access to it. The unified addresses start from the low (small) addresses, so in every memory-access instruction the addresses that fall into the Ecache are visible and easy for the hardware to recognize and implement. Several groups of instructions are designed to support the compiler and the runtime in using the Ecache explicitly and managing it dynamically; these instructions and the Ecache form an inseparable whole.
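The abstract does not enumerate the instruction groups themselves; as an illustration only, explicit allocation and block transfer between the memory and the Ecache might be exposed to the compiler and runtime roughly as in the following self-contained C model (every identifier and size here is hypothetical):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Illustrative model only; the actual instruction groups are defined by the
 * patent but are not listed in this excerpt.                                 */
#define ECACHE_SIZE 0x00010000u
static uint8_t  ecache[ECACHE_SIZE];       /* on-chip explicit cache (model)   */
static uint8_t  memory[0x00100000u];       /* off-chip main memory (model)     */
static uint32_t ecache_top;                /* state of a trivial bump allocator */

/* Reserve a region of the Ecache for reusable data; returns its offset. */
static uint32_t ecache_alloc(size_t bytes)
{
    assert(ecache_top + bytes <= ECACHE_SIZE);
    uint32_t addr = ecache_top;
    ecache_top += (uint32_t)bytes;
    return addr;
}

/* Copy a block from memory into the Ecache, and back out again. */
static void ecache_copy_in(uint32_t eaddr, uint32_t maddr, size_t bytes)
{
    memcpy(&ecache[eaddr], &memory[maddr], bytes);
}

static void ecache_copy_out(uint32_t maddr, uint32_t eaddr, size_t bytes)
{
    memcpy(&memory[maddr], &ecache[eaddr], bytes);
}

/* Usage sketch: a compiler or runtime keeps a frequently reused block on chip.
 *   uint32_t tile = ecache_alloc(4096);
 *   ecache_copy_in(tile, array_base, 4096);
 *   ... compute against the Ecache-resident copy ...
 *   ecache_copy_out(array_base, tile, 4096);                                  */
```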

Description

Technical field

[0001] The present invention relates to computer systems, and in particular to computer microarchitectures containing an explicit cache memory (abbreviated Ecache).

Technical background

[0002] Over the past 50 years, computer performance has generally increased in line with Moore's law, relying mainly on raising the operating frequency of the machine and on various parallel mechanisms. Although storage technology has also advanced, a large gap remains between memory speed and processor speed. Modern computers place a level-1, level-2, or even level-3 cache memory (Cache) between the memory and the registers, in the hope that data in the Cache will be reused and so relieve the bottleneck of slow memory access (see Figure 1).

[0003] Why can adding a Cache to the chip relieve this memory-access bottleneck? Taking the process of the CPU reading data as an example, the working process of the Cache is briefly as follows (see Figure 3 for the working process)...
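For reference, the conventional mechanism the background alludes to can be summarized by a minimal direct-mapped read-path model (line size, capacity and address fields are illustrative, not taken from the patent): a hit returns data from the on-chip line, while a miss must first refill the line from memory.

```c
#include <stdbool.h>
#include <stdint.h>

#define LINE_SIZE 32u                      /* bytes per cache line (illustrative) */
#define NUM_LINES 256u                     /* direct-mapped, 8 KiB in total       */

typedef struct {
    bool     valid;
    uint32_t tag;
    uint8_t  data[LINE_SIZE];
} cache_line_t;

static cache_line_t cache[NUM_LINES];

/* Conventional (implicit) cache read: on a hit the byte comes from the on-chip
 * line; on a miss the whole line is first refilled from memory, which is where
 * the delay mentioned above comes from.                                        */
static uint8_t cache_read(uint32_t addr, const uint8_t *memory)
{
    uint32_t offset = addr % LINE_SIZE;
    uint32_t index  = (addr / LINE_SIZE) % NUM_LINES;
    uint32_t tag    = addr / (LINE_SIZE * NUM_LINES);
    cache_line_t *line = &cache[index];

    if (!line->valid || line->tag != tag) {          /* miss: refill from memory */
        for (uint32_t i = 0; i < LINE_SIZE; i++)
            line->data[i] = memory[(addr - offset) + i];
        line->tag   = tag;
        line->valid = true;
    }
    return line->data[offset];                       /* hit (or freshly filled)  */
}
```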

Claims

Application Information

IPC(8): G06F9/30; G06F12/02
Inventor: 张兆庆 (Zhang Zhaoqing), 乔如良 (Qiao Ruliang), 唐志敏 (Tang Zhimin), 冯晓兵 (Feng Xiaobing)
Owner INST OF COMPUTING TECH CHINESE ACAD OF SCI