
Engineering Computer Memory Integrated Management System

A memory-management invention in computer technology, applicable to computation, program-control design, multi-program devices, etc. It addresses the problems of increased cache misses and reduced performance, achieving the effects of ensured stability, fewer cache misses, and a wide range of application.

Inactive Publication Date: 2019-01-18
王能武

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to overcome a defect of existing CPUs: during prefetching, useful data in the CPU cache may be replaced by the prefetched content, which increases cache misses and reduces performance. To that end, the invention provides an Engineering Computer Memory Integrated Management System that effectively overcomes this defect.



Examples


Embodiment 1

[0026] As shown in Figures 1-2, the CPU independent chip 100 of the present invention integrates a CPU core 110, a secondary cache 130, a memory access controller (MMU) 140, and four memory channels. The CPU core 110 contains a CPU execution mechanism 116, a first-level instruction cache 112 (i.e., the L1-I Cache), and a first-level data cache 114 (i.e., the L1-D Cache). The secondary cache 130 exchanges data directly with the CPU core 110, and the four memory channels (memory channel one 152, memory channel two 154, memory channel three 156, and memory channel four 158) communicate with the memory access controller MMU 140 and accept its management instructions.
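The component layout named in this embodiment can be sketched as a minimal data model. The field names and layout below are illustrative assumptions; only the components and reference numerals come from the text.

```python
from dataclasses import dataclass, field

@dataclass
class CpuCore:              # CPU core 110 (with execution mechanism 116)
    l1_icache: str = "L1-I"  # first-level instruction cache 112
    l1_dcache: str = "L1-D"  # first-level data cache 114

@dataclass
class CpuChip:              # CPU independent chip 100
    core: CpuCore = field(default_factory=CpuCore)
    l2_cache: str = "L2"    # secondary cache 130, exchanges data with the core
    mmu: str = "MMU"        # memory access controller 140
    # four memory channels (152, 154, 156, 158), all managed by the MMU
    channels: tuple = ("ch1", "ch2", "ch3", "ch4")

chip = CpuChip()
assert len(chip.channels) == 4
```

The split of the first-level cache into separate instruction and data caches mirrors the Harvard-style L1 organization described in paragraph [0027].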

[0027] The memory access controller MMU 140 exchanges data with the instruction and data filling mechanism of the CPU core 110. As shown in Figure 1, the first-level cache of the CPU independent chip 100 adopts a separate storage structure for instructions and data: instructions are stored in the first-l...



Abstract

The invention discloses an engineering computer memory integrated management system. The cache system comprises one or more cache lines, each containing one or more data words. Each cache line additionally carries a Hit identification bit indicating whether its data words are currently in use or merely prefetched. The management steps are: when a cache line is loaded, its Hit bit is set to 0; when a cache line is hit, its Hit bit is set to 1; when replacement is needed, cache lines whose Hit bit is 0 are replaced first, and only then cache lines whose Hit bit is 1. The invention ensures that the CPU system retains instruction and data prefetching capability, issuing requests in advance to fetch instructions and data from memory or other storage mechanisms back to the CPU before they are actually used, so that the operation speed is significantly improved.
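The Hit-bit policy described in the abstract can be sketched as a small simulation. The class and method names below are illustrative, not from the patent; the three rules (load sets the bit to 0, a hit sets it to 1, replacement prefers bit-0 lines) follow the abstract.

```python
from dataclasses import dataclass

@dataclass
class CacheLine:
    tag: int
    hit_bit: int = 0  # 0 = only loaded/prefetched, 1 = actually used

class HitBitCache:
    """Sketch of the Hit-bit replacement policy (illustrative names)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.lines: list[CacheLine] = []

    def load(self, tag: int) -> None:
        """Load (e.g. prefetch) a line: its Hit bit starts at 0."""
        if any(line.tag == tag for line in self.lines):
            return
        if len(self.lines) >= self.capacity:
            self._evict()
        self.lines.append(CacheLine(tag, hit_bit=0))

    def access(self, tag: int) -> bool:
        """CPU actually uses a line: on a hit, set its Hit bit to 1."""
        for line in self.lines:
            if line.tag == tag:
                line.hit_bit = 1
                return True
        self.load(tag)  # miss: fetch it; Hit bit stays 0 until reused
        return False

    def _evict(self) -> None:
        """Replace a Hit-bit-0 line first; only then a Hit-bit-1 line."""
        for i, line in enumerate(self.lines):
            if line.hit_bit == 0:
                del self.lines[i]
                return
        del self.lines[0]  # every line was used; fall back to the oldest

# Capacity-2 example: a prefetched-but-unused line is evicted before
# a line the CPU actually touched.
cache = HitBitCache(2)
cache.load(1)       # prefetch tag 1, Hit bit 0
cache.access(2)     # miss, tag 2 loaded with Hit bit 0
cache.access(2)     # hit, tag 2's Hit bit becomes 1
cache.load(3)       # evicts tag 1 (bit 0), keeps tag 2 (bit 1)
assert {line.tag for line in cache.lines} == {2, 3}
```

This is exactly the property the "Problems solved" section targets: prefetched content displaces other prefetched content before it displaces data the CPU has demonstrably used.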

Description

Technical field

[0001] The invention relates to a management algorithm for a computer cache system, and in particular to a management method for a CPU cache system.

Background technique

[0002] Currently, computer systems experience significant delays when accessing memory and other lower-level storage devices such as hard disks and network devices. Taking memory access as an example, it takes about 100 nanoseconds for data to arrive after the CPU issues an access command, which is roughly the time the CPU core needs to execute hundreds of instructions. Since the CPU's use of instructions and data follows certain patterns, these patterns can be used to predict the instructions and data the CPU will need and to prefetch them into the CPU in advance. In this way, when the CPU actually uses these instructions and data, it does not need to wait and can obtain these instructions and d...
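The latency argument above can be illustrated with a toy timing model. The 100 ns figure is from the text; the 1 ns-per-instruction cost and the assumption that a correct prefetch fully hides the miss latency are simplifications for illustration only.

```python
MEM_LATENCY_NS = 100  # approximate memory latency cited in the background

def runtime_ns(n_instructions: int, n_misses: int, prefetched: bool) -> int:
    """Toy model: each instruction costs 1 ns; a demand miss stalls the
    core for MEM_LATENCY_NS, while a correctly prefetched line arrives
    before it is needed and costs nothing extra."""
    stall = 0 if prefetched else n_misses * MEM_LATENCY_NS
    return n_instructions + stall

demand_fetch = runtime_ns(1000, 10, prefetched=False)  # 2000 ns
with_prefetch = runtime_ns(1000, 10, prefetched=True)  # 1000 ns
assert demand_fetch == 2 * with_prefetch
```

Even at only 10 misses per 1000 instructions, demand fetching doubles the runtime in this model, which is why the patent's concern about prefetches evicting useful lines (and thereby creating new misses) matters.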

Claims


Application Information

IPC(8): G06F9/50
CPC: G06F9/5016
Inventor 王能武
Owner 王能武