
A cache system and method

A cache and memory technology applied in the field of computing, capable of solving problems such as low utilization of the processor's arithmetic unit and the processor's computing power not being fully exploited.

Active Publication Date: 2019-10-01
SHANGHAI XINHAO MICROELECTRONICS

AI Technical Summary

Problems solved by technology

Since data load instructions and data store instructions do not involve arithmetic or logic operations, the greater the proportion of such instructions, the lower the utilization of the processor's arithmetic unit, and the less fully the processor's computing power can be exploited.
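
As a rough illustration of this point (the instruction mix and the short Python sketch below are illustrative assumptions, not figures from the patent), the utilization of the arithmetic unit can be estimated as the fraction of executed instructions that actually use it:

    # Hypothetical instruction mix; in this simplified model loads, stores
    # and branches are assumed not to use the arithmetic/logic unit.
    instruction_mix = {"alu": 600, "load": 250, "store": 100, "branch": 50}

    total = sum(instruction_mix.values())
    alu_utilization = instruction_mix["alu"] / total
    print(f"ALU utilization: {alu_utilization:.0%}")  # 60% for this mix

With 35% of the stream made up of loads and stores, at most 60% of issued instructions exercise the arithmetic unit in this toy model, so a larger load/store share directly lowers that utilization.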



Examples


Embodiment Construction

[0094] The cache system and method proposed by the present invention are further described in detail below in conjunction with the accompanying drawings and specific embodiments. The advantages and features of the present invention will be apparent from the following description and claims. It should be noted that the drawings are all in a greatly simplified form and use imprecise scales; they serve only to illustrate the embodiments of the present invention conveniently and clearly.

[0095] It should be noted that, in order to explain the content of the present invention clearly, the present invention cites multiple specific embodiments to illustrate its different implementations, and these embodiments are illustrative rather than exhaustive. In addition, for brevity of description, content already mentioned in an earlier embodiment is often omitted in a later embodiment; therefore, the content…



Abstract

The present invention provides a cache system and method which, when applied in the processor field, can fill instructions and the corresponding data into a high-speed memory directly accessible by the processor core before the processor core executes those instructions, without requiring the processor core to supply an instruction address or a data address. The high-speed memory is controlled directly, based on feedback information generated as the processor core executes instructions, to supply instructions or data to the processor core, so that the processor core can obtain the required instructions from the high-speed memory almost every time, achieving a very high cache hit rate. In addition, instruction issue is scheduled according to the program flow information stored in the cache system, and the time of instruction issue is determined according to the instruction sequence and distance information, thereby realizing multi-issue of instructions.
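
To make the mechanism summarized above easier to picture, the following Python sketch models it in a deliberately simplified form: a controller on the memory side, rather than the processor core, decides which instruction to supply next, steering the flow with the core's execution feedback (here, whether a branch was taken) instead of an address sent by the core. The class name, instruction format and feedback protocol are illustrative assumptions, not the patent's actual design.

    # Conceptual sketch only. A tiny "program" is stored on the memory side;
    # branch instructions carry their target index as program-flow information.
    program = [
        ("add", None), ("load", None), ("beq", 5),   # "beq" branches to index 5
        ("sub", None), ("store", None), ("mul", None), ("halt", None),
    ]

    class TrackController:
        """Pushes instructions to the core; the core never supplies an address."""
        def __init__(self, program):
            self.program = program
            self.pc = 0  # controller-side pointer into the program

        def next_instruction(self, feedback=None):
            # Feedback from the core steers the flow through stored branch targets.
            if feedback == "taken":
                _, target = self.program[self.pc - 1]
                self.pc = target
            instr = self.program[self.pc]
            self.pc += 1
            return instr

    ctrl = TrackController(program)
    feedback = None
    while True:
        op, _ = ctrl.next_instruction(feedback)
        print("core executes:", op)
        if op == "halt":
            break
        feedback = "taken" if op == "beq" else None  # toy feedback: branches always taken

The only point of the sketch is the direction of control: the core consumes whatever the controller pushes, and the controller combines the program-flow information it already holds (the stored branch target) with the core's feedback to choose the next instruction.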

Description

Technical field
[0001] The invention relates to the fields of computers, communications and integrated circuits.
Background technique
[0002] In a processor system that includes a pipeline, the pipeline is frequently stalled by factors such as cache misses, control hazards and data hazards, which degrades the performance of the processor system.
[0003] Cache misses generally fall into three categories: compulsory misses, conflict misses and capacity misses. Using a set-associative cache structure and increasing the number of ways can reduce conflict misses, but power consumption and speed limit this approach (for example, because a multi-way set-associative structure requires the tags of all ways addressed by the same index to be read and compared simultaneously), so the number of ways is difficult to increase beyond a certain point. Existing cache prefetching techniques can usually alleviate only some of the conflict-miss and capacity-…
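
The cost mentioned above, that a multi-way set-associative structure must read and compare the tags of all ways selected by one index at the same time, can be seen in the minimal Python sketch of an N-way set-associative lookup below. The parameters (4 ways, 16 sets, 64-byte lines) and the trivial replacement policy are arbitrary choices for illustration, not values taken from the patent.

    WAYS, SETS, LINE = 4, 16, 64

    class SetAssociativeCache:
        def __init__(self):
            # tags[set][way] holds the tag stored in that way, or None if empty
            self.tags = [[None] * WAYS for _ in range(SETS)]

        def access(self, addr):
            index = (addr // LINE) % SETS
            tag = addr // (LINE * SETS)
            ways = self.tags[index]
            # In hardware these comparisons happen in parallel across all ways,
            # which is why power and access time grow with the number of ways.
            for stored in ways:
                if stored == tag:
                    return "hit"
            # Miss: fill an empty way, or evict way 0 (conflict/capacity pressure).
            victim = ways.index(None) if None in ways else 0
            ways[victim] = tag
            return "miss"

    cache = SetAssociativeCache()
    print(cache.access(0x1000))  # compulsory miss: first reference to this line
    print(cache.access(0x1000))  # hit: the line is now resident

The loop over ways corresponds to a bank of parallel tag comparators in hardware, which is the power and timing cost that makes it hard to increase the number of ways beyond a certain point.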


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/38; G06F12/0893
CPC: G06F9/3455; G06F9/3806; G06F9/3808; G06F9/382; G06F9/383; G06F9/3836; G06F12/0875; G06F2212/452; Y02D10/00; G06F9/30043; G06F9/30058; G06F9/3802; G06F9/3804; G06F9/3869; G06F12/00
Inventor: 林正浩
Owner: SHANGHAI XINHAO MICROELECTRONICS