
Cache system and method

A cache and cache-addressing technology, applied in the computer field, that addresses the problems of low utilization of the processor's execution units and under-exploited processor computing capability.

Active Publication Date: 2015-10-14
SHANGHAI XINHAO MICROELECTRONICS

AI Technical Summary

Problems solved by technology

Since data-load and data-store instructions do not involve arithmetic or logic operations, the greater the proportion of such instructions in a program, the lower the utilization of the processor's execution units, and the less fully the processor's computing power can be exploited.
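As an illustrative back-of-envelope bound (the instruction mix is an assumed number, not from the patent): in a scalar pipeline where every instruction occupies an issue slot but only non-memory instructions use the execution unit, the load/store fraction directly caps execution-unit utilization.

```python
def alu_utilization_bound(load_store_fraction):
    """Upper bound on execution-unit utilization when load/store
    instructions occupy issue slots but do no arithmetic/logic work."""
    return 1.0 - load_store_fraction

# Assumed mix of ~30% loads/stores (a common textbook figure):
print(alu_utilization_bound(0.30))  # 0.7
```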




Detailed Description of Embodiments

[0094] The cache system and method proposed by the present invention are further described in detail below in conjunction with the accompanying drawings and specific embodiments. The advantages and features of the present invention will become apparent from the following description and claims. It should be noted that the drawings are all in very simplified form and use imprecise scales, serving only to conveniently and clearly illustrate the embodiments of the present invention.

[0095] It should be noted that, in order to clearly illustrate the content of the present invention, multiple embodiments are specifically cited to further explain different implementations of the present invention, where these multiple embodiments are enumerative rather than exhaustive. In addition, for brevity of description, content mentioned in an earlier embodiment is often omitted in a later one; therefore, the conten...



Abstract

Provided are a cache system and method applied in the field of processors. Before the processor core executes an instruction, the instruction and its corresponding data are stored in a high-speed memory directly accessible by the processor core, without the processor core having to provide an instruction address or a data address. Based on feedback information generated as the processor core executes instructions, the high-speed memory is directly controlled to supply instructions or data to the processor core, so that the processor core obtains the needed instruction from the high-speed memory almost every time, achieving an extremely high cache hit rate. In addition, instruction transmission is further scheduled according to program-flow information stored in the cache system, and instruction issue timing is determined according to instruction order and distance information, enabling multiple issue of instructions.
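The feedback-driven instruction supply described in the abstract can be sketched as a toy model. All names, the shape of the program-flow table, and the tiny instruction set below are illustrative assumptions, not the patented implementation:

```python
# Toy model (assumed structure, not the patented design): instructions are
# pre-stored in directly accessible memory together with program-flow info;
# the controller supplies each next instruction itself, advancing on the
# core's branch-outcome feedback, so the core never sends a fetch address.

class CacheController:
    def __init__(self, memory, track):
        self.memory = memory  # addr -> instruction (high-speed memory, prefilled)
        self.track = track    # addr -> (fallthrough_addr, branch_target or None)
        self.pc = 0           # controller-side program counter

    def supply(self):
        """Hand the core the instruction at the current program-flow point."""
        return self.memory[self.pc]

    def feedback(self, branch_taken):
        """Advance along the stored program flow using the core's feedback."""
        fallthrough, target = self.track[self.pc]
        self.pc = target if (branch_taken and target is not None) else fallthrough

# A 4-instruction program: the branch at address 1 jumps to 3 when taken.
memory = {0: "LOAD r1", 1: "BEQ r1, 3", 2: "ADD r2", 3: "HALT"}
track  = {0: (1, None), 1: (2, 3), 2: (3, None), 3: (3, None)}

ctrl = CacheController(memory, track)
executed = []
for taken in [False, True]:       # branch outcomes reported by the core
    executed.append(ctrl.supply())
    ctrl.feedback(taken)
executed.append(ctrl.supply())
print(executed)                   # ['LOAD r1', 'BEQ r1, 3', 'HALT']
```

Because the controller, not the core, tracks the program flow, a fetch address never crosses from core to cache in this sketch; the core's only contribution is the branch outcome.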

Description

Technical Field

[0001] The invention relates to the fields of computers, communications and integrated circuits.

Background Technique

[0002] In a processor system containing a pipeline, the pipeline is frequently stalled by factors such as cache misses, control hazards, and data hazards, degrading the performance of the processor system.

[0003] Cache misses generally fall into three categories: compulsory misses, conflict misses, and capacity misses. Using a set-associative cache structure and increasing the number of ways can reduce conflict misses, but this is limited by power consumption and speed (for example, because a multi-way set-associative cache structure requires all ways addressed by the same index to have their tags read and compared simultaneously), so the number of ways is difficult to grow beyond a certain limit. Existing cache prefetching techniques can usually solve some of the problems of conflict misses and capac...
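The set-associative limitation mentioned in paragraph [0003] can be made concrete with a generic textbook sketch. The field widths and data layout are assumptions for illustration, not this invention's design:

```python
# Generic N-way set-associative lookup (textbook structure, not the patent's
# design). All ways of the indexed set are addressed by the same index and
# their tags compared simultaneously in hardware; the loop below models that
# parallel compare, and shows why power grows with the way count: every way
# is read on every access. Assumed geometry: 64-byte lines, 128 sets, 2 ways.

def split_address(addr, offset_bits=6, index_bits=7):
    """Split an address into (tag, index, offset) fields."""
    offset = addr & ((1 << offset_bits) - 1)
    index = (addr >> offset_bits) & ((1 << index_bits) - 1)
    tag = addr >> (offset_bits + index_bits)
    return tag, index, offset

def lookup(sets, addr):
    """Return ('hit', way) or ('miss', None) for the given address."""
    tag, index, _ = split_address(addr)
    for way, stored_tag in enumerate(sets[index]):  # modelled-parallel compare
        if stored_tag is not None and stored_tag == tag:
            return "hit", way
    return "miss", None

sets = [[None, None] for _ in range(128)]  # store only tags, for brevity
addr = (5 << 13) | (3 << 6)                # tag 5, set index 3, offset 0
sets[3][0] = 5                             # fill way 0 of set 3
print(lookup(sets, addr))                  # ('hit', 0)
print(lookup(sets, (6 << 13) | (3 << 6)))  # ('miss', None): same set, other tag
```

Two addresses with different tags but the same index (the last two lines) compete for the same set's ways; once such addresses outnumber the ways, conflict misses occur, which is why adding ways helps until power and tag-compare speed cap the way count.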


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08
CPC: G06F9/3455; G06F9/3806; G06F9/3808; G06F9/382; G06F9/383; G06F9/3836; G06F12/0875; G06F2212/452; Y02D10/00; G06F9/30043; G06F9/30058; G06F9/3802; G06F9/3804; G06F9/3869; G06F12/00
Inventor: 林正浩
Owner: SHANGHAI XINHAO MICROELECTRONICS