
An instruction fetching method, device, electronic equipment and storage medium

Technology: instruction cache and caching techniques, applied in machine execution devices, program-control design, electrical digital data processing, etc. Problems addressed: reduced instruction fetch efficiency and degraded processor performance. Effects achieved: improved instruction fetch efficiency, optimized switch-prediction, and improved processor performance.

Active Publication Date: 2022-06-21
CHENGDU HAIGUANG MICROELECTRONICS TECH CO LTD

Problems solved by technology

However, if the processor blindly switches to the OC path instruction fetch mode but cannot fetch a sufficient number of instructions there, its performance is significantly hurt and instruction fetch efficiency is reduced instead of improved.




Embodiment Construction

[0026] The embodiments of the present invention will be described in detail below with reference to the accompanying drawings.

[0027] It should be understood that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the present invention.

[0028] At present, there are two ways to fetch an instruction: one is to fetch it via the IC path, the other via the OC path. A typical instruction fetch proceeds as follows:

[0029] 1) Obtain, from the buffer, the physical address (i.e., the instruction address) of the instruction to be fetched;

[0030] 2) Based on the physical address just obtained, predict whether the instruction should be read through the IC path (that is, enter the IC path instru...
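The two-step flow above (buffer the physical address, then predict the fetch path) can be sketched as follows. This is an illustrative model only: the class and method names (`FetchUnit`, `predict_path`) and the trivial "use OC only if it hit before" policy are assumptions for exposition, not the patent's mechanism.

```python
from collections import deque

class FetchUnit:
    """Hypothetical sketch of the buffered-address fetch flow."""

    def __init__(self):
        self.address_buffer = deque()   # step 1: buffered physical addresses
        self.oc_hit_history = {}        # per-address record of prior OC hits

    def push_address(self, phys_addr):
        """Step 1: buffer the physical address of the instruction to fetch."""
        self.address_buffer.append(phys_addr)

    def predict_path(self, phys_addr):
        """Step 2: predict IC path or OC path for this address.
        Trivial illustrative policy: take the OC path only if this
        address has hit the OC before; otherwise stay on the IC path."""
        return "OC" if self.oc_hit_history.get(phys_addr, False) else "IC"

    def fetch_next(self):
        """Pop the oldest buffered address and decide its fetch path."""
        phys_addr = self.address_buffer.popleft()
        return phys_addr, self.predict_path(phys_addr)
```

A real predictor would be far richer (history tables, saturating counters), but the sketch shows where the prediction sits relative to the address buffer.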


Abstract

Embodiments of the present invention disclose an instruction fetching method, device, electronic equipment, and storage medium, relate to the technical field of processors, and can overcome the drawbacks of low instruction fetch efficiency and high power consumption in the prior art. The method includes: predicting, based on the current fetch address in the IC path fetch mode, whether to switch to the OC path fetch mode; looking up, in an information collection device, the confidence level associated with that fetch address for switching from the current fetch mode to the other fetch mode; determining from the lookup result whether the current prediction needs to be calibrated; calibrating or keeping the current prediction accordingly; and entering the corresponding fetch mode based on the resulting prediction to perform instruction fetching. The invention optimizes the existing IC/OC path fetch-mode switch prediction technique and is applicable to scenarios aiming to improve processor instruction fetch efficiency and reduce pipeline front-end power consumption.
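The abstract's "predict, look up confidence, calibrate" sequence can be illustrated with a small sketch. The counter width, threshold, and all names here are assumptions chosen for clarity; the patent does not specify this particular structure.

```python
CONF_BITS = 2
CONF_MAX = (1 << CONF_BITS) - 1   # assumed 2-bit saturating counter: 0..3
CONF_THRESHOLD = 2                # assumed: switch only at confidence >= 2

class SwitchPredictor:
    """Hypothetical confidence-based calibration of a switch prediction."""

    def __init__(self):
        self.confidence = {}  # fetch address -> saturating confidence counter

    def calibrate(self, fetch_addr, raw_prediction):
        """If the raw prediction says 'switch fetch modes', allow the switch
        only when the confidence recorded for this address clears the
        threshold; otherwise calibrate the prediction back to 'stay'."""
        if not raw_prediction:
            return False  # no switch predicted, nothing to calibrate
        return self.confidence.get(fetch_addr, 0) >= CONF_THRESHOLD

    def update(self, fetch_addr, switch_was_beneficial):
        """Train: saturating increment when a switch paid off, decrement
        when it did not."""
        conf = self.confidence.get(fetch_addr, 0)
        conf = min(conf + 1, CONF_MAX) if switch_was_beneficial else max(conf - 1, 0)
        self.confidence[fetch_addr] = conf
```

The calibration step is what guards against the "blind switch" problem from the background section: a switch prediction with no track record at this address is overridden.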

Description

Technical field

[0001] The present invention relates to the technical field of processors, and in particular to an instruction fetching method, apparatus, electronic device, and storage medium.

Background technique

[0002] Currently, in processor design, an IC (Instruction Cache) is a cache device that stores instruction data. After the processor reads instruction data from the IC, it passes that data to the decoding module for instruction decoding, and the decoding module then forwards the result to the corresponding instruction execution unit. With the development of the technology, some product designs introduce a Micro-Op Cache (OC), a new type of cache device that stores instruction data after decoding. Based on the above design, as shown in Figure 1, after obtaining the physical address (i.e., the instr...
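The background's key distinction, that the IC holds raw instruction bytes which still need decoding while the OC holds already-decoded micro-ops, can be modeled in a few lines. All names here (`decode`, `fetch_via_ic`, `fetch_via_oc`) and the string-based micro-op encoding are illustrative assumptions, not the patent's interfaces.

```python
def decode(raw_bytes):
    """Stand-in decoder: turn raw instruction bytes into micro-ops."""
    return [f"uop({b})" for b in raw_bytes]

def fetch_via_ic(ic, addr):
    """IC path: read raw instruction bytes, then pay the decode cost."""
    return decode(ic[addr])

def fetch_via_oc(oc, addr):
    """OC path: micro-ops were stored post-decode, so the decode stage
    is skipped entirely; this is the power/latency win the OC offers."""
    return oc[addr]
```

Both paths deliver the same micro-ops to the execution units; the OC path simply bypasses the decode stage, which is why predicting correctly when the OC can actually serve the fetch matters.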

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/30
CPC: G06F9/30047
Inventor: 张克松
Owner: CHENGDU HAIGUANG MICROELECTRONICS TECH CO LTD