
Chip instruction and data pushing device of embedded processor

An instruction and data pushing technology, applied in electrical digital data processing, instruments, memory systems, etc. It addresses the problems of processor performance loss, reduced prefetch timeliness and increased data transmission traffic, with the effects of eliminating Cache pollution, resolving access conflicts and reducing bus traffic.

Status: Inactive · Publication Date: 2011-06-15
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

However, this device causes the higher-level storage system to issue too many access operations to the lower-level storage system. In a Harvard-architecture storage system, this leads to conflicts when the first-level instruction Cache and the first-level data Cache access the second-level mixed Cache at the same time, which reduces the timeliness of data prefetching and increases the data traffic on the bus. Reduced prefetch timeliness costs processor performance, and when the loss is severe it can offset the performance gain brought by prefetching.



Examples


Embodiment Construction

[0015] Referring to Figures 1 to 4, the instruction and data pushing device of the present invention comprises an instruction unit, a Load/Store unit, a first-level instruction Cache, a first-level data Cache, a miss queue, a miss queue/write-back queue, a second-level mixed Cache, a push address storage and generation unit, a push timing control unit, an instruction push buffer and a data push buffer. The push address storage and generation unit contains an instruction push address register, a data forward push address register and a data reverse push address register; it stores and computes the address of the next instruction or data item to be pushed.
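The register set and next-address calculation described in [0015] can be pictured with a small software model. The sketch below is a minimal Python illustration under stated assumptions, not the patented hardware: the class and register names, the cache-line granularity and the forward/reverse stepping rule are assumptions introduced for the example.

```python
# Minimal software model of the push address storage and generation unit of
# [0015]. Register names, the line size and the stepping rule are illustrative
# assumptions, not details taken from the patent text.

CACHE_LINE_BYTES = 32  # assumed cache-line size for the example


class PushAddressUnit:
    """Holds the three push address registers and derives the next
    instruction or data address to be pushed toward the first-level caches."""

    def __init__(self):
        self.instr_push_addr = 0     # instruction push address register
        self.data_fwd_push_addr = 0  # data forward push address register
        self.data_rev_push_addr = 0  # data reverse push address register

    def load(self, instr_addr, data_addr):
        """Seed the push streams from the current demand addresses."""
        self.instr_push_addr = instr_addr
        self.data_fwd_push_addr = data_addr
        self.data_rev_push_addr = data_addr

    def next_instruction_push(self):
        """Advance the instruction stream by one cache line and return it."""
        self.instr_push_addr += CACHE_LINE_BYTES
        return self.instr_push_addr

    def next_data_push(self, direction="forward"):
        """Advance the data stream one cache line forward or backward."""
        if direction == "forward":
            self.data_fwd_push_addr += CACHE_LINE_BYTES
            return self.data_fwd_push_addr
        self.data_rev_push_addr -= CACHE_LINE_BYTES
        return self.data_rev_push_addr


if __name__ == "__main__":
    unit = PushAddressUnit()
    unit.load(instr_addr=0x1000, data_addr=0x8000)
    print(hex(unit.next_instruction_push()))    # 0x1020
    print(hex(unit.next_data_push("forward")))  # 0x8020
    print(hex(unit.next_data_push("reverse")))  # 0x7fe0
```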

[0016] After these devices are added, the overall signal flow is as follows:

[0017] When the instruction unit needs to fetch an instruction, it sends the address to the instruction push buffer and the first-level instruction Cache at the same time. These two devices re...
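Paragraph [0017] is cut off in the source, but the step it does describe, broadcasting the fetch address to the instruction push buffer and the first-level instruction Cache in parallel, can be sketched as below. How a hit or a dual miss is handled afterwards is an assumption made only for illustration.

```python
# Sketch of the fetch path in [0017]: the instruction unit sends the same
# address to the instruction push buffer and the first-level instruction Cache
# at the same time. The behaviour on a dual miss (falling through to the
# second-level mixed Cache) is assumed, since the source paragraph is truncated.

def fetch_instruction(addr, push_buffer, l1_icache, l2_cache):
    """Look up both first-level structures in parallel; only a miss in both
    needs to go down to the second-level mixed Cache."""
    if addr in push_buffer:
        # A pushed line is served from the buffer, so it never displaces a
        # demand-fetched line in the L1 Cache (no Cache pollution).
        return push_buffer[addr]
    if addr in l1_icache:
        return l1_icache[addr]
    # Assumed behaviour: a miss in both structures falls through to L2.
    return l2_cache.get(addr)


if __name__ == "__main__":
    push_buf = {0x1020: "pushed line"}
    l1 = {0x1000: "cached line"}
    l2 = {0x2000: "L2 line"}
    print(fetch_instruction(0x1020, push_buf, l1, l2))  # served by the push buffer
    print(fetch_instruction(0x1000, push_buf, l1, l2))  # served by the L1 instruction Cache
    print(fetch_instruction(0x2000, push_buf, l1, l2))  # assumed fall-through to L2
```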



Abstract

The invention discloses a chip instruction and data pushing device for an embedded processor, which is used for solving the technical problem of high data traffic on the bus in the prior art. The technical scheme is as follows: a push address storage and generation unit and a push timing control unit are added on the second-level mixed Cache side; an instruction push Buffer is added on the first-level instruction Cache side; and a data push Buffer is added on the first-level data Cache side. Because of the push address storage and generation unit and the push timing control unit, the initiative in scheduling instructions and data is transferred from the high-level memory system to the low-level memory system; because the low-level memory system can schedule accesses effectively, the access conflict problem of the first-level Caches is solved, the first-level Caches no longer need to send a large number of access requests, and the data traffic on the bus is reduced. The addition of the instruction push Buffer and the data push Buffer eliminates the Cache pollution caused by pushing.
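To make the scheme concrete, the following sketch models, under stated assumptions, how a push timing control unit on the second-level Cache side might decide when to move a line into a first-level-side push Buffer. The "port is idle" trigger, the buffer capacity and the replacement of the oldest pushed entry are assumptions made for illustration; the abstract only states that the low-level memory system gains the scheduling initiative.

```python
# Illustrative model of the L2-side push timing control described in the
# abstract: the second-level mixed Cache, not the first-level Caches, decides
# when a line is transferred. The idle-port trigger and the buffer capacity
# are assumptions for the example.

from collections import OrderedDict

PUSH_BUFFER_ENTRIES = 4  # assumed capacity


class PushBuffer:
    """Small buffer next to a first-level Cache that receives pushed lines,
    so pushes never evict demand-fetched lines from the Cache itself."""

    def __init__(self, entries=PUSH_BUFFER_ENTRIES):
        self.entries = entries
        self.lines = OrderedDict()

    def accept(self, addr, line):
        if len(self.lines) >= self.entries:
            self.lines.popitem(last=False)  # drop the oldest pushed line
        self.lines[addr] = line

    def has_room(self):
        return len(self.lines) < self.entries


def push_timing_control(l2_port_idle, push_buffer, next_push_addr, l2_cache):
    """Push only when the L2 port is idle and the target buffer has room,
    so demand accesses from the first-level Caches are never delayed."""
    if not (l2_port_idle and push_buffer.has_room()):
        return False
    if next_push_addr in l2_cache:
        push_buffer.accept(next_push_addr, l2_cache[next_push_addr])
        return True
    return False


if __name__ == "__main__":
    buf = PushBuffer()
    l2 = {0x1020: "line @ 0x1020"}
    pushed = push_timing_control(True, buf, 0x1020, l2)
    print(pushed, dict(buf.lines))  # True, the line now sits in the push buffer
```

Because the push decision is taken on the second-level side, the first-level Caches never have to issue the request themselves, which is where the claimed reduction in bus traffic comes from.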

Description

Technical Field

[0001] The invention relates to an instruction and data pushing device, and in particular to an instruction and data pushing device inside an embedded processor chip.

Background Art

[0002] Document 1 (US Patent No. 5778423) discloses an instruction prefetching device that prefetches new instructions by predicting branch targets, which can improve performance. However, this method increases the number of requests from the upper-level storage system to the lower-level storage system, increases the data traffic, and is affected by branch misprediction. The patent also mentions that unused bits in standard-length instructions can be used to mark addresses that may be jumped to; this requires modifying the format of some instructions, which needs both hardware support and software support.

[0003] Document 2 (US Patent No. 7246205B2) discloses a device for pushing a Cache, which can determine when to use the push Cache oper...
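For contrast with the pushing scheme, the fragment below sketches the kind of demand-side prefetch attributed to Document 1: the fetch side predicts a branch target and issues an extra request for it, which is exactly what increases the number of requests to the lower-level storage system and what a misprediction wastes. The predictor interface and the request queue are assumptions used only to illustrate that drawback, not Document 1's actual design.

```python
# Sketch of the prefetch style attributed to Document 1 in [0002]: the upper
# level predicts a branch target and issues an additional request to the
# lower-level storage system. The toy predictor and request list are
# illustrative assumptions.

def fetch_with_branch_prefetch(pc, predict_target, lower_level_requests):
    """Issue the demand fetch, then an extra prefetch for the predicted
    branch target; a misprediction makes the second request useless traffic."""
    lower_level_requests.append(pc)           # demand fetch
    target = predict_target(pc)
    if target is not None:
        lower_level_requests.append(target)   # speculative prefetch
    return lower_level_requests


if __name__ == "__main__":
    requests = []
    # Assumed toy predictor: addresses on a 0x40 boundary branch to pc + 0x40.
    predictor = lambda pc: pc + 0x40 if pc % 0x40 == 0 else None
    fetch_with_branch_prefetch(0x40, predictor, requests)
    print([hex(a) for a in requests])  # ['0x40', '0x80']: two lower-level requests for one fetch
```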


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/08; G06F12/0897
Inventors: 高德远郑乔石田杭沛樊晓桠张盛兵王党辉魏廷存黄小平张萌郑然
Owner: NORTHWESTERN POLYTECHNICAL UNIV