
A Data Flow Graph Congestion Detection Method Based on Execution Efficiency Gradient Prediction

A data flow graph and execution efficiency technology, applied to processing data according to predetermined rules. It addresses problems such as large network bandwidth requirements, fierce competition for acceleration resources, and increased bandwidth demand, and achieves the effect of data flow graph optimization.

Active Publication Date: 2020-07-28
北京中科通量科技有限公司

AI Technical Summary

Problems solved by technology

The disadvantage of this method is that the special nodes must detect too many other nodes, which creates a large demand for network bandwidth.
With the proposed method, the acceleration mechanism can be fully utilized and acceleration resources allocated reasonably; if a node detects only its own congestion, it is likely to cause excessive competition for acceleration resources, increased bandwidth requirements, and reduced execution efficiency.




Detailed Description of the Embodiments

[0041] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0042] The difference between a network on chip and a data flow architecture is that each node of the data flow graph in a data flow architecture (each node represents one program instruction) must execute on its input data and forward the result to other nodes after execution, whereas a node (router) of an on-chip network does not necessarily execute the data it receives and may simply forward it to the next router.
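The contrast in paragraph [0042] can be sketched in code. This is a minimal illustration, not the patent's implementation; all class and method names here are invented for the example.

```python
# Hypothetical sketch: a data flow graph node always executes on its input
# before forwarding, while an on-chip network router may forward a packet
# without processing it.

class DataflowNode:
    """A node in a data flow graph: represents one program instruction."""
    def __init__(self, op, successors):
        self.op = op                  # the instruction this node represents
        self.successors = successors  # downstream nodes that receive results

    def receive(self, data):
        result = self.op(data)        # a dataflow node MUST execute its input
        for nxt in self.successors:   # ...and only then forward the result
            nxt.receive(result)

class NocRouter:
    """A router in a network on chip: execution of the payload is optional."""
    def __init__(self, successors):
        self.successors = successors

    def receive(self, packet):
        # A router does not necessarily execute the payload; it may simply
        # forward the packet unchanged to the next hop.
        for nxt in self.successors:
            nxt.receive(packet)
```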

[0043] The mapping process of the ...



Abstract

The present invention discloses a data flow graph congestion detection method based on execution efficiency gradient prediction. The method comprises: setting a timer and an instruction counter at each node in a data flow graph; setting a management node for the data flow graph; setting a first information record table at the management node, in which the ID, execution rate v, execution rate change rate s, predicted execution rate vn, and queue mark k of each node are recorded; setting a second information record table at each node; performing congestion detection at each node by calculating its execution rate v, execution rate change rate s, and predicted execution rate vn; and sending the detected v, s, and vn together with the node's ID and queue mark k to the management node, so that the management node updates the first information record table and sends congestion information about adjacent nodes back to each node, which updates its second information record table accordingly.
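The measurement loop in the abstract can be sketched as follows. The linear extrapolation vn = v + s·dt and the congestion threshold are assumptions for illustration; the abstract names the quantities v, s, vn, and k but does not publish the exact prediction formula here.

```python
# Minimal sketch of the per-node measurement described in the abstract:
# each node samples its instruction counter against its timer to obtain
# an execution rate v, differences successive samples to obtain the rate
# of change s, and extrapolates a predicted rate vn, then reports
# (ID, v, s, vn, k) to the management node.

class NodeMonitor:
    def __init__(self, node_id, queue_mark=0):
        self.node_id = node_id
        self.queue_mark = queue_mark   # queue mark k from the abstract
        self.prev_count = 0
        self.prev_time = 0.0
        self.prev_rate = None

    def sample(self, instr_count, now):
        """Return (ID, v, s, vn, k) for reporting to the management node."""
        dt = now - self.prev_time
        v = (instr_count - self.prev_count) / dt        # execution rate v
        s = 0.0 if self.prev_rate is None else (v - self.prev_rate) / dt
        vn = v + s * dt                                 # assumed linear prediction
        self.prev_count, self.prev_time, self.prev_rate = instr_count, now, v
        return (self.node_id, v, s, vn, self.queue_mark)

class ManagementNode:
    """Keeps the first information record table: ID -> (v, s, vn, k)."""
    def __init__(self):
        self.table = {}

    def report(self, node_id, v, s, vn, k):
        self.table[node_id] = {"v": v, "s": s, "vn": vn, "k": k}

    def congested(self, node_id, threshold):
        # Assumption: a node whose predicted rate falls below a threshold
        # is flagged as congested so its neighbours can be notified.
        return self.table[node_id]["vn"] < threshold
```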

Description

Technical field

[0001] The invention relates to the technical field of data flow architecture, in particular to a data flow graph congestion detection method based on execution efficiency gradient prediction.

Background technique

[0002] With the advent of the era of big data, the amount and scale of data that computers need to process is increasing. The data flow architecture has attracted people's attention due to its high instruction parallelism and low memory access frequency. In a dataflow architecture, program code can be transformed into a dataflow graph. A data flow graph is a directed graph in which each node represents a piece of code in a program and directed lines represent the flow of data between nodes. In the data flow graph, the execution rates of different nodes are different, which will cause the congestion of the data flow graph. In order to alleviate the congestion of the data flow graph, methods such as multi-context and key node replication can be us...
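The congestion the background describes, where a slow node's input queue grows because an upstream node produces results faster than the slow node consumes them, can be shown with a toy simulation. The graph, the per-tick rates, and the tick count are made up for this example.

```python
# Toy illustration of data flow graph congestion: nodes execute at
# different rates, so tokens pile up in the input queue of a slow node
# fed by a faster upstream node. This is the condition the patent's
# method sets out to detect.

from collections import deque

class GraphNode:
    def __init__(self, name, per_tick):
        self.name = name
        self.per_tick = per_tick      # instructions executed per tick
        self.inbox = deque()          # input queue of pending tokens
        self.successors = []

    def tick(self):
        # Execute up to per_tick queued tokens, forwarding each result.
        for _ in range(min(self.per_tick, len(self.inbox))):
            item = self.inbox.popleft()
            for nxt in self.successors:
                nxt.inbox.append(item)

fast = GraphNode("fast", per_tick=4)
slow = GraphNode("slow", per_tick=1)
fast.successors.append(slow)

for _ in range(10):
    for _ in range(4):                # source feeds 4 tokens per tick
        fast.inbox.append(0)
    fast.tick()
    slow.tick()

# fast keeps up with the source, but slow falls 3 tokens behind per tick,
# so its input queue grows steadily: that backlog is the congestion.
```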

Claims


Application Information

Patent Timeline: no application data
Patent Type & Authority: Patent (China)
IPC (8): G06F7/78
CPC: G06F7/78
Inventor: 欧焱, 张浩, 范东睿, 谭旭, 马丽娜
Owner: 北京中科通量科技有限公司