
Adaptive scheduling with dynamic partition load balancing for fast partition compilation

A load-averaging and partitioning technology, applied in code compilation, program code conversion, instrumentation, and related areas, that addresses problems such as stalled compilation processing, resource-allocation choices that are not obvious to the user, and inefficient use of computing resources

Pending Publication Date: 2022-03-01
SYNOPSYS INC

AI Technical Summary

Problems solved by technology

While using multiple partitions allows tools to manage complex and parallelized workloads, for designs that are relatively large and particularly complex, the resulting resource usage and overall performance are often not optimal.
[0005] Therefore, attempts to parallelize workloads, even with distributed cloud-based systems, can result in inefficient use of computing resources, adversely affecting other processing that shares the same pool of computing resources.
This impact can include a slowdown in overall compilation time, or even a deadlock that stalls the compilation process.
However, it is often not obvious to the tool's user how computing resources should be allocated to complex compilation jobs to improve compilation efficiency and speed.
[0006] A user's attempts to improve performance by changing compilation parameters may affect performance metrics unpredictably, or may degrade performance even further.
Similarly, traditional automation techniques lack sufficient maturity and produce similarly problematic results.

Method used



Examples


Embodiment Construction

[0025] Aspects of this disclosure relate to adaptive scheduling with fast partition compilation. A further aspect of the disclosure relates to dynamic partition load balancing for fast partition compilation.

[0026] As mentioned above, a hardware design specification or description can be divided into multiple portions, which may then be compiled separately. Portions of a hardware design specification or description may be referred to as partitions. Using multiple partitions allows EDA tools to manage complex and parallelized workloads. When traditional EDA tools are used to manage such workloads, however, the resulting resource usage and overall performance can be problematic for designs that are relatively large and particularly complex. Additionally, in the presence of time constraints and/or resource constraints, particular compilations of large or complex designs may not complete in time, delaying schedules and project delivery.

[0027] Various aspe...


Abstract

Methods, systems, and computer-readable storage medium embodiments are disclosed herein for adaptive scheduling with dynamic partition load balancing for fast partition compilation. One embodiment includes detecting, by at least one processor, an amount of available hardware resources usable by an electronic design automation (EDA) process for a design description via a plurality of computing elements, and analyzing the design description to generate an estimate of the amount of hardware resources to be used by the EDA process. In some further embodiments, the at least one processor may compare the estimate to the amount of available hardware resources and adjust a memory allocation for the EDA process or a specified number of computing elements of the plurality of computing elements used in parallel by the EDA process. Further, according to some additional embodiments, the at least one processor may calculate a weighted load average for the plurality of computing elements.
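
To make the workflow in the abstract concrete, below is a minimal Python sketch of the detect-estimate-compare-adjust loop and the weighted load average it describes. All of the names here (Resources, estimate_resources, adapt_schedule, weighted_load_average) and the heuristics inside them are illustrative assumptions for exposition, not the patent's actual implementation.

```python
# Hypothetical sketch of the adaptive scheduling flow described in the abstract.
# Names and heuristics are illustrative assumptions, not the patented method.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Resources:
    memory_gb: float       # memory budget in gigabytes
    compute_elements: int  # number of parallel computing elements


def estimate_resources(partitions: List[Dict[str, float]]) -> Resources:
    """Analyze the design description (here, per-partition estimates) to
    estimate the hardware resources the EDA process is expected to use."""
    memory = sum(p.get("estimated_memory_gb", 1.0) for p in partitions)
    return Resources(memory_gb=memory, compute_elements=len(partitions))


def weighted_load_average(loads: List[float], weights: List[float]) -> float:
    """Weighted load average across computing elements, e.g. weighting
    busier or more recently sampled elements more heavily."""
    total = sum(weights)
    return sum(l * w for l, w in zip(loads, weights)) / total


def adapt_schedule(available: Resources,
                   partitions: List[Dict[str, float]]) -> Resources:
    """Compare the estimate against the detected available resources and clamp
    the memory allocation and number of parallel computing elements to fit."""
    estimate = estimate_resources(partitions)
    return Resources(
        memory_gb=min(estimate.memory_gb, available.memory_gb),
        compute_elements=min(estimate.compute_elements, available.compute_elements),
    )


if __name__ == "__main__":
    partitions = [{"estimated_memory_gb": 4.0}, {"estimated_memory_gb": 8.0}]
    available = Resources(memory_gb=10.0, compute_elements=8)
    print(adapt_schedule(available, partitions))              # clamped allocation
    print(weighted_load_average([0.6, 0.9, 0.3], [1, 2, 1]))  # 0.675
```

In this sketch the adjustment is a simple clamp to the detected budget; the weighted load average is shown separately as the signal a scheduler could use to decide when to rebalance partitions across computing elements.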

Description

[0001] Cross-Reference to Related Applications
[0002] This application claims the benefit of U.S. Provisional Patent Application No. 63/072,401, entitled "Adaptive Scheduling with Dynamic Load Balancing for Fast Partition Compilation," filed August 31, 2020, which is incorporated herein by reference in its entirety.
Technical Field
[0003] The present invention generally relates to improving the performance of electronic design automation (EDA) tools. More specifically, the present invention relates to improving the efficiency and speed of partition compilation from a design specification.
Background
[0004] Modern EDA tools may allow multiple partitions to be used to compile a hardware design specification or description (e.g., from a higher-level hardware description language). While using multiple partitions allows tools to manage complex and parallelized workloads, for designs that can be relatively large and particularly complex, the resulting reso...

Claims


Application Information

IPC(8): G06F8/41
CPC: G06F8/41; G06F9/5044; G06F2209/503; G06F9/505; G06F9/5016; G06F9/4881
Inventor: A. Kumar
Owner: SYNOPSYS INC