
Method and data processing system optimizing performance through reporting of thread-level hardware resource utilization

A technique relating to data processing systems and hardware resources, applied in the field of data processing, capable of addressing problems such as declining performance.

Status: Inactive; Publication Date: 2005-07-20
INT BUSINESS MASCH CORP

AI Technical Summary

Problems solved by technology

[0003] Regrettably, the sustained performance of HPC systems has not improved at the rate of peak performance; in fact, the ratio between sustained performance and peak performance, although already very low (e.g., 1:10), is still dropping.

Method used



Embodiment Construction

[0017] Referring now to the drawings, and in particular to FIG. 1, there is illustrated a high-level block diagram of a multiprocessor (MP) data processing system that provides improved performance optimization according to one embodiment of the present invention. As depicted, data processing system 8 includes a plurality (e.g., 8, 16, 64, or more) of processing units 10 coupled for communication by a system interconnect 12. Each processing unit 10 is a single integrated circuit including interface logic 23 and one or more processor cores 14.

[0018] As also shown in FIG. 1, the memory hierarchy of data processing system 8 includes one or more system memories 26, which form the lowest level of volatile data storage in the memory hierarchy, and one or more levels of cache memory, for example, an on-chip Level 2 (L2) cache 22 used for staging instructions and operand data from system memory 26 to the processor cores 14. As will be appreciated by those skilled in the art, ea...
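Purely as an illustration of the layout that paragraphs [0017] and [0018] describe, the C sketch below models the hardware elements as plain data structures. All type names, field names, and sizes (mp_system, CORES_PER_UNIT, the example memory sizes, and so on) are assumptions of this sketch, not terms from the patent.

    /* Illustrative model only: the names and sizes here are hypothetical.    */
    #include <stddef.h>

    #define NUM_UNITS       8   /* "a plurality (e.g., 8, 16, 64, or more)"  */
    #define CORES_PER_UNIT  2   /* assumed count, for illustration           */

    struct processor_core {     /* processor core 14                         */
        unsigned int core_id;
    };

    struct l2_cache {           /* on-chip L2 cache 22: stages instructions  */
        size_t size_bytes;      /* and data from system memory to the cores  */
    };

    struct processing_unit {    /* processing unit 10: one integrated        */
        struct processor_core cores[CORES_PER_UNIT];  /* circuit with cores  */
        struct l2_cache l2;     /* and interface logic 23 (not modeled here) */
    };

    struct mp_system {          /* data processing system 8                  */
        struct processing_unit units[NUM_UNITS];   /* coupled by system      */
        size_t system_memory_bytes;                /* interconnect 12 to     */
    };                                             /* system memory 26       */

    int main(void)
    {
        struct mp_system sys = { 0 };
        sys.system_memory_bytes    = 1ul << 30;    /* e.g., 1 GiB            */
        sys.units[0].l2.size_bytes = 1ul << 20;    /* e.g., 1 MiB L2         */
        return (int)sys.units[0].cores[0].core_id; /* trivially uses model   */
    }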



Abstract

According to a method of operating a data processing system, one or more monitoring parameter sets are established in a processing unit within the data processing system. The processing unit monitors, in hardware, execution of each of a plurality of schedulable software entities within the processing unit in accordance with a monitoring parameter set among the one or more monitoring parameter sets. The processing unit then reports to software executing in the data processing system the utilization of hardware resources by each of the plurality of schedulable software entities. The hardware utilization information reported by the processing unit may be stored and utilized by software to schedule execution of the schedulable software entities. The hardware utilization information may also be utilized to generate a classification of at least one executing schedulable software entity, which may be communicated to the processing unit to dynamically modify an allocation of hardware resources to the schedulable software entity.
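The abstract amounts to a feedback loop: software programs the monitoring hardware, the hardware reports per-thread utilization, and software feeds a classification back to steer resource allocation. The C sketch below walks through that loop under assumed names; monitor_params, hw_set_monitor_params, and the other hw_* calls are hypothetical stand-ins (stubbed so the example compiles), not an interface defined by the patent.

    /* Minimal sketch; every type and function name here is hypothetical.    */
    #include <stdio.h>

    #define NUM_THREADS 4

    struct monitor_params {            /* one "monitoring parameter set"     */
        int count_cache_misses;
        int count_issued_instructions;
    };

    struct hw_utilization {            /* per-thread report from hardware    */
        unsigned long cache_misses;
        unsigned long issued_instructions;
    };

    /* Stub for programming the per-thread monitoring hardware.              */
    static void hw_set_monitor_params(int tid, const struct monitor_params *p)
    {
        (void)tid; (void)p;
    }

    /* Stub for reading back a thread's hardware-utilization report.         */
    static void hw_read_utilization(int tid, struct hw_utilization *out)
    {
        out->cache_misses        = 1000ul * (unsigned long)(tid + 1); /* demo */
        out->issued_instructions = 50000ul;                           /* data */
    }

    /* Stub for communicating a classification back to the processing unit.  */
    static void hw_set_resource_class(int tid, int class_id)
    {
        printf("thread %d -> resource class %d\n", tid, class_id);
    }

    int main(void)
    {
        struct monitor_params params = { 1, 1 };
        struct hw_utilization report;

        /* 1. Establish a monitoring parameter set per schedulable entity.   */
        for (int t = 0; t < NUM_THREADS; t++)
            hw_set_monitor_params(t, &params);

        /* 2. Hardware monitors execution; software reads each report.       */
        for (int t = 0; t < NUM_THREADS; t++) {
            hw_read_utilization(t, &report);

            /* 3. Classify the thread from its reported utilization and feed */
            /*    the classification back to adjust its hardware resources.  */
            int class_id = (report.cache_misses > 2000ul) ? 1 : 0;
            hw_set_resource_class(t, class_id);
        }
        return 0;
    }

The single-threshold classification in step 3 is deliberately simplistic; the patent leaves the classification and scheduling policy to software.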

Description

Technical Field

[0001] The present invention relates generally to data processing, and in particular to performance optimization in data processing systems. More specifically, the present invention relates to a data processing system and method in which hardware and software are coordinated to optimize thread processing.

Background Art

[0002] Currently, a number of trends affect the development of server-class and mainframe computer systems. In particular, the density of transistors in integrated circuits continues to increase according to Moore's Law, which, in its current formulation, asserts that the number of transistors per unit area on an integrated circuit doubles approximately every 18 months. In addition, processor frequencies double approximately every two years. Moreover, system scale (that is, the number of CPUs in a system) continues to increase, reaching dozens, hundreds, and even thousands of processors in some cas...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F9/30; G06F9/46
CPC: G06F2201/88; G06F11/3466
Inventors: Ravi K. Arimilli; Randal C. Swanberg
Owner: INT BUSINESS MASCH CORP