Abstract
Because of the increasing complexity of systems and applications, the performance of many traditional HPC benchmarks, such as HPL or HPCG, no longer correlates strongly with the actual performance of real applications. To address this discrepancy between simple benchmarks and real applications, and to better understand the application performance of systems, some metrics use a set of either real applications or mini-applications. In particular, the Sustained System Performance (SSP) metric (Kramer et al., The NERSC sustained system performance (SSP) metric, Tech Rep LBNL-58868, 2005), which indicates the expected throughput of different applications executed with different datasets, is widely used. Although such a metric yields direct insight into the actual performance of real applications, porting and evaluating complex applications can require considerable effort. In this study, to approximate the SSP of real applications without running those applications, we propose the Simplified Sustained System Performance (SSSP) metric, which is computed from several benchmark scores and their respective weighting factors, and we construct a method for evaluating the SSSP metric of a system. The weighting factors are obtained by minimizing the gap between the SSP and SSSP scores on a small set of reference systems. We evaluated the applicability of the SSSP method on eight systems and demonstrated that the proposed SSSP metric produces appropriate projections of the SSP metric for these systems, even when a simple method is adopted for computing the weighting factors. Additionally, we confirmed the robustness of the SSSP metric by computing the weighting factors from a smaller set of reference systems and then computing the SSSP metrics of the remaining systems.
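The fitting step described in the abstract admits a compact illustration. The following Python sketch (with hypothetical benchmark data and names; the paper's actual benchmark suite and fitting procedure may differ) models SSSP as a weighted sum of benchmark scores and fits non-negative weights so that SSSP tracks SSP on a small set of reference systems:

# Minimal sketch of the SSSP idea: SSSP(s) = sum_k w_k * b[s, k],
# where b[s, k] is the score of benchmark k on system s, and the
# weights w are fit on reference systems whose SSP is known.
# All numbers below are hypothetical, for illustration only.
import numpy as np
from scipy.optimize import nnls

# b[s, k]: benchmark scores of each reference system (hypothetical).
bench_scores = np.array([
    [1.0, 2.0, 0.5],   # reference system 0
    [1.5, 1.0, 0.8],   # reference system 1
    [0.7, 2.5, 0.4],   # reference system 2
    [2.0, 1.2, 1.1],   # reference system 3
])

# SSP of each reference system, measured with real applications.
ssp = np.array([1.8, 1.9, 1.7, 2.6])

# Fit non-negative weights minimizing ||bench_scores @ w - ssp||_2.
weights, residual = nnls(bench_scores, ssp)

def sssp(scores: np.ndarray) -> float:
    """Project the SSP of a new system from its benchmark scores alone."""
    return float(scores @ weights)

new_system = np.array([1.2, 1.8, 0.9])  # benchmarks run on a new system
print(f"weights = {weights}, projected SSSP = {sssp(new_system):.3f}")

Non-negative least squares is used here only because non-negative weights are a natural assumption for combining throughput-like scores; the paper may use a different minimization scheme.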
Publisher
Springer Science and Business Media LLC
Subject
Hardware and Architecture, Information Systems, Theoretical Computer Science, Software
References: 39 articles.
1. ECP proxy applications. https://proxyapps.exascaleproject.org/
2. HPC Challenge. http://www.hpcchallenge.org/
3. HPCG benchmark. http://www.hpcg-benchmark.org/
4. Aaziz O, Cook J, Cook J, Juedeman T, Richards D, Vaughan C (2018) A methodology for characterizing the correspondence between real and proxy applications. In: Proceedings of the 2018 IEEE International Conference on Cluster Computing (CLUSTER). IEEE
5. Armstrong B, Bae H, Eigenmann R, Saied F, Sayeed M, Zheng Y (2006) HPC benchmarking and performance evaluation with realistic applications. In: The SPEC Benchmark Workshop, pp 1–11
Cited by
1 article.