Affiliation:
1. University of Chile, Chile
Abstract
Little is known about how software performance evolves across software revisions. The situation is serious because (i) most performance variations seem to happen accidentally and (ii) addressing a performance regression is challenging, especially when functional code is stacked on top of it. This paper reports an empirical study of the performance evolution of 19 applications, totaling over 19 MLOC. Running our 49 benchmarks took 52 days. By relating performance variations to source code revisions, we found that: (i) 1 out of every 3 application revisions introduces a performance variation, (ii) performance variations can be classified into 9 patterns, and (iii) the most prominent cause of performance regressions involves loops and collections. We carefully describe the patterns we identified and detail how we addressed the numerous challenges we faced to complete our experiment.
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Graphics and Computer-Aided Design, Software
Cited by
4 articles.
1. Performance evolution of configurable software systems: an empirical study;Empirical Software Engineering;2023-11
2. A Test for FLOPs as a Discriminant for Linear Algebra Algorithms;2022 IEEE 34th International Symposium on Computer Architecture and High Performance Computing (SBAC-PAD);2022-11
3. How Software Refactoring Impacts Execution Time;ACM Transactions on Software Engineering and Methodology;2022-04-30
4. On the Effectiveness of Bisection in Performance Regression Localization;Empirical Software Engineering;2022-04-30