A family of experiments about how developers perceive delayed system response time
Published: 2024-03-04
Volume: 32
Issue: 2
Pages: 567-605
ISSN: 0963-9314
Container-title: Software Quality Journal
Language: en
Short-container-title: Software Qual J
Author:
Cornejo Oscar, Briola Daniela, Micucci Daniela, Ginelli Davide, Mariani Leonardo, Santos Parrilla Adrián, Juristo Natalia
Abstract
Collecting and analyzing data about developers working on their development tasks can help improve development practices, ultimately increasing the productivity of teams. Indeed, monitoring and analysis tools have already been used to collect data from productivity tools. Monitoring inevitably consumes resources and, depending on its extensiveness, may significantly slow down software systems, interfering with developers’ activity. There is thus a challenging trade-off between monitoring and validating applications in their operational environment and preventing the degradation of the user experience. The lack of studies about when developers perceive the overhead introduced in an application makes it extremely difficult to fine-tune techniques that work in the field. In this paper, we address this challenge by presenting an empirical study that quantifies how developers perceive overhead. The study consists of three replications of an experiment involving 99 computer science students in total, followed by a small-scale experimental assessment of the key findings with 12 professional developers. Results show that non-negligible overhead can be introduced into applications for a short period without developers perceiving it, and that the sequence in which complex operations are executed influences the perception of the system response time. This information can be exploited to design better monitoring techniques.
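The trade-off the abstract describes can be pictured with a minimal sketch (not from the paper; all names here, such as monitored, sample_rate, and save_project, are hypothetical): a sampling monitor that instruments only a fraction of calls, so that the added latency is brief and infrequent enough to stay below a perceptibility budget.

import functools
import random
import time

def monitored(sample_rate=0.2):
    """Instrument only a fraction of calls so the monitoring overhead
    stays small and short-lived (illustrative sketch, not the paper's tooling)."""
    records = []

    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if random.random() >= sample_rate:
                return func(*args, **kwargs)  # unmonitored fast path
            start = time.perf_counter()
            result = func(*args, **kwargs)  # monitored call: timing is recorded
            records.append((func.__name__, time.perf_counter() - start))
            return result
        wrapper.records = records  # collected field data, available for later analysis
        return wrapper
    return decorator

@monitored(sample_rate=0.2)
def save_project():
    time.sleep(0.05)  # stands in for a real development-tool operation

for _ in range(20):
    save_project()
print(save_project.records)  # roughly 20% of calls carried the overhead

Lowering sample_rate bounds how often a developer experiences the slower, instrumented path, which is one way a monitoring technique could act on the finding that short-lived overhead often goes unnoticed.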
Funder
European Research Council; Università degli Studi di Milano - Bicocca
Publisher
Springer Science and Business Media LLC