Abstract
The need to replicate empirical studies in Computer Science is widely recognized among the research community. It is essential to report the changes made in each replication to promote not only the comprehensibility of the evolution of experimental validity across a family of studies, but also replicability itself. Unfortunately, the lack of proposals for the systematic reporting of changes in replications undermines these desirable objectives. The main goal of the work presented in this article is to provide researchers in Computer Science with a systematic, tool-supported approach for the specification and reporting of changes in the replications of their empirical studies. Applying Design Science Research, we have developed and validated a composite artifact consisting of (i) a metamodel that formalizes all the relevant concepts related to replications and their changes; (ii) templates and linguistic patterns that facilitate their reporting; and (iii) a proof-of-concept model-based software tool that supports the proposed approach. For its validation, we have carried out a multiple case study that includes 9 families of empirical studies not only from Computer Science, but also from an area as different as Agrobiology, in order to check the external validity of our approach. The 9 families encompass 23 replication studies and a total of 92 replication changes, for which we have analyzed the suitability of our proposal. The multiple case study revealed some initial limitations of our approach related to threats to experimental validity and context variables. After several improvement iterations on the artifact, all 92 replication changes could be properly specified, including their qualitatively estimated effects on experimental validity and their corresponding visualization. Our proposal for the specification of replication changes seems to fit the needs of replications not only in Computer Science, but also in other research areas. Nevertheless, further research is needed to improve it and disseminate its use among the research community.
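The abstract describes a metamodel that formalizes replications, their changes, and the qualitatively estimated effects of those changes on experimental validity. As a rough illustration of that idea only, the sketch below expresses such concepts as Python dataclasses; all class, field, and enum names here are hypothetical and are not taken from the paper's actual metamodel.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List

# Hypothetical names throughout: this is a minimal sketch of the kind of
# concepts the abstract mentions, not the paper's published metamodel.

class ValidityType(Enum):
    INTERNAL = "internal"
    EXTERNAL = "external"
    CONSTRUCT = "construct"
    CONCLUSION = "conclusion"

class ValidityEffect(Enum):
    IMPROVES = "improves"
    THREATENS = "threatens"
    NEUTRAL = "neutral"

@dataclass
class ReplicationChange:
    element: str       # experimental element that was changed
    description: str   # what was changed, and why
    # Qualitatively estimated effect of the change on each validity type
    effects: Dict[ValidityType, ValidityEffect] = field(default_factory=dict)

@dataclass
class Replication:
    baseline_study: str
    changes: List[ReplicationChange] = field(default_factory=list)

# Example: a change of experimental subjects, estimated to improve
# external validity.
change = ReplicationChange(
    element="experimental subjects",
    description="students replaced by industry practitioners",
    effects={ValidityType.EXTERNAL: ValidityEffect.IMPROVES},
)
replication = Replication(baseline_study="baseline experiment", changes=[change])
```

Under this kind of structure, a family of studies would simply aggregate several `Replication` instances, and a reporting tool could render each change together with its estimated validity effects.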
Publisher
Springer Science and Business Media LLC
Subject
Computational Mathematics, Computational Theory and Mathematics, Computer Science Applications, Numerical Analysis, Theoretical Computer Science, Software
Cited by
5 articles.