Abstract
The diversity of scientific goals across HEP experiments necessitates unique bodies of software tailored to achieving particular physics results. The challenge, however, is to distinguish the software that must be unique from the code that is unnecessarily duplicated; such duplication wastes effort and inhibits code maintainability.
Fermilab has a history of supporting and developing software projects that are shared among HEP experiments. Fermilab’s scientific computing division currently expends effort in maintaining and developing the LArSoft toolkit, used by liquid argon TPC experiments, as well as the event-processing framework technologies used by LArSoft, CMS, DUNE, and the majority of Fermilab-hosted experiments. As the computing needs of DUNE and the HL-LHC become clearer, the computing models are being rethought. Presented here are Fermilab’s plans for addressing the evolving software landscape as it relates to LArSoft and the data-processing frameworks, with particular attention to the DUNE experiment.