Authors:
Alekseev Aleksandr, Espinal Xavier, Jezequel Stephane, Kiryanov Andrey, Klimentov Alexei, Korchuganova Tatiana, Mitsyn Valeri, Oleynik Danila, Smirnov Alexander, Smirnov Sergei, Zarochentsev Andrey
Abstract
The High Luminosity phase of the LHC, which aims for a tenfold increase in the luminosity of proton-proton collisions, is expected to start operation in eight years. An unprecedented scientific data volume at the multi-exabyte scale will be delivered to particle physics experiments at CERN. This amount of data has to be stored, and the corresponding technology must ensure fast and reliable data delivery for processing by the scientific community all over the world. The present LHC computing model will not be able to provide the required infrastructure growth, even taking into account the expected hardware evolution. To address this challenge, the Data Lake R&D project was launched by the DOMA community in the fall of 2019. State-of-the-art data handling technologies are under active development, and their current status for the Russian Scientific Data Lake prototype is presented here.
Cited by 3 articles.