Abstract
We present three different methods to estimate error bars on the predictions made using a neural network (NN). All of them represent lower bounds for the extrapolation errors. We first illustrate the methods with a simple toy model and then apply them to realistic cases related to nuclear masses. Using theoretical data simulated with either a liquid-drop model or a Skyrme energy density functional, we benchmark the extrapolation performance of the NN in regions of the Segrè chart far from those used for training and validation. Finally, we discuss how error bars can help identify when an extrapolation becomes too uncertain and thus unreliable.
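The abstract does not spell out the three methods, so the following is only a generic illustration of the underlying idea, not the paper's procedure: an ensemble of models retrained on resampled data yields a spread that serves as a lower-bound error bar, and that spread grows markedly outside the training region. Here a bootstrap ensemble of polynomial fits stands in for retrained NNs on a toy 1D problem.

```python
# Hedged sketch (NOT the paper's methods): ensemble spread as a
# lower-bound error bar that grows in the extrapolation region.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy samples of a smooth curve on [0, 1] (training region).
x_train = np.linspace(0.0, 1.0, 40)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.1, x_train.size)

# Ensemble of degree-5 polynomial fits on bootstrap resamples,
# a stand-in for retraining a NN with different initialisations/data.
K = 30
x_eval = np.linspace(0.0, 2.0, 81)  # the second half extrapolates
preds = np.empty((K, x_eval.size))
for k in range(K):
    idx = rng.integers(0, x_train.size, x_train.size)
    coeffs = np.polyfit(x_train[idx], y_train[idx], deg=5)
    preds[k] = np.polyval(coeffs, x_eval)

mean = preds.mean(axis=0)
sigma = preds.std(axis=0)  # ensemble spread = error-bar estimate

# The spread is much larger where the model extrapolates.
in_region = sigma[x_eval <= 1.0].mean()
out_region = sigma[x_eval > 1.0].mean()
print(f"mean sigma (interpolation):  {in_region:.3f}")
print(f"mean sigma (extrapolation): {out_region:.3f}")
```

A simple reliability criterion of the kind the abstract alludes to could then be a threshold on `sigma`: predictions whose ensemble spread exceeds it are flagged as too uncertain.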
Funder
Science and Technology Facilities Council
Subject
Nuclear and High Energy Physics
Cited by
10 articles.