Abstract
This paper provides both an introduction to and a detailed overview of the principles and practice of classifier calibration. A well-calibrated classifier correctly quantifies the level of uncertainty or confidence associated with its instance-wise predictions. This is essential for critical applications, optimal decision making, cost-sensitive classification, and for some types of context change. Calibration research has a rich history that predates the birth of machine learning as an academic field by decades. However, a recent increase in interest in calibration has led to new methods and the extension from the binary to the multiclass setting. The space of options and issues to consider is large, and navigating it requires the right set of concepts and tools. We provide both introductory material and up-to-date technical details of the main concepts and methods, including proper scoring rules and other evaluation metrics, visualisation approaches, a comprehensive account of post-hoc calibration methods for binary and multiclass classification, and several advanced topics.
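As a concrete illustration of the kind of post-hoc calibration and evaluation the paper surveys, the sketch below (our own illustration, not code from the paper) recalibrates a Gaussian naive Bayes model, which tends to produce over-confident probabilities, using isotonic regression via scikit-learn, and compares a simple binned expected calibration error (ECE) before and after. The dataset, model choice, and bin count are illustrative assumptions.

```python
# Minimal post-hoc calibration sketch (illustrative, not from the paper):
# fit an over-confident classifier, recalibrate with isotonic regression,
# and compare a binned expected calibration error (ECE) on held-out data.
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB


def expected_calibration_error(y_true, y_prob, n_bins=10):
    """Binned ECE: weighted average gap between mean predicted
    probability and observed positive frequency within each bin."""
    # Map each predicted probability in [0, 1] to one of n_bins bins.
    bin_ids = np.minimum((y_prob * n_bins).astype(int), n_bins - 1)
    ece = 0.0
    for b in range(n_bins):
        mask = bin_ids == b
        if mask.any():
            gap = abs(y_prob[mask].mean() - y_true[mask].mean())
            ece += mask.mean() * gap  # weight gap by bin frequency
    return ece


# Synthetic binary classification data (illustrative assumption).
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

raw = GaussianNB().fit(X_train, y_train)
calibrated = CalibratedClassifierCV(GaussianNB(), method="isotonic", cv=5)
calibrated.fit(X_train, y_train)

p_raw = raw.predict_proba(X_test)[:, 1]
p_cal = calibrated.predict_proba(X_test)[:, 1]
print(f"ECE (raw):        {expected_calibration_error(y_test, p_raw):.4f}")
print(f"ECE (calibrated): {expected_calibration_error(y_test, p_cal):.4f}")
```

Switching `method` to `"sigmoid"` gives Platt scaling, the classic parametric binary calibration method; isotonic regression is its standard non-parametric counterpart. Both families belong to the binary post-hoc methods covered in the survey.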
Funder
Engineering and Physical Sciences Research Council
Alan Turing Institute
Estonian Research Competency Council
Publisher
Springer Science and Business Media LLC
Subject
Artificial Intelligence, Software
Cited by
25 articles.