Authors:
Dan Levi, Liran Gispan, Niv Giladi, Ethan Fetaya
Abstract
Predicting not only the target but also an accurate measure of uncertainty is important for many machine learning applications, in particular safety-critical ones. In this work, we study the calibration of uncertainty prediction for regression tasks, which often arise in real-world systems. We show that the existing definition of calibration for regression uncertainty has severe limitations in distinguishing informative from non-informative uncertainty predictions. We propose a new definition that avoids this caveat, together with an evaluation method based on a simple histogram approach: examples with similar uncertainty predictions are grouped, and the predicted uncertainty is compared with the empirical uncertainty within each group. We also propose a simple, scaling-based calibration method that performs as well as much more complex ones. We show results on both a synthetic, controlled problem and on the bounding-box regression task in object detection, using the COCO and KITTI datasets.
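For intuition, here is a minimal NumPy sketch of the two ideas the abstract describes: an equal-count histogram check that compares predicted with empirical uncertainty per bin, and a single-scalar recalibration fitted on a validation set. The function names, the binning strategy, and the per-bin error metric are assumptions of this sketch, not the paper's reference implementation.

import numpy as np

def histogram_calibration_error(y, mu, sigma, n_bins=10):
    # Sort by predicted uncertainty and split into equal-count bins,
    # so each bin groups examples with similar predicted sigma.
    order = np.argsort(sigma)
    bins = np.array_split(order, n_bins)
    errors = []
    for idx in bins:
        predicted = sigma[idx].mean()                          # mean predicted std in the bin
        empirical = np.sqrt(np.mean((y[idx] - mu[idx]) ** 2))  # empirical RMSE in the bin
        errors.append(abs(predicted - empirical) / predicted)
    return float(np.mean(errors))

def fit_sigma_scale(y_val, mu_val, sigma_val):
    # Single-scalar recalibration: under a Gaussian likelihood, the
    # NLL-optimal scale s has the closed form s^2 = mean(((y - mu) / sigma)^2).
    z2 = ((y_val - mu_val) / sigma_val) ** 2
    return float(np.sqrt(np.mean(z2)))

# Tiny usage example with a deliberately over-confident predictor.
rng = np.random.default_rng(0)
sigma_true = rng.uniform(0.5, 2.0, size=5000)
y = rng.normal(0.0, sigma_true)
mu = np.zeros_like(y)
sigma_pred = 0.5 * sigma_true  # under-estimates the true noise

print(histogram_calibration_error(y, mu, sigma_pred))       # large error before scaling
s = fit_sigma_scale(y, mu, sigma_pred)                       # recovers a scale near 2.0
print(histogram_calibration_error(y, mu, s * sigma_pred))   # much smaller after scaling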
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry
Cited by: 45 articles.