Affiliation:
1. Department of Biostatistics, Epidemiology and Informatics, University of Pennsylvania, Philadelphia, Pennsylvania 19104, USA
2. School of Electrical and Computer Engineering, Purdue University, West Lafayette, Indiana 47907, USA
3. Department of Biostatistics, University of Michigan, Ann Arbor, Michigan 48109, USA
Abstract
Deep neural network (DNN) models have achieved state‐of‐the‐art predictive accuracy in a wide range of applications. However, it remains challenging to accurately quantify the uncertainty in DNN predictions, especially for continuous outcomes. To this end, we propose the Bayesian deep noise neural network (B‐DeepNoise), which generalizes standard Bayesian DNNs by extending the random noise variable from the output layer to all hidden layers. Our model is capable of approximating highly complex predictive density functions and fully learning the possible random variation in the outcome variables. For posterior computation, we provide a closed‐form Gibbs sampling algorithm that circumvents tuning‐intensive Metropolis–Hastings methods. We establish a recursive representation of the predictive density and perform theoretical analysis of the predictive variance. Through extensive experiments, we demonstrate the superiority of B‐DeepNoise over existing methods in terms of density estimation and uncertainty quantification accuracy. A neuroimaging application is included to show our model's usefulness in scientific studies.
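The central idea of the abstract — injecting random noise at every hidden layer rather than only at the output — can be illustrated with a minimal sketch. The network sizes, noise scales, and the use of ReLU activations here are illustrative assumptions, not the paper's exact architecture; repeated stochastic forward passes then approximate a predictive density by Monte Carlo.

```python
import numpy as np

def deep_noise_forward(x, weights, biases, noise_scales, rng):
    """One stochastic forward pass of a toy deep-noise network.

    Unlike a standard DNN, Gaussian noise is added at every layer's
    pre-activation, not just at the output layer. `noise_scales[l]` is
    the (assumed) standard deviation of the noise at layer l.
    """
    h = x
    n_layers = len(weights)
    for l in range(n_layers):
        # affine map plus layer-specific Gaussian noise
        h = weights[l] @ h + biases[l] + rng.normal(0.0, noise_scales[l], size=biases[l].shape)
        if l < n_layers - 1:
            h = np.maximum(h, 0.0)  # ReLU on hidden layers only
    return h

# Illustrative 2-layer network: 3 inputs -> 4 hidden -> 1 output
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((1, 4))]
biases = [np.zeros(4), np.zeros(1)]
x = np.array([0.5, -1.0, 2.0])

# Many noisy forward passes give a Monte Carlo sample from the
# predictive distribution of the output.
samples = np.array([deep_noise_forward(x, weights, biases, [0.5, 0.5], rng)
                    for _ in range(2000)])
print(samples.mean(), samples.std())
```

Setting every noise scale to zero recovers an ordinary deterministic forward pass, which makes the generalization over standard DNNs explicit.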
Funder
National Science Foundation
National Institutes of Health
Subject
Statistics, Probability and Uncertainty; Statistics and Probability