Affiliation:
1. Department of Mathematics and Statistics, Lancaster University, Lancashire, UK
Abstract
Spike‐and‐slab and horseshoe regressions are arguably the most popular Bayesian variable selection approaches for linear regression models. However, their performance can deteriorate if outliers and heteroskedasticity are present in the data, which are common features in many real‐world statistics and machine learning applications. This work proposes a Bayesian nonparametric approach to linear regression that performs variable selection while accounting for outliers and heteroskedasticity. Our proposed model is an instance of a Dirichlet process scale mixture model with the advantage that we can derive the full conditional distributions of all parameters in closed form, hence producing an efficient Gibbs sampler for posterior inference. Moreover, we show how to extend the model to account for heavy‐tailed response variables. The model's performance is tested against competing algorithms on synthetic and real‐world datasets.
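The core idea in the abstract, namely giving each observation its own noise scale so that outliers and heteroskedastic errors are automatically downweighted, can be illustrated with a toy iteratively reweighted least-squares fit. This is only a crude point-estimate analogue of a scale-mixture model, not the paper's Dirichlet process Gibbs sampler; the data, weighting scheme, and all names below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated regression where ~10% of observations have a 10x noise
# scale, i.e. outliers / heteroskedastic errors.
n, p = 200, 3
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.0])
scales = np.where(rng.random(n) < 0.1, 10.0, 1.0)
y = X @ beta_true + scales * rng.normal(size=n)

def ols(X, y):
    # Plain least squares: every observation weighted equally,
    # so large-variance points can drag the estimate.
    return np.linalg.lstsq(X, y, rcond=None)[0]

def scale_mixture_fit(X, y, iters=50):
    """Iteratively reweighted fit: each observation gets its own
    variance proxy from its squared residual (plus a floor of 1),
    mimicking how a per-observation scale mixture downweights
    outlying points."""
    beta = ols(X, y)
    for _ in range(iters):
        resid = y - X @ beta
        w = 1.0 / (resid**2 + 1.0)          # per-observation precision proxy
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)  # weighted normal equations
    return beta

b_ols = ols(X, y)
b_rob = scale_mixture_fit(X, y)
```

In a fully Bayesian version the weights would instead be sampled from their full conditional distributions inside a Gibbs sweep, with a Dirichlet process prior clustering observations by noise scale.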
Funder
Engineering and Physical Sciences Research Council