Authors:
Liu Zhenzhen, Zhou Jin Peng, Weinberger Kilian Q.
Abstract
Out-of-distribution (OOD) detection is crucial for enhancing the reliability of machine learning models when confronted with data that differ from their training distribution. In the image domain, we hypothesize that images inhabit manifolds defined by latent properties such as color, position, and shape. Leveraging this intuition, we propose a novel approach to OOD detection using a diffusion model to discern images that deviate from the in-domain distribution. Our method involves training a diffusion model using in-domain images. At inference time, we lift an image from its original manifold using a masking process, and then apply a diffusion model to map it towards the in-domain manifold. We measure the distance between the original and mapped images, and identify those with a large distance as OOD. Our experiments encompass comprehensive evaluation across various datasets characterized by differences in color, semantics, and resolution. Our method demonstrates strong and consistent performance in detecting OOD images across the tested datasets, highlighting its effectiveness in handling images with diverse characteristics. Additionally, ablation studies confirm the significant contribution of each component in our framework to the overall performance.
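To make the inference-time procedure concrete, the following is a minimal Python sketch of the mask-inpaint-compare loop described above. It is illustrative only: the callable inpaint_fn is a hypothetical stand-in for the trained in-domain diffusion inpainting model, the square mask and mean-squared distance are assumptions for demonstration, and the actual paper may use a different masking scheme and distance metric.

import numpy as np

def mask_image(image: np.ndarray, mask: np.ndarray) -> np.ndarray:
    # Lift the image off its original manifold by zeroing the masked region.
    return image * (1.0 - mask)

def reconstruction_distance(original: np.ndarray, reconstructed: np.ndarray) -> float:
    # Mean-squared distance between the original and mapped images.
    # (A perceptual distance could be substituted here.)
    return float(np.mean((original - reconstructed) ** 2))

def ood_score(image: np.ndarray, mask: np.ndarray, inpaint_fn) -> float:
    # Mask the image, map it back toward the in-domain manifold with the
    # diffusion model, and measure how far the reconstruction drifts from
    # the input. `inpaint_fn` takes (masked_image, mask) and returns a
    # reconstruction; it is a placeholder for the trained model.
    masked = mask_image(image, mask)
    reconstructed = inpaint_fn(masked, mask)
    return reconstruction_distance(image, reconstructed)

def is_ood(image: np.ndarray, mask: np.ndarray, inpaint_fn, threshold: float) -> bool:
    # Flag images whose reconstruction distance is large as OOD.
    # The threshold would be calibrated on held-out in-domain images.
    return ood_score(image, mask, inpaint_fn) > threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((32, 32, 3))                 # toy stand-in "image"
    mask = np.zeros((32, 32, 1))
    mask[8:24, 8:24] = 1.0                          # central square mask
    identity_inpaint = lambda masked, m: masked     # dummy placeholder model
    print(ood_score(image, mask, identity_inpaint))

In practice the placeholder model would be replaced by a diffusion model trained on in-domain images, and the OOD threshold chosen so that a desired fraction of in-domain validation images falls below it.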