Affiliation:
1. Institute of Automotive Engineers, Hubei University of Automotive Technology, Shiyan, Hubei, China
Abstract
Unsupervised anomaly detection, often approached as a one‐class classification problem, is a critical task in computer vision. Knowledge distillation has emerged as a promising technique for enhancing anomaly detection accuracy, especially with the advent of reverse distillation networks that employ encoder–decoder architectures. This study introduces a novel reverse knowledge distillation framework, RDMS, which incorporates a pretrained teacher encoding module, a multi‐level feature fusion connection module, and a student decoding module consisting of three independent decoders. RDMS is designed to distill distinct features from the teacher encoder, mitigating the overfitting issues associated with similar or identical teacher–student structures. The model achieves an average image‐level AUROC of 99.3% and pixel‐level AUROC of 98.34% on the MVTec‐AD dataset, and demonstrates state‐of‐the‐art performance on the more challenging BTAD dataset. RDMS's high accuracy in anomaly detection and localization underscores the potential of multi‐student reverse distillation to advance unsupervised anomaly detection. The source code is available at https://github.com/zihengchen777/RDMS
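In reverse-distillation anomaly detection, localization is typically scored by the per-pixel discrepancy between teacher and student feature maps, commonly one minus their channel-wise cosine similarity; with multiple students, the per-student maps can be averaged. The sketch below illustrates that general scoring scheme in NumPy; the function name and the averaging over three students are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def anomaly_map(teacher_feat, student_feats, eps=1e-8):
    """Per-pixel anomaly score: 1 - cosine similarity (over channels)
    between the teacher feature map and each student's reconstruction,
    averaged over students. All tensors have shape (C, H, W).
    Illustrative sketch, not the RDMS reference implementation."""
    maps = []
    for s in student_feats:
        num = (teacher_feat * s).sum(axis=0)                     # dot product per pixel
        den = (np.linalg.norm(teacher_feat, axis=0)
               * np.linalg.norm(s, axis=0)) + eps                # norm product per pixel
        maps.append(1.0 - num / den)                             # cosine distance map
    return np.mean(maps, axis=0)                                 # average over students

# On normal regions the students reproduce the teacher features,
# so the score is near zero; a diverging student raises the score.
t = np.random.rand(8, 4, 4)
normal_map = anomaly_map(t, [t, t, t])
anomalous_map = anomaly_map(t, [-t, t, t])
```

The image-level score is then usually taken as the maximum (or a top-k mean) of this map after upsampling to input resolution.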
Funder
Natural Science Foundation of Hubei Province
Publisher
Institution of Engineering and Technology (IET)