Hyper-feature aggregation and relaxed distillation for class incremental learning
Published: 2024-08
Volume: 152
Page: 110440
ISSN: 0031-3203
Container title: Pattern Recognition
Language: en
Short container title: Pattern Recognition
Authors: Wu Ran, Liu Huanyu, Yue Zongcheng, Li Jun-Bao, Sham Chiu-Wing
References (32 articles):
1. French, Catastrophic forgetting in connectionist networks, Trends Cogn. Sci., 1999.
2. Kirkpatrick, Overcoming catastrophic forgetting in neural networks, Proc. Natl. Acad. Sci., 2017.
3. A. Chaudhry, P.K. Dokania, T. Ajanthan, P.H. Torr, Riemannian walk for incremental learning: Understanding forgetting and intransigence, in: Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 532–547.
4. Serra, Overcoming catastrophic forgetting with hard attention to the task, 2018.
5. Sun, Exemplar-free class incremental learning via discriminative and comparable parallel one-class classifiers, Pattern Recognit., 2023.