Authors:
Wang Shuangmei, Cao Yang, Wu Tieru
Abstract
Few-shot class-incremental learning (FSCIL) must incrementally recognize novel classes from only a few examples, without catastrophically forgetting old classes or overfitting to the new ones. We propose TLCE, a transfer-learning ensemble that combines multiple pre-trained models to improve the separation of novel and old classes. Specifically, we use episodic training to map images of old classes to quasi-orthogonal prototypes, which minimizes interference between old and new classes. We then ensemble diverse pre-trained models to further mitigate data imbalance and improve adaptation to novel classes. Extensive experiments on various datasets demonstrate that our approach outperforms state-of-the-art FSCIL methods.
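To make the two ideas in the abstract concrete, here is a minimal sketch in PyTorch. It is not the authors' code: the function names (make_quasi_orthogonal_prototypes, ensemble_logits) and all shapes are illustrative assumptions. It shows (1) that random high-dimensional unit vectors serve as quasi-orthogonal class prototypes, and (2) how cosine-similarity logits from several pre-trained feature extractors can be averaged into an ensemble prediction.

import torch
import torch.nn.functional as F

def make_quasi_orthogonal_prototypes(num_classes: int, dim: int) -> torch.Tensor:
    # Random unit vectors in high dimension are quasi-orthogonal with high
    # probability, so each class gets a nearly non-interfering target.
    protos = torch.randn(num_classes, dim)
    return F.normalize(protos, dim=1)

def ensemble_logits(features_per_model, prototypes_per_model, weights=None):
    # Average cosine-similarity logits across an ensemble of extractors.
    #   features_per_model:   list of (batch, dim_k) embeddings, one per model
    #   prototypes_per_model: list of (num_classes, dim_k) prototypes, one per model
    if weights is None:
        weights = [1.0 / len(features_per_model)] * len(features_per_model)
    logits = 0.0
    for w, feats, protos in zip(weights, features_per_model, prototypes_per_model):
        feats = F.normalize(feats, dim=1)  # unit vectors: dot product = cosine similarity
        logits = logits + w * (feats @ protos.T)
    return logits

if __name__ == "__main__":
    # Toy usage: two hypothetical backbones with different embedding sizes.
    torch.manual_seed(0)
    num_classes, batch = 100, 4
    protos = [make_quasi_orthogonal_prototypes(num_classes, d) for d in (512, 640)]
    feats = [torch.randn(batch, 512), torch.randn(batch, 640)]
    preds = ensemble_logits(feats, protos).argmax(dim=1)
    print(preds.shape)  # torch.Size([4])

Because the prototypes are nearly orthogonal, assigning each old class its own target limits cross-class interference, and averaging over backbones hedges against any single extractor adapting poorly to the novel classes.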
Publisher:
Springer Science and Business Media LLC