Abstract
With the advancement and widespread adoption of deep learning models, class incremental learning has attracted growing interest. It aims to continuously learn new classes in an open, dynamic environment while retaining the ability to recognize previously learned ones. The central challenge is to keep learning new classes while mitigating catastrophic forgetting, thereby striking a better balance between stability and adaptability. To address this challenge, we propose a class incremental learning method that leverages dynamically expandable representations: previously acquired features are preserved while new ones are learned, effectively reducing catastrophic forgetting. Furthermore, we introduce a feature augmentation mechanism that significantly improves classification performance when new classes are incorporated, enabling efficient learning of both old and new classes without degrading the effectiveness of earlier models. Extensive experiments on two class incremental learning benchmarks consistently demonstrate significant performance advantages over competing methods.
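To make the two ideas in the abstract concrete, below is a minimal PyTorch-style sketch of a dynamically expandable representation with a simple feature augmentation hook. Everything here is an illustrative assumption, not the paper's actual architecture: the class name ExpandableNet, the make_backbone factory, the freeze-then-append expansion strategy, and the Gaussian feature perturbation used for augmentation are all stand-ins chosen to show the general mechanism.

```python
import torch
import torch.nn as nn


class ExpandableNet(nn.Module):
    """Sketch of a dynamically expandable representation.

    At each incremental step, previously trained feature extractors
    are frozen (preserving old features) and a new extractor is added
    (adapting to new classes); their outputs are concatenated and fed
    to a classifier that grows with the number of classes.
    """

    def __init__(self, make_backbone, feat_dim, num_classes):
        super().__init__()
        self.make_backbone = make_backbone  # callable returning a (B, feat_dim) extractor
        self.feat_dim = feat_dim
        self.backbones = nn.ModuleList([make_backbone()])
        self.classifier = nn.Linear(feat_dim, num_classes)

    def expand(self, num_new_classes):
        # Freeze all previously learned extractors to preserve old features.
        for backbone in self.backbones:
            for p in backbone.parameters():
                p.requires_grad = False
        # Add a fresh extractor for the new classes.
        self.backbones.append(self.make_backbone())
        # Grow the classifier, keeping the old class weights intact.
        old = self.classifier
        new_in = self.feat_dim * len(self.backbones)
        new_out = old.out_features + num_new_classes
        self.classifier = nn.Linear(new_in, new_out)
        with torch.no_grad():
            self.classifier.weight[:old.out_features, :old.in_features] = old.weight
            self.classifier.bias[:old.out_features] = old.bias

    def forward(self, x, augment=False):
        feats = torch.cat([b(x) for b in self.backbones], dim=1)
        if augment and self.training:
            # Feature augmentation, assumed here to be a Gaussian
            # perturbation that diversifies new-class representations.
            feats = feats + 0.1 * torch.randn_like(feats)
        return self.classifier(feats)


# Usage: a toy two-step scenario with a small MLP backbone.
make_backbone = lambda: nn.Sequential(
    nn.Flatten(), nn.Linear(32 * 32 * 3, 64), nn.ReLU()
)
net = ExpandableNet(make_backbone, feat_dim=64, num_classes=10)
net.expand(num_new_classes=10)  # second task adds 10 more classes
logits = net(torch.randn(4, 3, 32, 32), augment=True)  # shape (4, 20)
```

Concatenating frozen and new extractors is one common way to realize "preserving previously acquired features while adapting to new ones"; the paper's actual expansion and augmentation details may differ.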
Publisher
Darcy & Roy Press Co. Ltd.