Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks
Author:
Kim Jonghong 1, Lee WonHee 1,2, Baek Sungdae 3, Hong Jeong-Ho 1,2,4, Lee Minho 3
Affiliation:
1. Department of Neurology, Keimyung University Dongsan Hospital, Keimyung University School of Medicine, Daegu 42601, Republic of Korea
2. Department of Medical Informatics, Keimyung University School of Medicine, Daegu 42601, Republic of Korea
3. Graduate School of Artificial Intelligence, Kyungpook National University, Daegu 41566, Republic of Korea
4. Biolink Inc., Daegu 42601, Republic of Korea
Abstract
Catastrophic forgetting, the rapid loss of previously learned representations while learning new data or samples, is one of the main problems of deep neural networks. In this paper, we propose a novel incremental learning framework that addresses the forgetting problem by learning newly incoming data in an online manner, so that extra data or new classes can be learned with less catastrophic forgetting. We adopt the hippocampal memory process into deep neural networks by defining the effective maximum of neural activation and its boundary to represent a feature distribution. In addition, we incorporate incremental QR factorization into the deep neural networks so that new data with both existing labels and new labels can be learned with less forgetting. The QR factorization provides an accurate subspace prior, and its incremental update expresses how new data interact with both existing classes and new classes while limiting forgetting. In our framework, a set of appropriate features (i.e., nodes) provides an improved representation for each class. We apply our method to a convolutional neural network (CNN) trained on the CIFAR-100 and CIFAR-10 datasets. The experimental results show that the proposed method effectively alleviates the stability and plasticity dilemma in deep neural networks: the performance of the trained network remains stable while unseen data and additional new classes are learned effectively.
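The incremental QR update mentioned in the abstract can be illustrated with a minimal sketch: given an existing factorization A = QR of previously seen class features, a new feature vector is folded in as an extra column without refactorizing from scratch. This is not the authors' implementation; the function name incremental_qr_append and the random data are illustrative assumptions.

```python
import numpy as np

def incremental_qr_append(Q, R, a):
    """Append column a to A = Q @ R and return the updated (Q, R)."""
    # Coordinates of a in the current orthonormal basis.
    r = Q.T @ a
    # Component of a orthogonal to the existing subspace.
    q = a - Q @ r
    rho = np.linalg.norm(q)
    if rho < 1e-12:
        # a already lies in the spanned subspace; R only gains a column.
        return Q, np.hstack([R, r[:, None]])
    q = q / rho
    # Grow Q by one orthonormal column and R by one row and one column.
    Q_new = np.hstack([Q, q[:, None]])
    R_new = np.vstack([np.hstack([R, r[:, None]]),
                       np.hstack([np.zeros((1, R.shape[1])), [[rho]]])])
    return Q_new, R_new

# Usage: factorize existing class features once, then add new samples
# or new-class feature vectors incrementally.
A = np.random.randn(8, 3)
Q, R = np.linalg.qr(A)              # reduced QR: Q is 8x3, R is 3x3
a_new = np.random.randn(8)
Q, R = incremental_qr_append(Q, R, a_new)
assert np.allclose(Q @ R, np.hstack([A, a_new[:, None]]))
```

The key property, which motivates its use for online learning, is that each new column costs only a projection onto the existing basis rather than a full decomposition of the grown matrix.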
Funder
Ministry of Health & Welfare, Republic of Korea
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry