Authors:
Andrea Cossu, Gabriele Graffieti, Lorenzo Pellegrini, Davide Maltoni, Davide Bacciu, Antonio Carta, Vincenzo Lomonaco
Abstract
The ability of a model to learn continually can be empirically assessed in different continual learning scenarios. Each scenario defines the constraints and the opportunities of the learning environment. Here, we challenge the current trend in the continual learning literature to experiment mainly on class-incremental scenarios, where classes present in one experience are never revisited. We posit that an excessive focus on this setting may be limiting for future research on continual learning, since class-incremental scenarios artificially exacerbate catastrophic forgetting, at the expense of other important objectives like forward transfer and computational efficiency. In many real-world environments, in fact, repetition of previously encountered concepts occurs naturally and contributes to softening the disruption of previous knowledge. We advocate for a more in-depth study of alternative continual learning scenarios, in which repetition is integrated by design in the stream of incoming information. Starting from already existing proposals, we describe the advantages such class-incremental with repetition scenarios could offer for a more comprehensive assessment of continual learning models.
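The distinction the abstract draws can be made concrete with a small sketch. The snippet below is a hypothetical illustration (not code from the paper): it builds a stream of experiences where, in the plain class-incremental case, each class appears exactly once, while a `repetition_prob` parameter (an assumed name for this sketch) lets previously seen classes resurface in later experiences, as in class-incremental-with-repetition scenarios.

```python
import random

def make_stream(num_classes=10, num_experiences=5, repetition_prob=0.0, seed=0):
    """Sketch of a continual-learning class stream.

    repetition_prob = 0.0 -> plain class-incremental: every class is
    assigned to exactly one experience and never revisited.
    repetition_prob > 0.0 -> class-incremental with repetition: each
    already-seen class reappears in a later experience with this probability.
    """
    rng = random.Random(seed)
    classes = list(range(num_classes))
    rng.shuffle(classes)
    per_exp = num_classes // num_experiences
    stream, seen = [], []
    for e in range(num_experiences):
        new = classes[e * per_exp:(e + 1) * per_exp]
        # Optionally re-inject previously encountered classes.
        repeats = [c for c in seen if rng.random() < repetition_prob]
        stream.append(sorted(new + repeats))
        seen.extend(new)
    return stream

# Plain class-incremental: no class ever repeats across experiences.
ci = make_stream(repetition_prob=0.0)
# With repetition: old classes naturally resurface, softening forgetting.
cir = make_stream(repetition_prob=0.5)
```

Under this toy model, catastrophic forgetting is artificially exacerbated at `repetition_prob = 0.0` because the learner never gets a second look at any class, whereas any positive repetition rate re-exposes it to earlier concepts by design.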
Cited by: 10 articles.