Authors:
Yang Manzhi, Zhang Huaping, Yu Chenxi, Geng Guotong
Publisher:
Springer Nature Singapore
References (23 articles)
1. Aharoni, R., Goldberg, Y.: Unsupervised domain clusters in pretrained language models. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, vol. 1: Long Papers. Association for Computational Linguistics (2020). https://arxiv.org/abs/2004.02105
2. Aljundi, R., Babiloni, F., Elhoseiny, M., Rohrbach, M., Tuytelaars, T.: Memory aware synapses: learning what (not) to forget. In: Proceedings of the European Conference on Computer Vision (ECCV) (2018)
3. Berard, A.: Continual learning in multilingual NMT via language-specific embeddings. In: Proceedings of the Sixth Conference on Machine Translation, pp. 542–565. Association for Computational Linguistics (2021). https://aclanthology.org/2021.wmt-1.62
4. Cao, Y., Wei, H.R., Chen, B., Wan, X.: Continual learning for neural machine translation. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 3964–3974 (2021)
5. Dabre, R., Fujita, A.: Combining sequence distillation and transfer learning for efficient low-resource neural machine translation models. In: Proceedings of the Fifth Conference on Machine Translation, pp. 492–502. Association for Computational Linguistics (2020). https://aclanthology.org/2020.wmt-1.61