Authors:
Li Wen, Xie Yi, Jiang Hailan, Sun Yuqing
Publisher:
Springer Nature Singapore
References (26 articles):
1. Kreutz, C.K., Schenkel, R.: Scientific paper recommendation systems: a literature review of recent publications. Int. J. Digit. Libr. 23(4), 335–369 (2022)
2. Lewis, M., Liu, Y., et al.: BART: denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension. In: Proceedings of the Annual Meeting of the Association for Computational Linguistics, pp. 7871–7880 (2020)
3. Radford, A., Wu, J., et al.: Language models are unsupervised multitask learners. OpenAI Blog 1(8), 9 (2019)
4. Devlin, J., et al.: BERT: pre-training of deep bidirectional transformers for language understanding. In: NAACL-HLT, pp. 4171–4186 (2019)
5. Radford, A., Narasimhan, K., et al.: Improving language understanding by generative pre-training (2018)