Authors:
Ma Jipeng, Hou Hongxu, Zhao Yuan, Sun Shuo, Chen Wei, Shi Guodong
Publisher:
Springer Nature Singapore
References (34 articles):
1. Radford, A., et al.: Language models are unsupervised multitask learners. OpenAI Blog 1(8), 9 (2019)
2. Lewis, M., et al.: BART: denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics (2020)
3. Liu, R., et al.: Knowledge infused decoding. In: International Conference on Learning Representations (2022)
4. Wang, X., Zhou, K., Wen, J.R., Zhao, W.X.: Towards unified conversational recommender systems via knowledge-enhanced prompt learning. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 1929–1937 (2022)
5. Wu, Z., Bi, W., Li, X., Kong, L., Kao, B.: Lexical knowledge internalization for neural dialog generation. In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 7945–7958 (2022)