1. Dinan, E., Roller, S., Shuster, K., Fan, A., Auli, M., Weston, J.: Wizard of Wikipedia: knowledge-powered conversational agents. In: ICLR (2019)
2. Fu, T., Zhao, X., Tao, C., Wen, J., Yan, R.: There are a thousand hamlets in a thousand people’s eyes: enhancing knowledge-grounded dialogue with personal memory. In: ACL (2022)
3. Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. In: ICLR (2015)
4. Lewis, M., et al.: BART: denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension. In: ACL (2020)
5. Liu, Y., et al.: RoBERTa: a robustly optimized BERT pretraining approach. CoRR abs/1907.11692 (2019)