1. Li, J., Tang, T., Zhao, W.X., Nie, J.Y., and Wen, J.R. (2022). Pretrained language models for text generation: A survey. arXiv.
2. Dou, Z.Y., and Peng, N. (2022, February 22–March 1). Zero-shot commonsense question answering with cloze translation and consistency optimization. Proceedings of the AAAI Conference on Artificial Intelligence, Virtual.
3. Guu, K., Lee, K., Tung, Z., Pasupat, P., and Chang, M. (2020, July 12–18). Retrieval augmented language model pre-training. Proceedings of the International Conference on Machine Learning, PMLR, Vienna, Austria.
4. Jin, X., Zhang, D., Zhu, H., Xiao, W., Li, S.W., Wei, X., Arnold, A., and Ren, X. (2021). Lifelong pretraining: Continually adapting language models to emerging corpora. arXiv.
5. Dhingra, B., Cole, J.R., Eisenschlos, J.M., Gillick, D., Eisenstein, J., and Cohen, W.W. (2022). Time-aware language models as temporal knowledge bases. Trans. Assoc. Comput. Linguist., 10, 257–273.