Publisher
Springer Nature Switzerland
References
1. Lee, J., Seo, M., Hajishirzi, H., Kang, J.: Contextualized sparse representations for real-time open-domain question answering. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 912–919. Association for Computational Linguistics (2020)
2. Li, C., Choi, J.D.: Transformers to learn hierarchical contexts in multiparty dialogue for span-based question answering. arXiv:2004.03561v2 (2020)
3. Baheti, A., Ritter, A., Small, K.: Fluent response generation for conversational question answering. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 191–207 (2020)
4. Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M.: RoBERTa: a robustly optimized BERT pretraining approach. arXiv:1907.11692v1 (2019)
5. Devlin, J., Chang, M., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv:1810.04805v2 (2019)