Publisher: Springer Nature Singapore
References
1. Choi, E., et al.: QuAC: question answering in context. In: EMNLP (2018)
2. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: NAACL-HLT (2019)
3. Joshi, M., Choi, E., Weld, D.S., Zettlemoyer, L.: TriviaQA: a large-scale distantly supervised challenge dataset for reading comprehension. In: ACL (2017)
4. Kassner, N., Schütze, H.: Negated and misprimed probes for pretrained language models: birds can talk, but cannot fly. In: ACL (2020)
5. Khandelwal, A., Sawant, S.: NegBERT: a transfer learning approach for negation detection and scope resolution. In: LREC (2020)