1. Bahdanau, D., Cho, K., & Bengio, Y. (2016). Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473.
2. Chen, Q., Allot, A., & Lu, Z. (2020). Keep up with the latest coronavirus research. Nature, 579(7798), 193. https://doi.org/10.1038/d41586-020-00694-1, https://www.ncbi.nlm.nih.gov/pubmed/32157233
3. Colic, N., Furrer, L., & Rinaldi, F. (2020). Annotating the pandemic: Named entity recognition and normalisation in COVID-19 literature. In Proceedings of the 1st Workshop on NLP for COVID-19 (Part 2) at EMNLP 2020. Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.nlpcovid19-2.27, https://aclanthology.org/2020.nlpcovid19-2.27
4. Conneau, A., Khandelwal, K., Goyal, N., Chaudhary, V., Wenzek, G., Guzmán, F., Grave, E., Ott, M., Zettlemoyer, L., & Stoyanov, V. (2020). Unsupervised cross-lingual representation learning at scale. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (pp. 8440–8451). Online: Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.acl-main.747, https://aclanthology.org/2020.acl-main.747
5. Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of deep bidirectional transformers for language understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers) (pp. 4171–4186). Association for Computational Linguistics. https://doi.org/10.18653/v1/N19-1423, https://aclanthology.org/N19-1423