1. Chen, T. Q., Rubanova, Y., Bettencourt, J., and Duvenaud, D. K. (2018). “Neural Ordinary Differential Equations.” In Advances in Neural Information Processing Systems (NeurIPS).
2. Conneau, A., Kiela, D., Schwenk, H., Barrault, L., and Bordes, A. (2017). “Supervised Learning of Universal Sentence Representations from Natural Language Inference Data.” In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 670–680.
3. Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2019). “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.” In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT), pp. 4171–4186.
4. Ishiwatari, S., Hayashi, H., Yoshinaga, N., Neubig, G., Sato, S., Toyoda, M., and Kitsuregawa, M. (2019). “Learning to Describe Unknown Phrases with Local and Global Contexts.” In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT), pp. 3467–3476.
5. Kobayashi, S. (2022). “On Writing ‘Efficient Estimation of Influence of a Training Instance’.” Journal of Natural Language Processing, 29 (2), pp. 699–704. (In Japanese.)