Authors:
Yang DanPing, Li XianXian, Wu Hao, Zhou Aoxiang, Liu Peng
Publisher:
Springer Nature Singapore
References (34 articles):
1. Beltagy, I., Lo, K., Cohan, A.: SciBERT: a pretrained language model for scientific text. In: EMNLP-IJCNLP, pp. 3615–3620 (2019)
2. Chen, H., Chen, B., Zhou, X.: Did the models understand documents? Benchmarking models for language understanding in document-level relation extraction. In: ACL, pp. 6418–6435 (2023)
3. Choi, M., Lim, H., Choo, J.: PRiSM: enhancing low-resource document-level relation extraction with relation-aware score calibration. arXiv preprint arXiv:2309.13869 (2023)
4. Christopoulou, F., Miwa, M., Ananiadou, S.: Connecting the dots: document-level neural relation extraction with edge-oriented graphs. In: EMNLP-IJCNLP, pp. 4925–4936 (2019)
5. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: NAACL, pp. 4171–4186 (2019)