1. Alt, C., Hübner, M., Hennig, L.: Improving relation extraction by pre-trained language representations. arXiv preprint arXiv:1906.03088 (2019)
2. Aydar, M., Bozal, O., Ozbay, F.: Neural relation extraction: a survey. arXiv preprint arXiv:2007.04247 (2020)
3. Chen, G., Tian, Y., Song, Y., Wan, X.: Relation extraction with type-aware map memories of word dependencies. In: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, pp. 2501–2512 (2021)
4. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
5. Trisedya, B.D., Weikum, G., Qi, J., Zhang, R.: Neural relation extraction for knowledge base enrichment. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pp. 229–240 (2019)