Coreference Resolution Based on High-Dimensional Multi-Scale Information
Author:
Wang Yu 1,2, Ding Zenghui 1, Wang Tao 1, Xu Shu 1,2, Yang Xianjun 1, Sun Yining 1
Affiliation:
1. Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei 230031, China
2. Science Island Branch, Graduate School of USTC (University of Science and Technology of China), Hefei 230026, China
Abstract
Coreference resolution is a key task in natural language processing. Evaluating the similarity of long text spans is difficult, which makes document-level encoding challenging. This paper first compares how commonly used methods for improving a model's ability to collect global information affect BERT's encoding performance. On this basis, a multi-scale context information module is designed to improve the applicability of the BERT encoder across different text spans. In addition, linear separability is improved through dimension expansion. Finally, cross-entropy is used as the loss function. After adding the module designed in this article to BERT and SpanBERT, F1 increased by 0.5% and 0.2%, respectively.
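The abstract names two components, a multi-scale context module over encoder outputs and a dimension expansion step for linear separability, without specifying their internals. The sketch below is a minimal, hypothetical NumPy illustration of one plausible reading: pooling encoder outputs over several window sizes and concatenating the views, then lifting features through a random linear map with a nonlinearity. All function names, window sizes, and the expansion factor are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def multi_scale_context(embeddings, window_sizes=(1, 3, 5)):
    """Hypothetical multi-scale context module: for each window size,
    average the encoder outputs over a sliding window centred on each
    token, then concatenate the views so every token carries context
    gathered at several span lengths."""
    seq_len, dim = embeddings.shape
    scales = []
    for w in window_sizes:
        pad = w // 2
        padded = np.pad(embeddings, ((pad, pad), (0, 0)), mode="edge")
        pooled = np.stack(
            [padded[i:i + w].mean(axis=0) for i in range(seq_len)]
        )
        scales.append(pooled)
    # shape: (seq_len, dim * len(window_sizes))
    return np.concatenate(scales, axis=-1)

def expand_dimensions(features, expansion=4, seed=0):
    """Hypothetical dimension expansion: a random linear map followed by
    a ReLU lifts the features into a higher-dimensional space, which can
    make classes easier to separate linearly."""
    rng = np.random.default_rng(seed)
    in_dim = features.shape[-1]
    W = rng.standard_normal((in_dim, in_dim * expansion)) / np.sqrt(in_dim)
    return np.maximum(features @ W, 0.0)

# Stand-in for BERT token outputs: 10 tokens, 16-dim embeddings.
tokens = np.random.default_rng(1).standard_normal((10, 16))
ctx = multi_scale_context(tokens)      # (10, 48)
feat = expand_dimensions(ctx)          # (10, 192)
print(ctx.shape, feat.shape)
```

In a real system the expanded features would feed a mention-pair scorer trained with the cross-entropy loss mentioned in the abstract; the random projection here is only a stand-in for a learned layer.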
Funder
Anhui Provincial Major Science and Technology Project; National Key Research and Development Program of China