Affiliation:
1. Cognitive Computing Lab, Baidu Research USA
Abstract
Recent neural network models have achieved state-of-the-art performance on the task of named entity recognition (NER).
However, previous neural network models typically treat an input sentence as a linear sequence of words, ignoring rich structural information such as the coreference relations among non-adjacent words, phrases, or entities. In this paper, we propose a novel approach to learning coreference-aware word representations for the NER task at the document level. In particular, we enrich the well-known neural architecture ``CNN-BiLSTM-CRF'' with a coreference layer on top of the BiLSTM layer to incorporate coreferential relations. Furthermore, we introduce a coreference regularization that encourages coreferential entities to share similar representations and consistent predictions within the same coreference cluster. Our proposed model achieves new state-of-the-art performance on two NER benchmarks: CoNLL-2003 and OntoNotes v5.0. More importantly, we demonstrate that our framework does not rely on gold coreference annotations and still works well when the coreferential relations are generated by a third-party toolkit.
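The coreference regularization described in the abstract can be illustrated with a minimal sketch. The exact loss used in the paper is not given here, so the following assumes one common formulation: penalizing the squared L2 distance of each mention's representation from its coreference-cluster centroid. The function name `coref_regularization` and the cluster/representation layout are illustrative, not the authors' code.

```python
import numpy as np

def coref_regularization(reps, clusters):
    """Illustrative coreference regularizer (assumption, not the paper's
    exact loss): for each coreference cluster, penalize the squared L2
    distance of every mention representation from the cluster centroid,
    pushing coreferential mentions toward similar representations.

    reps:     (num_mentions, dim) array of mention representations
    clusters: list of lists of mention indices, one list per cluster
    """
    loss = 0.0
    for cluster in clusters:
        vecs = reps[cluster]              # (k, dim) members of one cluster
        centroid = vecs.mean(axis=0)      # cluster centroid
        loss += ((vecs - centroid) ** 2).sum()
    return loss
```

Under this formulation the penalty is zero when all mentions in a cluster already share one representation, and grows as they diverge, which matches the abstract's goal of consistent predictions within a cluster.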
Publisher
International Joint Conferences on Artificial Intelligence Organization
Cited by
9 articles.
1. A Multi-task Learning Model for Gold-two-mention Co-reference Resolution;2023 International Joint Conference on Neural Networks (IJCNN);2023-06-18
2. A brief survey on recent advances in coreference resolution;Artificial Intelligence Review;2023-05-26
3. Exploring developments of the AI field from the perspective of methods, datasets, and metrics;Information Processing & Management;2023-03
4. Named Entity Recognition via Interlayer Attention Residual LSTM;2022 International Joint Conference on Neural Networks (IJCNN);2022-07-18
5. End-to-end Distantly Supervised Information Extraction with Retrieval Augmentation;Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval;2022-07-06