Abstract
Extractive approaches have been the mainstream paradigm for overlapping entity–relation extraction. However, they are limited by inherent methodological flaws and struggle with three issues: hierarchically dependent entity–relations, implicit entity–relations, and entity normalization. Recent advances have proposed an effective alternative based on generative language models, which cast entity–relation extraction as a sequence-to-sequence text generation task. Inspired by the observation that humans learn by getting to the bottom of things, we propose a novel framework, GenRE: Generative multi-turn question answering with contrastive learning for entity–relation extraction. Specifically, a template-based question prompt generation module is first designed to produce questions answered over different turns. We then formulate entity–relation extraction as a generative question answering task built on a general language model, rather than as span-based machine reading comprehension. Meanwhile, a contrastive learning strategy that adds negative samples during fine-tuning is introduced to mitigate the exposure bias inherent in generative models. Extensive experiments demonstrate that GenRE performs competitively on two public datasets and a custom dataset, highlighting its superiority in entity normalization and implicit entity–relation extraction. (The code is available at https://github.com/lovelyllwang/GenRE.)
Funder
National Natural Science Foundation of China
Natural Science Foundation of Xinjiang Province
Natural Science Foundation of Hebei Province
Publisher
Springer Science and Business Media LLC
Reference55 articles.
1. Fader A, Zettlemoyer L, Etzioni O (2014) Open question answering over curated and extracted knowledge bases. In: Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data, pp 1156–1165. https://doi.org/10.1145/2623330.2623677
2. Gupta V, Lehal GS (2010) A survey of text summarization extractive techniques. J Emerg Technol Web Intell 2:258–268. https://doi.org/10.4304/jetwi.2.3.258-268
3. Riedel S, Yao L, McCallum A, Marlin BM (2013) Relation extraction with matrix factorization and universal schemas. In: Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp 74–84. https://aclanthology.org/N13-1008
4. Chan YS, Roth D (2011) Exploiting syntactico-semantic structures for relation extraction. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, pp 551–560. https://aclanthology.org/P11-1056
5. Lin Y, Shen S, Liu Z, Luan H, Sun M (2016) Neural relation extraction with selective attention over instances. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, vol 1, pp 2124–2133. https://doi.org/10.18653/v1/p16-1200