Abstract
As information extraction technology has developed, a variety of entity-relation extraction paradigms have emerged. However, approaches guided by these existing paradigms suffer from insufficient information fusion and overly coarse extraction granularity, making it difficult to extract all triples in a sentence. Moreover, joint entity-relation extraction models cannot easily adapt to the relation extraction task. More fine-grained and flexible extraction methods are therefore needed. In this paper, we propose a new extraction paradigm that builds on existing ones, and based on it we propose SSPC, a method for Span-based Fine-Grained Entity-Relation Extraction via Sub-Prompts Combination. SSPC first decomposes the task into three sub-tasks, namely S,R Extraction, R,O Extraction, and S,R,O Classification, and then uses prompt tuning to fully integrate entity and relation information in each part. This fine-grained extraction framework makes the model easier to adapt to other similar tasks. We conduct experiments on joint entity-relation extraction and relation extraction, respectively. The experimental results show that our model outperforms previous methods and achieves state-of-the-art results on ADE, TACRED, and TACREV.
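The abstract describes decomposing triple extraction into three prompted sub-tasks and then combining their outputs. The sketch below is a minimal, hypothetical illustration of that decomposition only; the template wording, the relation set, and the hard-coded candidate spans are assumptions for illustration, not the paper's actual prompts, datasets, or model.

```python
# Hypothetical sketch of the sub-prompt decomposition described in the abstract:
# (S,R) extraction, (R,O) extraction, and (S,R,O) classification prompts that a
# prompt-tuned model would fill in. All templates and labels here are assumptions.

from itertools import product

RELATIONS = ["Adverse-Effect"]  # illustrative relation set (assumption)

def sr_prompt(sentence: str, relation: str) -> str:
    """Sub-prompt for S,R Extraction: ask which span is the subject of a relation."""
    return f"{sentence} The subject of the relation '{relation}' is [MASK]."

def ro_prompt(sentence: str, relation: str) -> str:
    """Sub-prompt for R,O Extraction: ask which span is the object of a relation."""
    return f"{sentence} The object of the relation '{relation}' is [MASK]."

def sro_prompt(sentence: str, subj: str, rel: str, obj: str) -> str:
    """Sub-prompt for S,R,O Classification: verify a candidate triple."""
    return f"{sentence} Does '{subj}' have the relation '{rel}' with '{obj}'? [MASK]."

def combine(sentence: str, subjects: list[str], objects: list[str]) -> list[str]:
    """Combine candidate spans from the two extraction sub-tasks into
    classification prompts, one per candidate (subject, relation, object) triple."""
    return [
        sro_prompt(sentence, subj, rel, obj)
        for subj, rel, obj in product(subjects, RELATIONS, objects)
    ]

if __name__ == "__main__":
    sent = "The patient developed a rash after taking penicillin."
    # In the actual method, candidate spans would come from the prompt-tuned
    # extraction sub-tasks; they are hard-coded here purely for illustration.
    print(sr_prompt(sent, RELATIONS[0]))
    print(ro_prompt(sent, RELATIONS[0]))
    for p in combine(sent, subjects=["rash"], objects=["penicillin"]):
        print(p)
```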
Funder
National Natural Science Foundation of China
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
Cited by
1 article.
1. Context-aware generative prompt tuning for relation extraction. International Journal of Machine Learning and Cybernetics, 2024-06-17.