TCohPrompt: task-coherent prompt-oriented fine-tuning for relation extraction

Authors:

Long Jun, Yin Zhuoying, Liu Chao, Huang Wenti

Abstract

Prompt-tuning has emerged as a promising approach for improving the performance of classification tasks by converting them into masked language modeling problems through the insertion of text templates. Despite its considerable success, applying this approach to relation extraction is challenging. Predicting the relation, often expressed as a specific word or phrase between two entities, usually requires creating mappings from these terms to an existing lexicon and introducing extra learnable parameters. This can reduce the coherence between the pre-training task and fine-tuning. To address this issue, we propose a novel method for prompt-tuning in relation extraction that aims to enhance the coherence between the fine-tuning and pre-training tasks. Specifically, we avoid the need for a suitable relation word by converting the relation into relational semantic keywords, representative phrases that encapsulate the essence of the relation. Moreover, we employ a composite loss function that optimizes the model at both the token and relation levels. Our approach incorporates the masked language modeling (MLM) loss and the entity pair constraint loss for predicted tokens. For relation-level optimization, we use both the cross-entropy loss and TransE. Extensive experimental results on four datasets demonstrate that our method significantly improves performance on relation extraction tasks, with an average improvement of approximately 1.6 F1 points over the current state-of-the-art model. Code is released at https://github.com/12138yx/TCohPrompt.
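As a rough illustration of the composite objective described in the abstract, the following is a minimal PyTorch-style sketch that sums token-level terms (MLM and entity pair constraint) with relation-level terms (cross-entropy and a TransE-style translation). The loss weights (alpha, beta, gamma), the margin, and the simplified form of each term are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a composite loss of the kind the abstract describes.
# The weighting scheme and the exact form of each term are assumptions.
import torch
import torch.nn.functional as F


def composite_loss(mlm_logits, keyword_ids,       # token level: MLM over masked positions
                   pair_logits, pair_labels,      # token level: entity pair constraint
                   rel_logits, rel_labels,        # relation level: classification
                   head_emb, rel_emb, tail_emb,   # relation level: TransE embeddings
                   alpha=1.0, beta=1.0, gamma=0.5, margin=1.0):
    # 1) MLM loss: predict the relational semantic keywords at the masked
    #    template positions (-100 marks positions to ignore, as in Hugging Face).
    l_mlm = F.cross_entropy(mlm_logits.view(-1, mlm_logits.size(-1)),
                            keyword_ids.view(-1), ignore_index=-100)

    # 2) Entity pair constraint loss on the predicted tokens, sketched here as
    #    a cross-entropy over candidates compatible with the entity pair.
    l_pair = F.cross_entropy(pair_logits, pair_labels)

    # 3) Relation-level cross-entropy over the relation classes.
    l_rel = F.cross_entropy(rel_logits, rel_labels)

    # 4) TransE-style term: the head entity embedding translated by the
    #    relation embedding should lie near the tail entity embedding.
    #    The full TransE objective contrasts against corrupted triples;
    #    this is a simplified hinge on the positive-triple distance only.
    dist = torch.norm(head_emb + rel_emb - tail_emb, p=2, dim=-1)
    l_transe = torch.clamp(dist - margin, min=0.0).mean()

    return l_mlm + alpha * l_pair + beta * l_rel + gamma * l_transe
```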

Funders

Department of Education of Hunan province

National Natural Science Foundation of China

Publisher

Springer Science and Business Media LLC

