Author:
Xiong Jing-Wen, Mao Xian-Ling, Yang Yizhe, Huang Heyan
Abstract
Scientific faceted summarization is the task of generating four summaries of a scientific article, one for each facet: Purpose, Method, Findings, and Value. Existing works usually generate the summary for each facet independently, using pre-training or prompt-tuning paradigms. However, because they do not model the relations among the four facets, these works tend to produce duplicate content across different facets of the same article. To address this redundancy problem, we propose a novel Contrastive Prompt Learning method to Reduce redundancy for Scientific Faceted Summarization, named CPLR-SFS, which generates concise and less-overlapping faceted summaries. Specifically, CPLR-SFS receives a facet-specific prompt to guide generation and employs a faceted contrastive loss to better separate the different faceted summaries in semantic space. Extensive experiments on the FacetSum dataset demonstrate that the proposed model generates better faceted summaries than state-of-the-art baselines, with less redundancy.
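To make the idea of a faceted contrastive loss concrete, the sketch below shows one minimal way such a loss could be computed over the embeddings of the four facet summaries. This is an illustrative assumption, not the paper's actual formulation: it simply applies a hinge penalty when two facet embeddings are more similar (in cosine similarity) than a margin, which pushes the summaries apart in semantic space.

```python
import numpy as np

def faceted_contrastive_loss(embeddings, margin=0.5):
    """Hypothetical sketch of a faceted contrastive loss.

    embeddings: array of shape (num_facets, dim), one row per facet
    summary (e.g. Purpose, Method, Findings, Value). Pairs whose
    cosine similarity exceeds `margin` are penalized, discouraging
    redundant (overlapping) faceted summaries.
    """
    # L2-normalize each facet embedding so dot products are cosines
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = normed @ normed.T  # pairwise cosine similarities
    n = len(embeddings)
    loss, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            # hinge: only penalize pairs more similar than the margin
            loss += max(0.0, sims[i, j] - margin)
            pairs += 1
    return loss / pairs

# Example: four hypothetical facet embeddings of dimension 8
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
print(faceted_contrastive_loss(emb))
```

In a full model this term would be added to the usual generation (cross-entropy) loss, trading off summary quality against inter-facet redundancy via a weighting coefficient.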
Subject
Computer Science Applications, History, Education