Funder
National Social Science Fund of China
National Office for Philosophy and Social Sciences
Subject
Artificial Intelligence, Computer Science Applications, General Engineering
Cited by: 1 article.