Publisher: Springer Nature Switzerland