Affiliation:
1. Marmara University, Turkey
Abstract
Abstractive summarization aims to comprehend a text semantically and to reconstruct it briefly and concisely, so that the summary may contain words that do not appear in the original text. This chapter studies the abstractive Turkish text summarization problem using a transformer-based attention mechanism. It also examines in detail the differences between the transformer architecture and other architectures, as well as the attention block, which is the heart of this architecture. Three summarization datasets were generated from text available on various news websites for training abstractive summarization models. The trained models achieve ROUGE scores higher than or comparable to those of existing studies, and the summaries they generate have better structural properties. An English-to-Turkish translation model was also created and used in a cross-lingual summarization model whose ROUGE score is comparable to existing studies. The summarization structure proposed in this study is the first example of cross-lingual English-to-Turkish text summarization.
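The attention block mentioned above can be illustrated with a minimal sketch of scaled dot-product attention, the core operation of the transformer; this is a generic NumPy illustration of the standard formulation softmax(QK^T / sqrt(d_k))V, not the chapter's actual model, and the toy dimensions are arbitrary.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
    return weights @ V                            # weighted mix of value vectors

# Toy example (hypothetical sizes): 2 query positions, 3 key/value positions, dim 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # each query position receives one mixed value vector
```

Each output row is a convex combination of the value vectors, with mixing weights determined by query-key similarity; stacking several such blocks with learned projections yields the transformer encoder and decoder used for summarization.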