1. Abdullah DM, Chali Y (2020) Towards generating query to perform query focused abstractive summarization using pre-trained model. In: Proceedings of the 13th International conference on natural language generation, pp 80–85
2. Akbik A, Blythe D, Vollgraf R (2018) Contextual string embeddings for sequence labeling. In: Proceedings of the 27th International conference on computational linguistics, pp 1638–1649
3. Devlin J, Chang MW, Lee K, Toutanova K (2018) BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805
4. Dong L, Yang N, Wang W, Wei F, Liu X, Wang Y, Gao J, Zhou M, Hon HW (2019) Unified language model pre-training for natural language understanding and generation. arXiv preprint arXiv:1905.03197
5. Farahani M, Gharachorloo M, Manthouri M (2021) Leveraging ParsBERT and pretrained mT5 for Persian abstractive text summarization. In: 2021 26th International computer conference, computer society of Iran (CSICC). IEEE, New York, pp 1–6