1. Dineshnath, G., Saraswathi, S.: Comprehensive survey for abstractive text summarization. Int. J. Innov. Adv. Comput. Sci. 7, 215–219 (2018)
2. Rush, A.M., Chopra, S., Weston, J.: A neural attention model for abstractive sentence summarization. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal, 17–21 September, pp. 379–389 (2015)
3. Chopra, S., Auli, M., Rush, A.M.: Abstractive sentence summarization with attentive recurrent neural networks. In: Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 93–98 (2016)
4. Nallapati, R., Zhou, B., dos Santos, C., Gulcehre, C., Xiang, B.: Abstractive text summarization using sequence-to-sequence RNNs and beyond. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning (CoNLL), pp. 280–290 (2016)
5. Ranzato, M., Chopra, S., Auli, M., Zaremba, W.: Sequence level training with recurrent neural networks. arXiv:1511.06732 [cs.LG] (2016)