Authors:
Gryaznov Artem, Rybka Roman, Moloshnikov Ivan, Selivanov Anton, Sboev Alexander