Funder: Ministry of Human Resource Development
Publisher: Springer Science and Business Media LLC
Cited by: 1 article.