Publisher
Springer Science and Business Media LLC
Subject
Law, Artificial Intelligence
Cited by
14 articles.