1. Unlimiformer: Long-range transformers with unlimited length input;Bertsch,2023
2. AREDSUM: Adaptive redundancy-aware iterative sentence ranking for extractive document summarization;Bi,2020
3. On attention redundancy: A comprehensive study;Bian,2021
4. Latent dirichlet allocation;Blei,2003
5. AWESOME: GPU memory-constrained long document summarization using memory mechanism and global salient content;Cao;Computation and Language,2023