1. Linformer: Self-attention with linear complexity;wang,2020
2. Blockwise self-attention for long document understanding;qiu,2019
3. Longformer: The long-document transformer;beltagy,2020
4. Generating long sequences with sparse transformers;child,2019
5. MAGNet: A modular accelerator generator for neural networks;venkatesan,2019