1. Alford S et al (2018) Pruned and structurally sparse neural networks. arXiv preprint arXiv:1810.00299
2. Zhu M, Gupta S (2017) To prune, or not to prune: exploring the efficacy of pruning for model compression. arXiv preprint arXiv:1710.01878
3. Wen W, He Y, Rajbhandari S, Zhang M, Wang W, Liu F, Hu B, Chen Y, Li H (2017) Learning intrinsic sparse structures within long short-term memory. arXiv preprint arXiv:1709.05027
4. Furuya T, Suetake K, Taniguchi K, Kusumoto H, Saiin R, Daimon T (2021) Spectral pruning for recurrent neural networks. arXiv preprint arXiv:2105.10832
5. Mao H et al (2017) Exploring the regularity of sparse structure in convolutional neural networks. arXiv preprint arXiv:1705.08922