1. Aravkin, 2014. Fast methods for denoising matrix completion formulations, with applications to robust seismic data interpolation. SIAM J. Sci. Comput.
2. Behrmann, 2019. Invertible residual networks.
3. Chang, 2018. Reversible architectures for arbitrarily deep residual neural networks.
4. Denton, 2014. Exploiting linear structure within convolutional networks for efficient evaluation.
5. Ding, C., Liao, S., Wang, Y., Li, Z., Liu, N., Zhuo, Y., Wang, C., Qian, X., Bai, Y., Yuan, G., Ma, X., Zhang, Y., Tang, J., Qiu, Q., Lin, X., Yuan, B., 2017. CirCNN: Accelerating and compressing deep neural networks using block-circulant weight matrices. In: 2017 50th Annual IEEE/ACM International Symposium on Microarchitecture (MICRO), pp. 395–408.