Authors:
Wang Haotao, Gui Shupeng, Yang Haichuan, Liu Ji, Wang Zhangyang
Publisher:
Springer International Publishing
References: 67 articles.
1. Bulò, S.R., Porzi, L., Kontschieder, P.: Dropout distillation. In: International Conference on Machine Learning, pp. 99–107 (2016)
2. Chen, H., et al.: Frequency domain compact 3D convolutional neural networks. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 1641–1650 (2020)
3. Chen, H., et al.: Distilling portable generative adversarial networks for image translation. In: AAAI Conference on Artificial Intelligence (2020)
4. Chen, H., Wang, Y., Xu, C., Xu, C., Tao, D.: Learning student networks via feature embedding. IEEE Trans. Neural Netw. Learn. Syst. (2020)
5. Chen, H., et al.: Data-free learning of student networks. In: IEEE International Conference on Computer Vision, pp. 3514–3522 (2019)
Cited by:
31 articles.
1. StyleWA: adaptive discriminator-based wavelet knowledge distillation; Journal of Electronic Imaging; 2024-07-16
2. SimSwap++: Towards Faster and High-Quality Identity Swapping; IEEE Transactions on Pattern Analysis and Machine Intelligence; 2024-01
3. More Teachers Make Greater Students: Compression of CycleGAN; IFIP Advances in Information and Communication Technology; 2024
4. Enhancing GAN Compression by Image Probability Distribution Distillation; Pattern Recognition and Computer Vision; 2023-12-28
5. Exploring the Optimal Bit Pair for a Quantized Generator and Discriminator; 2023 IEEE International Conference on Visual Communications and Image Processing (VCIP); 2023-12-04