Funder
Basic and Applied Basic Research Foundation of Guangdong Province
National Natural Science Foundation of China
Subject
Electrical and Electronic Engineering, Hardware and Architecture, Human-Computer Interaction
References (42 articles)
1. G. Hinton, O. Vinyals, J. Dean, Distilling the knowledge in a neural network, arXiv preprint arXiv:1503.02531, 2015.
2. W. Park, D. Kim, Y. Lu, M. Cho, Relational knowledge distillation, in: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019, pp. 3967–3976.
3. N. Passalis, A. Tefas, Learning deep representations with probabilistic knowledge transfer, in: Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 268–284.
4. F. Tung, G. Mori, Similarity-preserving knowledge distillation, in: Proceedings of the IEEE/CVF International Conference on Computer Vision, 2019, pp. 1365–1374.
5. S. Zagoruyko, N. Komodakis, Paying more attention to attention: Improving the performance of convolutional neural networks via attention transfer, arXiv preprint arXiv:1612.03928, 2016.