Authors:
Aguilar Gustavo, Ling Yuan, Zhang Yu, Yao Benjamin, Fan Xing, Guo Chenlei
Abstract
Knowledge distillation is typically conducted by training a small model (the student) to mimic a large and cumbersome model (the teacher). The idea is to compress the knowledge from the teacher by using its output probabilities as soft labels to optimize the student. However, when the teacher is very large, there is no guarantee that the internal knowledge of the teacher will be transferred to the student; even if the student closely matches the soft labels, its internal representations may be considerably different. This internal mismatch can undermine the generalization capabilities originally intended to be transferred from the teacher to the student. In this paper, we propose to distill the internal representations of a large model such as BERT into a simplified version of it. We formulate two ways to distill such representations and various algorithms to conduct the distillation. We experiment with datasets from the GLUE benchmark and consistently show that adding knowledge distillation from internal representations is a more powerful method than only using soft-label distillation.
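The objective the abstract describes can be viewed as a weighted sum of a soft-label term and an internal-representation term. The PyTorch snippet below is a minimal sketch of that combination, not the paper's exact formulation: it assumes the student and teacher hidden states have already been projected to the same shape, the names `distillation_loss`, `temperature`, and `alpha` are placeholders chosen for this illustration, and the MSE on hidden states is a generic stand-in for the paper's internal-distillation losses.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits,
                      student_hidden, teacher_hidden,
                      temperature=2.0, alpha=0.5):
    """Illustrative sketch: soft-label distillation plus an
    internal-representation matching term (not the paper's exact loss)."""
    # Soft-label distillation: KL divergence between temperature-softened
    # teacher and student output distributions.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    soft_loss = F.kl_div(log_probs, soft_targets, reduction="batchmean")
    soft_loss = soft_loss * temperature ** 2  # standard gradient rescaling

    # Internal-representation distillation: here a simple MSE between one
    # student layer and one teacher layer, assumed to share dimensionality.
    internal_loss = F.mse_loss(student_hidden, teacher_hidden)

    return alpha * soft_loss + (1.0 - alpha) * internal_loss
```

In practice the internal term would be applied at selected layer pairs of the teacher and student, so that the student's intermediate representations, and not only its output probabilities, are pulled toward the teacher's.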
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
41 articles.
1. Optimization Algorithm of Visual Multimodal Text Recognition for Public Opinion Analysis Scenarios;Computational and Experimental Simulations in Engineering;2024
2. Multimodal Deep Learning with Boosted Trees for Edge Inference;2023 IEEE International Conference on Data Mining Workshops (ICDMW);2023-12-04
3. Joint knowledge graph approach for event participant prediction with social media retweeting;Knowledge and Information Systems;2023-11-27
4. FCNet: Learning Noise-Free Features for Point Cloud Denoising;IEEE Transactions on Circuits and Systems for Video Technology;2023-11
5. Enhancing Spectrogram for Audio Classification Using Time-Frequency Enhancer;2023 Asia Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC);2023-10-31