Authors:
Wangchunshu Zhou, Canwen Xu, Julian McAuley
Publisher:
Association for Computational Linguistics
Cited by:
16 articles.
1. Knowledge Distillation with Perturbed Loss: From a Vanilla Teacher to a Proxy Teacher;Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining;2024-08-24
2. Multilingual Meta-Distillation Alignment for Semantic Retrieval;Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval;2024-07-10
3. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method;IEEE Transactions on Pattern Analysis and Machine Intelligence;2024-06
4. WLEDD: Legal judgment prediction with legal feature word subgraph label-embedding and dual-knowledge distillation;Journal of Intelligent & Fuzzy Systems;2024-03-22
5. Adapt Your Teacher: Improving Knowledge Distillation for Exemplar-free Continual Learning;2024 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV);2024-01-03