Author:
Wang Yongwei, Shen Tao, Zhang Shengyu, Wu Fan, Zhao Zhou, Cai Haibin, Lyu Chengfei, Ma Lizhuang, Yang Chenglei, Wu Fei
Publisher:
Aerospace Information Research Institute, Chinese Academy of Sciences