Towards Energy Efficient DNN accelerator via Sparsified Gradual Knowledge Distillation
Author:
Affiliation:
1. Georgia Institute of Technology, School of Electrical and Computer Engineering, Atlanta, Georgia, 30332
Funder:
Semiconductor Research Corporation
Publisher:
IEEE
Link:
http://xplorestaging.ieee.org/ielx7/9939277/9939284/09939619.pdf?arnumber=9939619
References (25 articles):
1. Knowledge from the original network: restore a better pruned network with knowledge distillation; Chen; Complex & Intelligent Systems, 2021
2. Learning from Multiple Teacher Networks
3. Knowledge Distillation for Optimization of Quantized Deep Neural Networks
4. Effective Training of Convolutional Neural Networks with Low-bitwidth Weights and Activations
5. PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation
Cited by 4 articles:
1. Denseflex: A Low Rank Factorization Methodology for Adaptable Dense Layers in DNNs; Proceedings of the 21st ACM International Conference on Computing Frontiers; 2024-05-07
2. Twofold Sparsity: Joint Bit- and Network-Level Sparsity for Energy-Efficient Deep Neural Network Using RRAM Based Compute-In-Memory; IEEE Access; 2024
3. Memory-Based Computing for Energy-Efficient AI: Grand Challenges; 2023 IFIP/IEEE 31st International Conference on Very Large Scale Integration (VLSI-SoC); 2023-10-16
4. Towards Highly Compressed CNN Models for Human Activity Recognition in Wearable Devices; 2023 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA); 2023-09-20