Once Quantization-Aware Training: High Performance Extremely Low-bit Architecture Search
Author:
Affiliation:
1. Sensetime Research
2. University of Oxford
3. The University of Sydney
Funder:
Australian Research Council
Publisher:
IEEE
Link:
http://xplorestaging.ieee.org/ielx7/9709627/9709628/09711104.pdf?arnumber=9711104
References (41 articles):
1. Slimmable Neural Networks; Yu, 2018
2. BigNAS: Scaling Up Neural Architecture Search with Big Single-Stage Models; Yu, 2020
3. EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks; Tan, 2019
4. Robust Quantization: One Model to Rule Them All; Shkolnik, 2020
5. Searching for Accurate Binary Neural Architectures
Cited by 14 articles:
1. TensorRT Powered Model for Ultra-Fast Li-Ion Battery Capacity Prediction on Embedded Devices; Energies; 2024-06-07
2. MATAR: Multi-Quantization-Aware Training for Accurate and Fast Hardware Retargeting; 2024 Design, Automation & Test in Europe Conference & Exhibition (DATE); 2024-03-25
3. Stabilized Activation Scale Estimation for Precise Post-Training Quantization; Neurocomputing; 2024-02
4. An Automatic Neural Network Architecture-and-Quantization Joint Optimization Framework for Efficient Model Inference; IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems; 2024
5. QuantNAS for Super Resolution: Searching for Efficient Quantization-Friendly Architectures Against Quantization Noise; IEEE Access; 2024