Affiliation:
1. School of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea
Funders:
National Research Foundation of Korea
Ministry of Science and ICT, South Korea
Non-volatile Memory Cluster Academia Collaboration Program
Publisher:
Institute of Electrical and Electronics Engineers (IEEE)
Subjects:
Computational Theory and Mathematics, Hardware and Architecture, Theoretical Computer Science, Software
Cited by: 1 article.