Affiliation:
1. School of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea
Funder:
National Research Foundation of Korea
Ministry of Science and ICT, South Korea
Non-volatile Memory Cluster Academia Collaboration Program
Publisher:
Institute of Electrical and Electronics Engineers (IEEE)
Subject:
Computational Theory and Mathematics, Hardware and Architecture, Theoretical Computer Science, Software