Affiliation:
1. Department of Computer Science, Emory University, Atlanta, GA
2. Visa Research, Palo Alto, CA
Abstract
Recent advances in deep learning have increased the demand for neural models in real-world applications. In practice, these applications often need to be deployed with limited resources while maintaining high accuracy. This paper addresses a core component of neural models in NLP, word embeddings, and presents an embedding distillation framework that substantially reduces the dimension of word embeddings without compromising accuracy. A new distillation ensemble approach is also proposed that trains a highly efficient student model using multiple teacher models. In our approach, the teacher models play a role only during training, so the student model operates on its own without support from the teacher models during decoding, which makes it as fast and lightweight as any single model. All models are evaluated on seven document classification datasets and show a significant advantage over the teacher models in most cases. Our analysis depicts an insightful transformation of word embeddings through distillation and suggests a future direction for ensemble approaches using neural models.
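The ensemble distillation idea summarized above can be illustrated with a minimal NumPy sketch of standard knowledge distillation from multiple teachers: each teacher's logits are softened with a temperature, averaged into a single soft target distribution, and the student is trained against that target. All function names, the averaging scheme, and the temperature value here are illustrative assumptions for a generic multi-teacher setup, not the paper's exact formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))  # subtract max for stability
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_soft_targets(teacher_logits_list, T=2.0):
    """Average the softened class distributions of several teacher models.
    (Illustrative scheme; the paper's exact ensemble method may differ.)"""
    probs = [softmax(logits, T) for logits in teacher_logits_list]
    return np.mean(probs, axis=0)

def distillation_loss(student_logits, soft_targets, T=2.0):
    """Cross-entropy between the student's softened prediction and the
    averaged teacher distribution; minimized only during training."""
    p = softmax(student_logits, T)
    return float(-np.sum(soft_targets * np.log(p + 1e-12)))
```

At decoding time only the student's own forward pass is used, which is why the distilled model runs as fast as any single model.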
Publisher
International Joint Conferences on Artificial Intelligence Organization
Cited by
9 articles.
1. Word and Character Semantic Fusion by Pretrained Language Models for Text Classification;2024 International Joint Conference on Neural Networks (IJCNN);2024-06-30
2. Sentiment Analysis using DistilBERT;2023 IEEE 11th Conference on Systems, Process & Control (ICSPC);2023-12-16
3. Using BERT with Different Deep Learning Techniques to Classify Reviews;2023 5th International Conference on Advances in Computing, Communication Control and Networking (ICAC3N);2023-12-15
4. Combining static BERT embedding and TCN-CNN ensemble for text classification;2023 5th International Conference on Advances in Computing, Communication Control and Networking (ICAC3N);2023-12-15
5. Impact of word embedding models on text analytics in deep learning environment: a review;Artificial Intelligence Review;2023-02-22