1. Attention-Informed Mixed-Language Training for Zero-Shot Cross-Lingual Task-Oriented Dialogue Systems
2. Simple Data Augmentation for Multilingual NLU in Task-Oriented Dialogue Systems
3. TinyBERT: Distilling BERT for Natural Language Understanding; Jiao, 2019
4. DistilBERT, a Distilled Version of BERT: Smaller, Faster, Cheaper and Lighter; Sanh, 2019
5. (Almost) Zero-Shot Cross-Lingual Spoken Language Understanding; Tur; 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2018