1. Multi-task pre-training for plug-and-play task-oriented dialogue system;Su;Proceedings of the Annual Meeting of the Association for Computational Linguistics,2022
2. Deep Speech 2: End-to-end speech recognition in English and Mandarin;Amodei;International Conference on Machine Learning,2016
3. BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension;Lewis;Proceedings of the Annual Meeting of the Association for Computational Linguistics,2020
4. Exploring the limits of transfer learning with a unified text-to-text transformer;Raffel;Journal of Machine Learning Research,2020
5. WaveNet: A generative model for raw audio;van den Oord;arXiv preprint,2016