1. C. Raffel et al., "Exploring the limits of transfer learning with a unified text-to-text transformer," J. Mach. Learn. Res., 2020.
2. W. U. Ahmad et al., "Unified pre-training for program understanding and generation," NAACL-HLT, 2021.
3. J. Devlin et al., "BERT: Pre-training of deep bidirectional transformers for language understanding," NAACL-HLT, 2019.
4. L. Phan et al., "CoTexT: Multi-task learning with code-text transformer," NLP4Prog Workshop at ACL, 2021.
5. P. Liu et al., "Pre-train, prompt, and predict: A systematic survey of prompting methods in natural language processing," arXiv:2107.13586, 2021.