1. BERT: Pre-training of deep bidirectional transformers for language understanding;Devlin,2019
2. RoBERTa: A robustly optimized BERT pretraining approach;Liu,2019
3. KPT++: Refined knowledgeable prompt tuning for few-shot text classification;Ni;Knowl.-Based Syst.,2023
4. Enhancing low-resource neural machine translation with syntax-graph guided self-attention;Gong;Knowl.-Based Syst.,2022
5. BERT post-training for review reading comprehension and aspect-based sentiment analysis;Xu,2019