Affiliation:
1. Zhejiang University, Hangzhou, China
Funders:
National Key Research and Development Program of China
National Natural Science Foundation of China