Affiliations:
1. Department of Computer Science, Systems and Communications, University of Milan-Bicocca, 20126 Milan, Italy
2. Department of Economics, Management, and Statistics, University of Milan-Bicocca, 20126 Milan, Italy
3. Oaks srl, 20125 Milan, Italy
Abstract
Prompt optimization is a crucial task for improving the performance of large language models on downstream tasks. In this paper, a prompt is a sequence of n-grams selected from a vocabulary, and the aim is to select the prompt that is optimal with respect to a given performance metric. Prompt optimization can thus be cast as a combinatorial optimization problem, where the number of possible prompts (i.e., the size of the combinatorial search space) is the size of the vocabulary (i.e., the number of possible n-grams) raised to the power of the prompt length. Exhaustive search is impractical, so an efficient search strategy is needed. We propose a Bayesian Optimization method performed over a continuous relaxation of the combinatorial search space. Bayesian Optimization is the dominant approach in black-box optimization thanks to its sample efficiency, modular structure, and versatility. We use BoTorch, a library for Bayesian Optimization research built on top of PyTorch. Specifically, we focus on Hard Prompt Tuning, which directly searches for an optimal prompt to be prepended to the text input without requiring access to the internals of the Large Language Model, treating it as a black box (as with GPT-4, which is available only as a Model-as-a-Service). Although preliminary and based on “vanilla” Bayesian Optimization algorithms, our experiments with RoBERTa as the large language model, on six benchmark datasets, show good performance compared with other state-of-the-art black-box prompt optimization methods, and they enable an analysis of the trade-off between search-space size, accuracy, and wall-clock time.
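For illustration, the following is a minimal sketch, not the authors' implementation, of a Bayesian Optimization loop over a continuous relaxation of the prompt space using BoTorch. The prompt length, embedding dimension, and the `score` black box (which would decode a continuous candidate to the nearest vocabulary n-grams, query the LLM, and return the task metric) are hypothetical placeholders introduced here for the sketch.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

prompt_len, d = 4, 8                  # assumed: tokens per prompt, embedding dim
dim = prompt_len * d                  # dimension of the continuous relaxation
bounds = torch.stack([torch.zeros(dim), torch.ones(dim)]).double()

def score(x: torch.Tensor) -> torch.Tensor:
    # Hypothetical black box: in the paper's setting this would map x back to
    # the nearest vocabulary n-grams, prepend the decoded prompt to the input,
    # and return the LLM's performance metric. Placeholder objective here.
    return -(x - 0.5).pow(2).sum(dim=-1, keepdim=True)

X = torch.rand(10, dim, dtype=torch.double)   # initial random candidates
Y = score(X)

for _ in range(20):                           # "vanilla" BO loop
    gp = SingleTaskGP(X, Y)                   # GP surrogate of the metric
    fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))
    acqf = ExpectedImprovement(gp, best_f=Y.max())
    cand, _ = optimize_acqf(acqf, bounds=bounds, q=1,
                            num_restarts=5, raw_samples=64)
    X = torch.cat([X, cand])                  # evaluate and augment the data
    Y = torch.cat([Y, score(cand)])
```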