Prompt Optimization in Large Language Models

Authors:

Antonio Sabbatella ¹, Andrea Ponti ² (ORCID), Ilaria Giordani ³, Antonio Candelieri ² (ORCID), Francesco Archetti ¹

Affiliations:

1. Department of Computer Science, Systems and Communications, University of Milan-Bicocca, 20126 Milan, Italy

2. Department of Economics, Management, and Statistics, University of Milan-Bicocca, 20126 Milan, Italy

3. Oaks srl, 20125 Milan, Italy

Abstract

Prompt optimization is a crucial task for improving the performance of large language models on downstream tasks. In this paper, a prompt is a sequence of n-grams selected from a vocabulary, and the aim is to select the prompt that is optimal with respect to a given performance metric. Prompt optimization can thus be cast as a combinatorial optimization problem, in which the number of possible prompts (i.e., the size of the combinatorial search space) is the size of the vocabulary (i.e., the number of possible n-grams) raised to the power of the prompt length. Exhaustive search is impractical, so an efficient search strategy is needed. We propose a Bayesian Optimization method performed over a continuous relaxation of the combinatorial search space. Bayesian Optimization is the dominant approach in black-box optimization thanks to its sample efficiency, modular structure, and versatility. We use BoTorch, a library for Bayesian Optimization research built on top of PyTorch. Specifically, we focus on Hard Prompt Tuning, which directly searches for an optimal prompt to be added to the text input without requiring access to the Large Language Model, treating it as a black box (as is the case for GPT-4, which is available only as a Model-as-a-Service). Albeit preliminary and based on “vanilla” Bayesian Optimization algorithms, our experiments with RoBERTa as the large language model on six benchmark datasets show good performance compared with other state-of-the-art black-box prompt optimization methods, and enable an analysis of the trade-off between the size of the search space, accuracy, and wall-clock time.
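The counting argument in the abstract can be made concrete as a worked formula; the numbers below are illustrative assumptions, not figures from the paper:

\[
|\mathcal{P}| = |V|^{L}, \qquad \text{e.g. } |V| = 10^{4} \text{ n-grams and } L = 5 \;\Rightarrow\; |\mathcal{P}| = 10^{20} \text{ candidate prompts,}
\]

where \(V\) is the vocabulary and \(L\) is the prompt length. Even for a modest vocabulary and a short prompt, enumeration is hopeless, which is what motivates a sample-efficient search such as Bayesian Optimization.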
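The Bayesian Optimization loop over a continuous relaxation can be sketched with BoTorch's standard components. The following is a minimal sketch, not the authors' implementation: the objective black_box_accuracy is a synthetic stand-in for "project the continuous point to the nearest prompt tokens, query the LLM, and return the task metric", and the dimension D, initial design size, and iteration budget are arbitrary assumptions.

    import torch
    from botorch.models import SingleTaskGP
    from botorch.fit import fit_gpytorch_mll
    from botorch.acquisition import ExpectedImprovement
    from botorch.optim import optimize_acqf
    from gpytorch.mlls import ExactMarginalLogLikelihood

    # Continuous relaxation: a prompt is represented as a point in a
    # D-dimensional box; a nearest-neighbour projection (not shown here)
    # would map it back to vocabulary n-grams before querying the LLM.
    D = 10  # relaxed search-space dimension (assumption)
    bounds = torch.stack([torch.zeros(D), torch.ones(D)]).double()

    def black_box_accuracy(Z: torch.Tensor) -> torch.Tensor:
        # Stand-in for the real objective: project Z to the nearest prompt,
        # run the LLM on the downstream task, return validation accuracy.
        return -((Z - 0.5) ** 2).sum(dim=-1, keepdim=True)  # synthetic

    # Initial design of experiments
    train_X = torch.rand(8, D, dtype=torch.double)
    train_Y = black_box_accuracy(train_X)

    for _ in range(20):  # BO iterations (budget is an assumption)
        # Fit a Gaussian Process surrogate on the evaluations so far
        gp = SingleTaskGP(train_X, train_Y)
        fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))
        # Maximize Expected Improvement to pick the next prompt candidate
        acq = ExpectedImprovement(gp, best_f=train_Y.max())
        cand, _ = optimize_acqf(acq, bounds=bounds, q=1,
                                num_restarts=5, raw_samples=64)
        train_X = torch.cat([train_X, cand])
        train_Y = torch.cat([train_Y, black_box_accuracy(cand)])

    print("best score:", train_Y.max().item())

The same loop structure carries over to the real setting; only black_box_accuracy changes, which is what makes the approach applicable to models exposed purely as a service.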

Publisher

MDPI AG

References: 32 articles (first five listed).

1. Archetti, F., and Candelieri, A. (2019). Bayesian Optimization and Data Science, Springer International Publishing.

2. Garnett, R. (2023). Bayesian Optimization, Cambridge University Press.

3. Balandat, M., Karrer, B., Jiang, D., Daulton, S., Letham, B., Wilson, A.G., and Bakshy, E. (2020). BoTorch: A framework for efficient Monte-Carlo Bayesian optimization. Adv. Neural Inf. Process. Syst.

4. Lester, B., Al-Rfou, R., and Constant, N. (2021). The power of scale for parameter-efficient prompt tuning. arXiv.

5. Wei, J., Wang, X., Schuurmans, D., Bosma, M., Ichter, B., Xia, F., Chi, E., Le, Q., and Zhou, D. (2022). Chain-of-thought prompting elicits reasoning in large language models. Adv. Neural Inf. Process. Syst.
