Affiliation:
1. Department of Mathematics, University of California–Davis, Davis, California 95616;
2. Department of Statistics, University of California–Davis, Davis, California 95616
Abstract
Functionally constrained stochastic optimization problems, where neither the objective function nor the constraint functions are analytically available, arise frequently in machine learning applications. In this work, assuming we only have access to noisy evaluations of the objective and constraint functions, we propose and analyze stochastic zeroth-order algorithms for solving this class of stochastic optimization problems. When the domain of the functions is ℝ^d, assuming there are m constraint functions, we establish oracle complexities of order O((m + 1)d/ϵ²) and O((m + 1)d/ϵ³) in the convex and nonconvex settings, respectively, where ϵ represents the accuracy of the solutions required in appropriately defined metrics. The established oracle complexities are, to our knowledge, the first such results in the literature for functionally constrained stochastic zeroth-order optimization problems. We demonstrate the applicability of our algorithms by illustrating their superior performance on the problem of hyperparameter tuning for sampling algorithms and neural network training.
Funding: K. Balasubramanian was partially supported by a seed grant from the Center for Data Science and Artificial Intelligence Research, University of California–Davis, and the National Science Foundation [Grant DMS-2053918].
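For intuition about how an algorithm can proceed with only noisy function evaluations, the following minimal Python sketch shows the standard two-point Gaussian-smoothing gradient estimator commonly used in zeroth-order optimization. It is illustrative only, not the authors' algorithm; the name zo_gradient and the parameters mu and num_samples are hypothetical choices for this sketch.

import numpy as np

def zo_gradient(f, x, mu=1e-4, num_samples=20, rng=None):
    # Two-point Gaussian-smoothing gradient estimate of f at x,
    # built from noisy zeroth-order evaluations of f alone.
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)  # random Gaussian direction
        g += ((f(x + mu * u) - f(x)) / mu) * u
    return g / num_samples

# Example: estimate the gradient of a noisy quadratic at x = (1, 2).
noisy_f = lambda x: float(x @ x + 0.01 * np.random.randn())
print(zo_gradient(noisy_f, np.array([1.0, 2.0])))

The variance of such estimators grows with the dimension d, which is one way to see where the factor of d in the oracle complexities above comes from.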
Publisher
Institute for Operations Research and the Management Sciences (INFORMS)