Affiliation:
1. Ohio State University, Columbus, OH
Abstract
In this article we discuss the application of a certain class of Monte Carlo methods to stochastic optimization problems. In particular, we study variable-sample techniques, in which the objective function is replaced, at each iteration, by a sample average approximation. We first provide general results on the schedule of sample sizes under which variable-sample methods yield consistent estimators, as well as bounds on the estimation error. Because the convergence analysis is performed pathwise, we are able to obtain our results in a flexible setting that requires mild assumptions on the distributions and that allows different sampling distributions to be used along the algorithm. We illustrate these ideas by studying a modification of the well-known pure random search method, adapting it to the variable-sample scheme, and show conditions for convergence of the algorithm. Implementation issues are discussed and numerical results are presented to illustrate the ideas.
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Science Applications; Modelling and Simulation
Cited by: 95 articles.