Author:
Melda Ormeci Matoglu, John H. Vande Vate, Haiyue Yu
Abstract
In this paper we introduce and solve a generalization of the classic average cost Brownian control problem in which a system manager dynamically controls the drift rate of a diffusion process X. At each instant, the system manager chooses the drift rate from a pair {u, v} of available rates and can invoke instantaneous controls either to keep X from falling or to keep it from rising. The objective is to minimize the long-run average cost, which consists of holding or delay costs, processing costs, costs for invoking instantaneous controls, and fixed costs for changing the drift rate. We provide necessary and sufficient conditions on the cost parameters to ensure the problem admits a finite optimal solution. When it does, a simple control band policy specifying economic buffer sizes (α, Ω) and up to two switching points is optimal. The controller should invoke instantaneous controls to keep X in the interval (α, Ω). A policy with no switching points relies on a single drift rate exclusively. When there is no cost to change the drift rate, a policy with a single switching point s indicates that the controller should change to the slower drift rate when X exceeds s and use the faster drift rate otherwise. When there is a cost to change the drift rate, a policy with two switching points s < S indicates that the controller should maintain the faster drift rate until X exceeds S and maintain the slower drift rate until X falls below s.
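The following is a minimal simulation sketch of the control band policy described in the abstract, not the authors' method: it discretizes the controlled diffusion with an Euler-Maruyama step, applies instantaneous controls at the buffer limits α and Ω, and switches drift rates with hysteresis at the two switching points s < S. All parameter values, the function name simulate_control_band, and the sign convention for the "faster" and "slower" drift rates are illustrative assumptions.

```python
import numpy as np

def simulate_control_band(n_steps=100_000, dt=1e-3, x0=0.0,
                          fast_rate=1.0, slow_rate=-0.5, sigma=1.0,
                          alpha=0.0, omega=5.0, s=1.0, S=4.0, seed=0):
    """Illustrative simulation of a two-switching-point control band
    policy (alpha, s, S, omega); values are hypothetical."""
    rng = np.random.default_rng(seed)
    x = x0
    use_fast = True              # start with the faster drift rate
    lower_push = upper_push = 0.0  # cumulative instantaneous controls
    path = np.empty(n_steps)
    for k in range(n_steps):
        drift = fast_rate if use_fast else slow_rate
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        # Instantaneous controls keep X inside the interval [alpha, omega]
        if x < alpha:
            lower_push += alpha - x
            x = alpha
        elif x > omega:
            upper_push += x - omega
            x = omega
        # Hysteresis switching: keep the faster rate until X exceeds S,
        # keep the slower rate until X falls back below s
        if use_fast and x > S:
            use_fast = False
        elif not use_fast and x < s:
            use_fast = True
        path[k] = x
    return path, lower_push, upper_push

if __name__ == "__main__":
    path, lo, hi = simulate_control_band()
    print(f"mean level {path.mean():.3f}, lower control {lo:.3f}, upper control {hi:.3f}")
```

With no switching cost, the same sketch covers the single-switching-point case by setting s = S; the zero-switching-point case corresponds to never toggling use_fast.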
Publisher
Cambridge University Press (CUP)
Subject
Applied Mathematics, Statistics and Probability
Cited by
3 articles.