Abstract
Recently P. L. Lions demonstrated the connection between the value function of stochastic optimal control and a viscosity solution of the Hamilton-Jacobi-Bellman equation [cf. 10, 11, 12]. The purpose of this paper is to partially extend his results to stochastic differential games, in which two players are in conflict with each other. If the value function of a stochastic differential game is smooth enough, then it satisfies a second-order partial differential equation with max-min or min-max type nonlinearity, called the Isaacs equation [cf. 5]. Since, under some mild conditions, a nonlinear function can be written as the min-max of appropriate affine functions, stochastic differential game theory provides convenient representation formulas for solutions of nonlinear partial differential equations [cf. 1, 2, 3].
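The Isaacs equation referred to in the abstract can be sketched schematically as follows. This is a generic form, not the paper's exact notation: the control sets A and B, diffusion coefficient σ, drift g, and running cost f are placeholder symbols, and which player takes the min and which the max depends on whether the upper or lower value is considered.

```latex
% Schematic (upper) Isaacs equation for the value function u(t,x)
% of a two-player zero-sum stochastic differential game:
% one player maximizes over a in A, the opponent minimizes over b in B.
\[
\partial_t u
+ \min_{b \in B}\,\max_{a \in A}
\Bigl\{
  \tfrac{1}{2}\,\operatorname{tr}\!\bigl(\sigma\sigma^{\!\top}(x,a,b)\,D^2 u\bigr)
  + g(x,a,b)\cdot Du
  + f(x,a,b)
\Bigr\} = 0 .
\]
```

When the min and max can be interchanged (the Isaacs condition), the upper and lower value functions coincide; the min-max structure over the affine-in-(Du, D²u) expressions in braces is what yields the representation formulas mentioned above.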
Publisher
Cambridge University Press (CUP)
References (17 articles)
1. Lions, P. L. Optimal control of diffusion processes and Hamilton-Jacobi-Bellman equations, Part 1: The dynamic programming principle and applications. Comm. P. D. E., 1983.
2. A uniqueness result for the semigroup associated with the Hamilton-Jacobi-Bellman operator
3. Controlled Diffusion Processes
Cited by 32 articles.