Open Access
Optimal control of branching diffusion processes: A finite horizon problem
Julien Claisse
Ann. Appl. Probab. 28(1): 1-34 (February 2018). DOI: 10.1214/17-AAP1290

Abstract

In this paper, we aim to develop the stochastic control theory of branching diffusion processes where both the movement and the reproduction of the particles depend on the control. More precisely, we study the problem of minimizing the expected value of the product of individual costs penalizing the final position of each particle. In this setting, we show that the value function is the unique viscosity solution of a nonlinear parabolic PDE, that is, the Hamilton–Jacobi–Bellman equation corresponding to the problem. To this end, we extend the dynamic programming approach initiated by Nisio [J. Math. Kyoto Univ. 25 (1985) 549–575] to deal with the lack of independence between the particles as well as between the reproduction and the movement of each particle. In particular, we exploit the particular form of the optimization criterion to derive a weak form of the branching property. In addition, we provide a precise formulation and a detailed justification of the adequate dynamic programming principle.
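The product-form criterion and its HJB characterization can be sketched schematically as follows. The notation ($b$, $\sigma$ for the controlled drift and diffusion coefficients, $\gamma$ for the branching rate, $p_k$ for the offspring distribution, $g$ for the terminal cost) is the standard one for controlled branching diffusions; this is an illustrative reconstruction, not a formula quoted from the paper.

```latex
\[
v(t,x) \;=\; \inf_{\alpha}\,
\mathbb{E}\Big[\, \prod_{i=1}^{N_T} g\big(X_T^{i}\big) \,\Big],
\]
where the infimum runs over admissible controls $\alpha$, $N_T$ denotes the
number of particles alive at the horizon $T$, and $X_T^{i}$ their positions.
The associated Hamilton--Jacobi--Bellman equation then reads, schematically,
\[
\partial_t v
+ \inf_{a \in A}\Big\{\, b(x,a)\cdot Dv
+ \tfrac{1}{2}\operatorname{tr}\!\big(\sigma\sigma^{\top}(x,a)\, D^2 v\big)
+ \gamma(x,a)\Big(\sum_{k \ge 0} p_k(x,a)\, v^{k} - v\Big) \Big\} = 0,
\qquad v(T,x) = g(x).
\]
```

Note how the product structure of the cost produces the polynomial nonlinearity $\sum_k p_k v^k$ in $v$: each branching event at rate $\gamma$ replaces one particle by $k$ independent copies with probability $p_k$, and the criterion multiplies their contributions.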

Citation


Julien Claisse. "Optimal control of branching diffusion processes: A finite horizon problem." Ann. Appl. Probab. 28 (1) 1 - 34, February 2018. https://doi.org/10.1214/17-AAP1290

Information

Received: 1 September 2016; Revised: 1 December 2016; Published: February 2018
First available in Project Euclid: 3 March 2018

zbMATH: 06873678
MathSciNet: MR3770871
Digital Object Identifier: 10.1214/17-AAP1290

Subjects:
Primary: 60J60 , 60J80 , 93E20
Secondary: 49L20 , 49L25 , 60J70 , 60J85

Keywords: branching diffusion process , dynamic programming principle , Hamilton–Jacobi–Bellman equation , stochastic control , viscosity solution

Rights: Copyright © 2018 Institute of Mathematical Statistics
