Finite-horizon optimality for continuous-time Markov decision processes with unbounded transition rates
Xianping Guo, Xiangxiang Huang, Yonghui Huang
Adv. in Appl. Probab. 47(4): 1064-1087 (December 2015). DOI: 10.1239/aap/1449859800

Abstract

In this paper we study finite-horizon optimality for denumerable continuous-time Markov decision processes in which the transition and reward/cost rates are allowed to be unbounded, and optimality is taken over the class of all randomized history-dependent policies. Under mild and reasonable conditions, we first establish the existence of a solution to the finite-horizon optimality equation by developing an approximation technique that passes from bounded transition rates to unbounded ones. We then prove the existence of ε (≥ 0)-optimal Markov policies and verify that the value function is the unique solution to the optimality equation by establishing an analog of the Itô-Dynkin formula. Finally, we provide an example in which the transition rates and the value function are both unbounded, thereby obtaining solutions to some of the problems left open by Yushkevich (1978).
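For context, the finite-horizon optimality equation for a denumerable-state continuous-time Markov decision process typically takes the following Hamilton-Jacobi-Bellman form. The notation below (value function u, state space S, admissible action sets A(i), transition rates q(j|i,a), reward rate r(i,a), horizon T, terminal reward g) is the standard one and is an assumption for illustration, not a quotation of the paper's exact statement or conditions:

```latex
% Standard finite-horizon optimality (HJB) equation for a CTMDP,
% with value function u(t,i) on [0,T] x S; notation assumed.
\begin{aligned}
  &\frac{\partial u(t,i)}{\partial t}
    + \sup_{a \in A(i)} \Big[\, r(i,a)
    + \sum_{j \in S} q(j \mid i, a)\, u(t,j) \Big] = 0,
    \qquad (t,i) \in [0,T) \times S, \\
  &u(T,i) = g(i), \qquad i \in S .
\end{aligned}
```

When the rates q(j|i,a) are unbounded, a solution cannot in general be obtained directly; the approximation technique described in the abstract first solves the equation with bounded (e.g. truncated) transition rates and then passes to the limit.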

Citation


Xianping Guo, Xiangxiang Huang, Yonghui Huang. "Finite-horizon optimality for continuous-time Markov decision processes with unbounded transition rates." Adv. in Appl. Probab. 47 (4), 1064-1087, December 2015. https://doi.org/10.1239/aap/1449859800

Information

Published: December 2015
First available in Project Euclid: 11 December 2015

zbMATH: 1330.90125
MathSciNet: MR3433296
Digital Object Identifier: 10.1239/aap/1449859800

Subjects:
Primary: 90C40
Secondary: 60J27, 93E20

Keywords: continuous-time Markov decision process, finite-horizon criterion, optimal Markov policy, randomized history-dependent policy, unbounded transition rate

Rights: Copyright © 2015 Applied Probability Trust

JOURNAL ARTICLE
24 PAGES

