Optimal control of Markovian jump processes with partial information and applications to a parallel queueing model

Article ID: iaor200972015
Country: Germany
Volume: 70
Issue: 3
Start Page Number: 567
End Page Number: 596
Publication Date: Dec 2009
Journal: Mathematical Methods of Operations Research
Authors: ,
Abstract:

We consider a stochastic control problem over an infinite horizon in which the state process is influenced by an unobservable environment process. In particular, the Hidden Markov model and the Bayesian model are included. This model under partial information is transformed into an equivalent one with complete information by using the well-known filter technique; in particular, the optimal controls and the value functions of the original and the transformed problem coincide. An explicit representation of the filter process, which is a piecewise deterministic process, is also given. We then propose two solution techniques for the transformed model. First, a generalized verification technique (with a generalized Hamilton–Jacobi–Bellman equation) is formulated in which the strict differentiability of the value function is weakened to local Lipschitz continuity. Second, we present a discrete-time Markovian decision model by which an optimal control of the given problem can be computed. In this context we also state a general existence result for optimal controls. The power of both solution techniques is finally demonstrated for a parallel queueing model with unknown service rates. In particular, the filter process is discussed in detail, the value function is computed explicitly, and the optimal control is completely characterized in the symmetric case.
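To illustrate the piecewise deterministic structure of the filter mentioned in the abstract, here is a minimal sketch (not taken from the paper) for the Bayesian case of a single server whose exponential service rate is either a known high value mu_hi or a known low value mu_lo. Between observed service completions the posterior probability of the high rate drifts deterministically (Bayes' rule applied to "no completion yet"); at each completion it jumps by a Bayes update. The function names drift, jump and simulate_filter and all numerical values are hypothetical and chosen only for this example.

```python
import math
import random


def drift(p, dt, mu_hi, mu_lo):
    """Deterministic filter motion while the server is busy and no
    completion has been observed during an interval of length dt."""
    num = p * math.exp(-mu_hi * dt)
    den = num + (1.0 - p) * math.exp(-mu_lo * dt)
    return num / den


def jump(p, mu_hi, mu_lo):
    """Bayes update of the posterior at an observed service completion."""
    num = p * mu_hi
    return num / (num + (1.0 - p) * mu_lo)


def simulate_filter(mu_true, mu_hi, mu_lo, p0=0.5, n_completions=20, seed=1):
    """Simulate successive service completions of one server with unknown
    rate and track the posterior probability that the rate equals mu_hi."""
    rng = random.Random(seed)
    p, t, path = p0, 0.0, [(0.0, p0)]
    for _ in range(n_completions):
        s = rng.expovariate(mu_true)      # actual (unobserved-rate) service time
        p = drift(p, s, mu_hi, mu_lo)     # deterministic piece up to the jump
        p = jump(p, mu_hi, mu_lo)         # jump at the completion epoch
        t += s
        path.append((t, p))
    return path


if __name__ == "__main__":
    for t, p in simulate_filter(mu_true=2.0, mu_hi=2.0, mu_lo=0.5):
        print(f"t = {t:6.2f}   P(rate = high) = {p:.3f}")
```

In a parallel queueing model one such posterior would be carried per server, and the resulting belief state would feed into the routing decision; the discrete-time Markovian decision model described in the abstract would then be solved on this enlarged (filtered) state space.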
