Article ID: iaor200942226
Country: United States
Volume: 56
Issue: 6
Start Page Number: 1507
End Page Number: 1525
Publication Date: Nov 2008
Journal: Operations Research
Authors: Queyranne Maurice, Puterman Martin L, Patrick Jonathan
Keywords: queues: applications, programming: dynamic
We present a method to dynamically schedule patients with different priorities to a diagnostic facility in a public health-care setting. Rather than seeking to maximize revenue, the resource manager must dynamically allocate available capacity to incoming demand so as to achieve wait-time targets in a cost-effective manner. We model the scheduling process as a Markov decision process. Because the state space is too large for a direct solution, we solve the equivalent linear program through approximate dynamic programming. For a broad range of cost parameter values, we present analytical results that give the form of the optimal linear value function approximation and the resulting policy. We investigate the practical implications and the quality of the policy through simulation.
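The abstract describes the approximate-linear-programming route: replace the exact value function with a linear (here affine) approximation, impose the Bellman inequalities as LP constraints, and recover a scheduling rule that is greedy with respect to the fitted approximation. The sketch below is illustrative only and is not the authors' model or code: it uses a toy two-priority queue with made-up arrival rates, holding costs, capacity, and a truncated state space small enough to enumerate every constraint (in a realistic instance one would exploit structure or sample constraints, as the paper's analytical results make unnecessary).

```python
"""Minimal sketch of the LP approach to approximate dynamic programming
for a toy two-priority scheduling queue. All model parameters are
hypothetical and chosen only to keep the example small and runnable."""
import itertools
import numpy as np
from scipy.optimize import linprog

# --- toy model (illustrative numbers, not from the paper) ---
QMAX, CAP, GAMMA = 10, 3, 0.95   # queue cap, daily capacity, discount factor
LAM = (2, 1)                     # deterministic arrivals per class per period
HOLD = (5.0, 1.0)                # per-period waiting cost per class

states = list(itertools.product(range(QMAX + 1), repeat=2))

def actions(s):
    """Feasible bookings (a1, a2): serve at most CAP patients, none in excess of the queue."""
    q1, q2 = s
    for a1 in range(min(q1, CAP) + 1):
        for a2 in range(min(q2, CAP - a1) + 1):
            yield a1, a2

def step(s, a):
    """Next state (post-service queues plus arrivals, capped) and one-period cost."""
    q1, q2 = s
    a1, a2 = a
    nxt = (min(q1 - a1 + LAM[0], QMAX), min(q2 - a2 + LAM[1], QMAX))
    cost = HOLD[0] * (q1 - a1) + HOLD[1] * (q2 - a2)
    return nxt, cost

def phi(s):
    """Affine basis: V(q1, q2) ~ r0 + r1*q1 + r2*q2."""
    return np.array([1.0, s[0], s[1]])

# ALP: maximize sum_s nu(s) * phi(s)@r
#      subject to phi(s)@r <= cost(s, a) + GAMMA * phi(s')@r  for every (s, a)
A_ub, b_ub = [], []
for s in states:
    for a in actions(s):
        nxt, cost = step(s, a)
        A_ub.append(phi(s) - GAMMA * phi(nxt))
        b_ub.append(cost)

nu = np.ones(len(states)) / len(states)                 # uniform state-relevance weights
c_obj = -sum(w * phi(s) for w, s in zip(nu, states))    # linprog minimizes, so negate

res = linprog(c_obj, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(None, None)] * 3, method="highs")
r = res.x
print("fitted value function: V(q1,q2) ~ %.2f + %.2f*q1 + %.2f*q2" % tuple(r))

def policy(s):
    """Greedy booking decision induced by the fitted approximation."""
    def q_value(a):
        nxt, cost = step(s, a)
        return cost + GAMMA * phi(nxt) @ r
    return min(actions(s), key=q_value)

print("action in state (8, 3):", policy((8, 3)))
```

Because any value function satisfying the inequality constraints is a lower bound on the true optimal cost-to-go, the LP pushes the affine approximation up toward it; the coefficients on the queue lengths then reflect the relative urgency of each priority class, which is what drives the greedy booking rule.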