Article ID: | iaor20133938 |
Volume: | 206 |
Issue: | 1 |
Start Page Number: | 401 |
End Page Number: | 423 |
Publication Date: | Jul 2013 |
Journal: | Annals of Operations Research |
Authors: | Kolisch Rainer, Schütz Hans-Jörg |
Keywords: | service, programming: Markov decision |
We consider a problem where different classes of customers can book different types of services in advance and the service company has to respond immediately to the booking request by confirming or rejecting it. Due to the possibility of cancellations before the day of service, or no-shows on the day of service, overbooking the given capacity is a viable decision. The objective of the service company is to maximize the profit composed of class- and type-specific revenues, refunds for cancellations or no-shows, and the cost of overtime. For the calculation of the latter, information on the underlying appointment schedule is required. Throughout the paper we relate the problem to capacity allocation in radiology services. Drawing upon ideas from revenue management, overbooking, and appointment scheduling, we model the problem as a Markov decision process in discrete time which, by means of a suitable aggregation, can be solved optimally with an iterative stochastic dynamic programming approach. In an experimental study we successfully apply the approach to a real-world problem with data from the radiology department of a hospital. Furthermore, we compare the optimal policy to four heuristic policies, one of which is currently in use. We show that the optimal policy significantly improves upon the currently used policy, and that a nested booking-limit type policy closely approximates the optimal policy and is thus recommended for use in practice.
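To make the modeling idea concrete, the following is a minimal sketch of a discrete-time booking MDP solved by backward induction (stochastic dynamic programming), assuming a heavily simplified single-class setting: at most one request per period, binomially distributed cancellations and no-shows, and a linear overtime cost beyond capacity. All names, parameters, and dynamics below are hypothetical illustrations and do not reproduce the authors' multi-class model, appointment-schedule coupling, or aggregation scheme.

```python
"""A minimal, single-class sketch of a discrete-time booking MDP of the kind
described in the abstract, solved by backward induction. All parameters and
simplified dynamics are hypothetical and not taken from the paper."""
from functools import lru_cache
from math import comb

# Hypothetical problem data
T = 20           # number of booking periods before the day of service
B_MAX = 15       # maximum bookings kept on file (bounds the state space)
CAP = 10         # regular capacity on the day of service
P_REQ = 0.6      # probability of one booking request per period
P_CANCEL = 0.05  # per-booking cancellation probability per period
P_SHOW = 0.9     # per-booking show-up probability on the day of service
REVENUE = 100.0  # revenue per served customer
REFUND = 20.0    # refund paid per cancellation or no-show
OVERTIME = 60.0  # overtime cost per customer served beyond regular capacity


def binom_pmf(k: int, n: int, p: float) -> float:
    """Probability of k successes in n independent trials."""
    return comb(n, k) * p ** k * (1.0 - p) ** (n - k)


@lru_cache(maxsize=None)
def terminal_value(b: int) -> float:
    """Expected profit on the day of service with b bookings on file."""
    value_at_service = 0.0
    for shows in range(b + 1):
        prob = binom_pmf(shows, b, P_SHOW)
        profit = (REVENUE * shows
                  - REFUND * (b - shows)             # refunds for no-shows
                  - OVERTIME * max(shows - CAP, 0))  # cost of overtime
        value_at_service += prob * profit
    return value_at_service


@lru_cache(maxsize=None)
def profit_to_go(t: int, b: int) -> float:
    """Optimal expected profit from period t onwards with b confirmed bookings."""
    if t == T:
        return terminal_value(b)

    def after_cancellations(bookings: int) -> float:
        # Expectation over cancellations during the current period;
        # each cancellation triggers a refund payment.
        ev = 0.0
        for cancels in range(bookings + 1):
            prob = binom_pmf(cancels, bookings, P_CANCEL)
            ev += prob * (profit_to_go(t + 1, bookings - cancels) - REFUND * cancels)
        return ev

    # With probability P_REQ a request arrives and we accept or reject it;
    # accepting beyond CAP amounts to deliberately overbooking.
    accept = after_cancellations(min(b + 1, B_MAX))
    reject = after_cancellations(b)
    return P_REQ * max(accept, reject) + (1.0 - P_REQ) * after_cancellations(b)


if __name__ == "__main__":
    print(f"Optimal expected profit from an empty book: {profit_to_go(0, 0):.2f}")
```

In this toy version, a booking-limit heuristic of the kind mentioned in the abstract would simply replace the max(accept, reject) comparison with a fixed threshold on the number of confirmed bookings.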