Article ID: | iaor20063679 |
Country: | Netherlands |
Volume: | 168 |
Issue: | 2 |
Start Page Number: | 509 |
End Page Number: | 519 |
Publication Date: | Jan 2006 |
Journal: | European Journal of Operational Research |
Authors: | Down Douglas G., Lewis Mark E. |
Keywords: | markov processes, programming: dynamic |
We consider a system of parallel queues with dedicated arrival streams. At each decision epoch a decision-maker can move customers from one queue to another. The cost for moving customers consists of a fixed cost and a linear, variable cost that depends on the number of customers moved. There are also linear holding costs that may depend on the queue in which customers are stored. Under very mild assumptions, we develop stability (and instability) conditions for this system via a fluid model. Under the assumption of stability, we consider minimizing the long-run average cost. In the case of two servers, the optimal control policy is shown to prefer to store customers in the lowest-cost queue. When the inter-arrival and service times are assumed to be exponential, we use a Markov decision process formulation to show that for a fixed number of customers in the system, there exists a level
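The exponential case described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's method or parameters: a two-queue uniformized MDP with Poisson arrivals, exponential services, fixed-plus-linear moving costs, and queue-dependent linear holding costs, solved by discounted value iteration on a truncated state space (the paper treats the long-run average-cost criterion; discounting and the parameter values here are simplifying assumptions for illustration).

```python
import itertools

# Illustrative parameters (assumed, not from the paper)
lam = (0.3, 0.3)   # arrival rates to queues 0 and 1
mu = (1.0, 1.0)    # service rates
h = (1.0, 2.0)     # holding costs; queue 0 is the lowest-cost queue
K, c = 2.0, 0.5    # fixed and per-customer moving costs
N = 12             # state-space truncation level per queue
beta = 0.95        # discount factor for the uniformized chain

Lam = sum(lam) + sum(mu)  # uniformization rate

def actions(x):
    """Enumerate (donor, k): move k customers out of queue `donor`; (0, 0) = no move."""
    yield (0, 0)
    for d in (0, 1):
        for k in range(1, x[d] + 1):
            yield (d, k)

def apply_action(x, a):
    """Return post-move state and moving cost (customers beyond N are truncated)."""
    d, k = a
    y = list(x)
    y[d] -= k
    y[1 - d] = min(y[1 - d] + k, N)
    return tuple(y), (K + c * k) if k else 0.0

def expect(y):
    """One-step expected value under uniformization (self-loops absorbed)."""
    ev = 0.0
    for i in (0, 1):
        up = list(y); up[i] = min(up[i] + 1, N)   # arrival to queue i
        ev += lam[i] * V[tuple(up)]
        dn = list(y); dn[i] = max(dn[i] - 1, 0)   # service completion at queue i
        ev += mu[i] * V[tuple(dn)]
    return ev / Lam

def qval(s, a):
    y, mcost = apply_action(s, a)
    return mcost + h[0] * y[0] + h[1] * y[1] + beta * expect(y)

states = list(itertools.product(range(N + 1), repeat=2))
V = {s: 0.0 for s in states}
for _ in range(300):  # synchronous value iteration
    V = {s: min(qval(s, a) for a in actions(s)) for s in states}

def best_action(s):
    """Greedy action with respect to the converged value function."""
    return min(actions(s), key=lambda a: qval(s, a))
```

Inspecting `best_action((0, x1))` as `x1` grows shows the kind of structure the abstract alludes to: no move for small queue contents, then moving customers toward the cheap queue once the expensive queue's content passes a level.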