Dynamic load balancing in parallel queuing systems: Stability and optimal control

Article ID: iaor20063679
Country: Netherlands
Volume: 168
Issue: 2
Start Page Number: 509
End Page Number: 519
Publication Date: Jan 2006
Journal: European Journal of Operational Research
Authors: ,
Keywords: Markov processes, programming: dynamic
Abstract:

We consider a system of parallel queues with dedicated arrival streams. At each decision epoch a decision-maker can move customers from one queue to another. The cost for moving customers consists of a fixed cost and a variable cost that is linear in the number of customers moved. There are also linear holding costs that may depend on the queue in which customers are stored. Under very mild assumptions, we develop stability (and instability) conditions for this system via a fluid model. Under the assumption of stability, we consider minimizing the long-run average cost. In the case of two servers, the optimal control policy is shown to prefer storing customers in the lower-cost queue. When the inter-arrival and service times are assumed to be exponential, we use a Markov decision process formulation to show that, for a fixed number of customers in the system, there exists a level S such that whenever customers are moved from the high-cost queue to the low-cost queue, the number moved brings the number of customers in the low-cost queue up to S. These results lead to the development of a heuristic for the model with more than two servers.
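The policy structure described in the abstract can be illustrated numerically. Below is a minimal sketch (not the authors' code) of relative value iteration for a truncated two-queue version of the model with Poisson arrivals and exponential services; all rates, costs, and the truncation level are illustrative assumptions, and queue 0 plays the role of the low-cost queue.

import numpy as np

lam = (0.4, 0.5)   # dedicated Poisson arrival rates (assumed values)
mu  = (1.0, 1.0)   # exponential service rates (assumed values)
h   = (1.0, 3.0)   # linear holding costs; queue 0 is the low-cost queue
K, c = 2.0, 0.5    # fixed cost K and per-customer cost c for a transfer
N   = 20           # truncation level for each queue (assumed)

Lambda = sum(lam) + sum(mu)   # uniformization constant

def transfer_cost(m):
    """Cost of moving m customers between the queues (fixed plus linear)."""
    return 0.0 if m == 0 else K + c * m

def bellman(x, v):
    """Uniformized average-cost Bellman operator at state x = (x0, x1).
    m > 0 moves m customers from the high-cost queue 1 to the low-cost queue 0."""
    best = np.inf
    for m in range(-x[0], x[1] + 1):
        y0, y1 = x[0] + m, x[1] - m
        if y0 > N or y1 > N:
            continue
        stage = transfer_cost(abs(m)) + h[0] * y0 + h[1] * y1
        # expected next value under uniformization; empty queues self-loop on service,
        # arrivals are blocked at the truncation boundary
        ev = (lam[0] * v[min(y0 + 1, N), y1]
              + lam[1] * v[y0, min(y1 + 1, N)]
              + mu[0] * v[max(y0 - 1, 0), y1]
              + mu[1] * v[y0, max(y1 - 1, 0)]) / Lambda
        best = min(best, stage + ev)
    return best

# relative value iteration: apply the Bellman operator and renormalize at state (0, 0)
v = np.zeros((N + 1, N + 1))
for _ in range(1000):
    w = np.array([[bellman((i, j), v) for j in range(N + 1)] for i in range(N + 1)])
    w -= w[0, 0]
    if np.max(np.abs(w - v)) < 1e-6:
        break
    v = w

Recording the minimizing transfer in states where the high-cost queue is non-empty should exhibit the structure stated in the abstract: for a fixed total number of customers, transfers bring the low-cost queue up to a common post-transfer level S.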
