Article ID: | iaor200935011 |
Country: | United States |
Volume: | 31 |
Issue: | 1 |
Start Page Number: | 50 |
End Page Number: | 84 |
Publication Date: | Feb 2006 |
Journal: | Mathematics of Operations Research |
Authors: | Niño-Mora, José |
Keywords: | queues: theory |
This paper presents a framework, grounded in convex optimization and economic ideas, for solving, via index policies, problems of optimal dynamic allocation of effort to a discrete-state (finite or countable), binary-action (work/rest) semi-Markov restless bandit project, elucidating issues raised by previous work. Its contributions include: (i) the concept of a restless bandit's marginal productivity index (MPI), characterizing optimal policies relative to general cost and work measures; (ii) the characterization of indexable restless bandits as those satisfying diminishing marginal returns to work, consistently with a nested family of threshold policies; (iii) sufficient indexability conditions via partial conservation laws (PCLs); (iv) the characterization of the MPI as an optimal marginal productivity rate relative to feasible active-state sets; (v) application to semi-Markov bandits under several criteria, including a new mixed average-bias criterion; and (vi) PCL-indexability analyses and MPIs for optimal service control of make-to-order/make-to-stock queues with convex holding costs, under discounted and average-bias criteria.
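To illustrate the MPI idea in (iv) for readers of this abstract, the sketch below is a hypothetical, minimal illustration (not the paper's algorithm): it assumes a finite-state, discounted restless bandit with active/passive transition matrices P1 and P0, one-period cost vectors c1 and c0, and discount factor beta, and builds candidate indices by an adaptive-greedy pass over active-state sets, ranking states by marginal cost reduction per unit of marginal work. All names and inputs are placeholders.

```python
# Hypothetical sketch: adaptive-greedy candidate MPIs for a finite-state,
# discounted restless bandit. P1/P0: transition matrices under work/rest;
# c1/c0: one-period costs; beta: discount factor. Illustrative assumptions only.
import numpy as np

def policy_measures(S, P1, P0, c1, c0, beta):
    """Cost measure f and work measure g of the active-set policy S:
    work exactly in the states of S, rest elsewhere."""
    n = len(c1)
    active = np.zeros(n, dtype=bool)
    active[list(S)] = True
    P = np.where(active[:, None], P1, P0)   # row i follows P1 if i is active
    c = np.where(active, c1, c0)            # cost of the chosen action
    w = active.astype(float)                # one unit of work when active
    A = np.eye(n) - beta * P
    f = np.linalg.solve(A, c)               # f = c + beta * P @ f
    g = np.linalg.solve(A, w)               # g = w + beta * P @ g
    return f, g

def adaptive_greedy_indices(P1, P0, c1, c0, beta):
    """Return {state: candidate index}, grown one state at a time by always
    adding the state with the largest marginal productivity rate."""
    n = len(c1)
    S, indices = set(), {}
    for _ in range(n):
        f, g = policy_measures(S, P1, P0, c1, c0, beta)
        best_i, best_rate = None, -np.inf
        for i in set(range(n)) - S:
            # Marginal cost reduction and marginal work from switching state i
            # to "work", holding the continuation policy S fixed.
            dc = (c0[i] + beta * P0[i] @ f) - (c1[i] + beta * P1[i] @ f)
            dw = (1.0 + beta * P1[i] @ g) - (beta * P0[i] @ g)
            rate = dc / dw                  # assumes dw > 0 (positive marginal work)
            if rate > best_rate:
                best_i, best_rate = i, rate
        indices[best_i] = best_rate
        S.add(best_i)
    return indices
```

Under indexability conditions of the kind the paper establishes via PCLs, such greedily computed rates would come out nonincreasing and coincide with the MPI; without those conditions the sketch is only a heuristic.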