Article ID: iaor20133653
Volume: 35
Issue: 3
Pages: 691–710
Publication Date: Jul 2013
Journal: OR Spectrum
Authors: Makis Viliam, Jiang Rui, Kim Michael
Keywords: Markov processes, control processes
In this paper, we propose a new model for availability maximization under partial observations for maintenance applications. In contrast to the widely studied cost minimization models, few structural results are known about the form of the optimal control policy for availability maximization models. We consider a failing system with unobservable operational states; only the failure state is observable. System deterioration is driven by an unobservable, continuous-time homogeneous Markov process. Multivariate condition monitoring data, which are stochastically related to the unobservable state of the system, are collected at equidistant sampling epochs and used to update the posterior state distribution for decision making. Preventive maintenance can be carried out at any sampling epoch, and corrective maintenance is carried out upon system failure. The objective is to determine the form of the optimal control policy that maximizes the long-run expected average availability per unit time. We formulate the problem as an optimal stopping problem with partial information. Under standard assumptions, we prove that a control limit policy is optimal. A computational algorithm is developed and illustrated by numerical results.
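The two building blocks the abstract describes, updating the posterior state distribution at each sampling epoch and applying a control-limit stopping rule, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes a two-state hidden deterioration chain (0 = healthy, 1 = warning) with the failure state observed directly, a one-dimensional Gaussian observation model, and invented numerical values for the transition matrix, observation parameters, and control limit.

```python
import math

# Illustrative transition matrix of the hidden deterioration chain between
# sampling epochs (assumed values; deterioration is one-directional).
P = [[0.90, 0.10],
     [0.00, 1.00]]

# Assumed Gaussian condition-monitoring signal: mean depends on hidden state.
MEANS, SD = (0.0, 2.0), 1.0

def likelihood(y):
    """Observation likelihood of reading y under each hidden state."""
    return [math.exp(-0.5 * ((y - m) / SD) ** 2) for m in MEANS]

def bayes_update(pi, y):
    """Posterior update at a sampling epoch: propagate the current belief pi
    through P, then reweight by the likelihood of the new observation y."""
    pred = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
    post = [p * q for p, q in zip(pred, likelihood(y))]
    z = sum(post)
    return [p / z for p in post]

def control_limit_policy(pi, threshold=0.5):
    """Stop for preventive maintenance once the posterior probability of the
    warning state crosses the control limit (threshold is illustrative)."""
    return "PM" if pi[1] >= threshold else "continue"

# Example run: belief starts in the healthy state and is updated at each
# sampling epoch with a simulated monitoring reading.
pi = [1.0, 0.0]
for y in [0.1, 0.4, 1.8, 2.2]:
    pi = bayes_update(pi, y)
    print(pi, control_limit_policy(pi))
```

The key structural point of the paper is that, under standard assumptions, a rule of exactly this threshold form is optimal; the paper's computational algorithm determines the optimal control limit, whereas the value used here is arbitrary.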