Article ID: iaor20134975
Volume: 10
Issue: 3
Start Page Number: 200
End Page Number: 224
Publication Date: Sep 2013
Journal: Decision Analysis
Authors: Alagoz Oguzhan, Chhatwal Jagpreet, Burnside Elizabeth S
Keywords: programming: markov decision
Mammography is the most effective screening tool for early diagnosis of breast cancer. Based on mammography findings, radiologists must choose one of three alternatives: (1) take immediate diagnostic action, including prompt biopsy, to confirm breast cancer; (2) recommend a short‐interval follow‐up mammogram; or (3) recommend routine annual mammography. There are no validated, structured guidelines based on a decision‐analytical framework to aid radiologists in making such patient‐management decisions. Strikingly, only 15–45% of breast biopsies and less than 1% of short‐interval follow‐up recommendations are found to be malignant, resulting in unnecessary tests and patient anxiety. We develop a finite‐horizon discrete‐time Markov decision process (MDP) model that may help radiologists make patient‐management decisions that maximize a patient's total expected quality‐adjusted life years (QALYs). We use clinical data to find the policies recommended by the MDP model and compare them to decisions made by radiologists at a large mammography practice. We also derive structural properties of the MDP model, including sufficiency conditions that ensure the existence of a double control‐limit‐type policy.
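To make the modeling approach concrete, the following is a minimal, self-contained sketch of how a finite-horizon discrete-time MDP of this kind can be solved by backward induction. It is purely illustrative: the three health states, the transition matrices, and the stylized QALY rewards below are invented for the example and are not the authors' clinical parameters or data.

```python
import numpy as np

def solve_finite_horizon_mdp(P, r, horizon, terminal):
    """Backward induction for a finite-horizon MDP.

    P[a]      : state-transition matrix under action a
    r[a]      : immediate reward vector (QALY increments) under action a
    terminal  : terminal reward vector at the final epoch
    Returns the value function V[t] and optimal policy pi[t] per epoch.
    """
    n_actions = len(P)
    n_states = len(terminal)
    V = np.zeros((horizon + 1, n_states))
    pi = np.zeros((horizon, n_states), dtype=int)
    V[horizon] = terminal
    for t in range(horizon - 1, -1, -1):
        # Q[a, s] = immediate reward + expected future value under action a
        Q = np.array([r[a] + P[a] @ V[t + 1] for a in range(n_actions)])
        pi[t] = Q.argmax(axis=0)
        V[t] = Q.max(axis=0)
    return V, pi

# Hypothetical 3-state example: 0 = low risk, 1 = elevated risk, 2 = cancer.
# Actions: 0 = routine annual mammography, 1 = short-interval follow-up,
# 2 = immediate biopsy. All numbers are made up for illustration.
P = [
    np.array([[0.95, 0.04, 0.01], [0.10, 0.80, 0.10], [0.0, 0.0, 1.0]]),  # routine
    np.array([[0.96, 0.03, 0.01], [0.15, 0.78, 0.07], [0.0, 0.0, 1.0]]),  # follow-up
    np.array([[0.97, 0.02, 0.01], [0.30, 0.65, 0.05], [0.0, 0.0, 1.0]]),  # biopsy
]
r = [
    np.array([1.00, 0.95, 0.40]),
    np.array([0.98, 0.94, 0.45]),
    np.array([0.90, 0.92, 0.60]),
]
terminal = np.array([10.0, 8.0, 2.0])

V, pi = solve_finite_horizon_mdp(P, r, horizon=20, terminal=terminal)
print(pi[0])  # recommended action in each state at the first decision epoch
```

In the paper's setting the state would instead encode the patient's cancer risk (e.g. a BI-RADS-style assessment), and a double control-limit policy means the optimal action switches twice as that risk increases: routine mammography below a lower threshold, follow-up between the two thresholds, and biopsy above the upper threshold.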