Article ID: | iaor2016756 |
Volume: | 5 |
Issue: | 1 |
Start Page Number: | 6 |
End Page Number: | 12 |
Publication Date: | Mar 2016 |
Journal: | Health Systems |
Authors: | Huynh Nathan, Snyder Rita, Vidal José, Cai Bo |
Keywords: | research, statistics: inference, statistics: empirical, quality & reliability, information theory |
Direct observation of complex health‐care processes typically involves multi‐observer recording of sequential process tasks. Observer inference, the key validity threat to multi‐observer recording, is controlled with observer training and assessment of the degree of recording consistency across observers. The gold standard for assessing recording consistency is the Kappa statistic, which assumes an exact task sequence match among observers. This assumption, however, is often difficult to meet with health‐care process observations, where task speed and complexity can result in uneven task sequence recording among observers. The edit distance approach, derived from string‐comparison methods in information theory, is not predicated on an exact task sequence match and offers an alternative to the Kappa statistic for assessing multi‐observer agreement. The paper uses simultaneously recorded process observations with uneven task sequences, made by three observers, to compare agreement results for the edit distance approach and the Kappa statistic. Strengths and limitations of the edit distance approach are discussed.
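To make the edit distance idea concrete, the following is a minimal sketch of comparing two observers' task sequences. The task codes, the observer recordings, and the agreement measure (1 minus the edit distance normalized by the longer sequence length) are all illustrative assumptions, not necessarily the paper's exact formulation; the distance itself is the standard Levenshtein edit distance.

```python
def edit_distance(a, b):
    """Levenshtein distance between two task sequences via
    dynamic programming: the minimum number of insertions,
    deletions, and substitutions turning a into b."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j  # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,      # deletion
                           dp[i][j - 1] + 1,      # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[m][n]

def agreement(a, b):
    """Hypothetical pairwise agreement: 1 - normalized edit distance.
    Unlike Kappa, this does not require the sequences to align
    position-by-position or even have the same length."""
    longest = max(len(a), len(b))
    return 1.0 if longest == 0 else 1 - edit_distance(a, b) / longest

# Hypothetical task-sequence recordings by two observers of the
# same process; observer 2 recorded two tasks in a different order.
obs1 = ["prep", "scan", "verify", "administer", "document"]
obs2 = ["prep", "scan", "administer", "verify", "document"]
print(edit_distance(obs1, obs2))  # 2 (one substitution pair from the swap)
print(agreement(obs1, obs2))      # 0.6
```

Because the metric tolerates insertions and deletions, an observer who misses a fast task is penalized only by one edit operation, rather than shifting every subsequent position out of alignment as a strict position-wise comparison would.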