Author(s): Brown SH, Speroff T, Fielstein EM, Bauer BA, Wahner-Roedler DL,
Abstract
OBJECTIVE: To evaluate an electronic quality (eQuality) assessment tool for dictated disability examination records.
METHODS: We applied automated concept-based indexing techniques to the quality screening of Department of Veterans Affairs spine disability examinations that had previously undergone gold-standard quality review by human experts using established quality indicators. We developed automated quality screening rules, refined them iteratively on a training set of disability examination reports, and then applied the resulting rules to a separate test set of spine disability examination reports. The initial data set comprised all electronically available examination reports (N=125,576) finalized by the Veterans Health Administration between July and September 2001.
RESULTS: Sensitivity was 91% for the training set and 87% for the test set (P=.02). Specificity was 74% for the training set and 71% for the test set (P=.44). Human performance was 4% to 6% higher than the eQuality tool in sensitivity (P<.001) and 13% to 16% higher in specificity (P<.001). In addition, the eQuality tool was equivalent or higher in sensitivity for 5 of 9 individual quality indicators.
CONCLUSION: The results demonstrate that a properly authored computer-based expert system can perform quality measurement as well as human reviewers for many quality indicators. Although automation will likely always rely on expert guidance to be accurate and meaningful, eQuality is an important new method to assist clinicians in their efforts to practice safe and effective medicine.
This article was published in Mayo Clin Proc and referenced in Journal of Health & Medical Informatics.
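The workflow the abstract describes — screening reports against expert-derived quality rules and scoring the screener's agreement with a human gold standard — can be illustrated with a minimal sketch. The indicator phrase, sample reports, and gold-standard labels below are hypothetical illustrations, not the actual eQuality rules or VA data; the real system used concept-based indexing rather than simple phrase matching.

```python
# Hedged sketch of rule-based quality screening scored against a
# gold standard. All rules and reports here are invented examples.

def screen_report(text, required_phrases):
    """Flag a report as deficient if any required concept is absent."""
    text = text.lower()
    return any(phrase not in text for phrase in required_phrases)

def sensitivity_specificity(predictions, gold):
    """predictions/gold: parallel lists of booleans, True = deficient."""
    tp = sum(p and g for p, g in zip(predictions, gold))
    tn = sum((not p) and (not g) for p, g in zip(predictions, gold))
    fp = sum(p and (not g) for p, g in zip(predictions, gold))
    fn = sum((not p) and g for p, g in zip(predictions, gold))
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return sensitivity, specificity

# Hypothetical quality indicator: range of motion must be documented.
rules = ["range of motion"]
reports = [
    "Spine exam: range of motion limited to 40 degrees flexion.",
    "Spine exam: patient reports chronic low back pain.",
]
gold = [False, True]  # expert review: second report is deficient
preds = [screen_report(r, rules) for r in reports]
sens, spec = sensitivity_specificity(preds, gold)
```

In the published study, the rules were authored and iteratively refined by experts on a training set before being frozen and applied to the test set, which is why training-set sensitivity and specificity exceed the test-set figures.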