Author(s): Dave Davis, Mary Ann Thomson O'Brien, Nick Freemantle, Fredric M Wolf, Paul Mazmanian
Context Although physicians report spending a considerable amount of time in continuing medical education (CME) activities, studies have shown a sizable difference between real and ideal performance, suggesting a lack of effect of formal CME.
Objective To review, collate, and interpret the effect of formal CME interventions on physician performance and health care outcomes.
Data Sources Sources included searches of the complete Research and Development Resource Base in Continuing Medical Education and the Specialised Register of the Cochrane Effective Practice and Organisation of Care Group, supplemented by searches of MEDLINE from 1993 to January 1999.
Study Selection Studies were included in the analyses if they were randomized controlled trials of formal didactic and/or interactive CME interventions (conferences, courses, rounds, meetings, symposia, lectures, and other formats) in which at least 50% of the participants were practicing physicians. Fourteen of 64 studies identified met these criteria and were included in the analyses. Articles were reviewed independently by 3 of the authors.
Data Extraction Determinations were made about the nature of the CME intervention (didactic, interactive, or mixed), its occurrence as a 1-time or sequenced event, and other information about its educational content and format. Two of 3 reviewers independently applied all inclusion/exclusion criteria. Data were then subjected to meta-analytic techniques.
Data Synthesis The 14 studies generated 17 interventions fitting our criteria. Nine generated positive changes in professional practice, and 3 of 4 interventions altered health care outcomes in 1 or more measures. In 7 studies, sufficient data were available for effect sizes to be calculated; overall, no significant effect of these educational methods was detected (standardized effect size, 0.34; 95% confidence interval [CI], −0.22 to 0.97). However, interactive and mixed educational sessions were associated with a significant effect on practice (standardized effect size, 0.67; 95% CI, 0.01 to 1.45).
Conclusions Our data show some evidence that interactive CME sessions that enhance participant activity and provide the opportunity to practice skills can effect change in professional practice and, on occasion, health care outcomes. Based on a small number of well-conducted trials, didactic sessions do not appear to be effective in changing physician performance.