ISSN: 2155-6148
Journal of Anesthesia & Clinical Research


Reference Guidelines Improve Residents’ Ability to Order Appropriate Preoperative Tests in Standardized Case Scenarios

Andrew Goldberg1*, Daniel Katz2, Hung-Mo Lin3 and Samuel DeMaria Jr4

1,4Department of Anesthesiology, Mount Sinai Medical Center, New York, USA

2Medical Student, Mount Sinai School of Medicine, New York, USA

3Associate Professor, Mount Sinai School of Medicine, New York, USA

*Corresponding Author:
Andrew Goldberg
Department of Anesthesiology
Mount Sinai Medical Center
One Gustave L. Levy Place
Box 406, New York, NY 10029, USA
Tel: (212)241-7475
Fax: (212)426-2009
E-mail: [email protected]

Received Date: August 12, 2013; Accepted Date: September 04, 2013; Published Date: September 06, 2013

Citation: Goldberg A, Katz D, Lin HM, Jr SD (2013) Reference Guidelines Improve Residents’ Ability to Order Appropriate Preoperative Tests in Standardized Case Scenarios. J Anesth Clin Res 4:353. doi: 10.4172/2155-6148.1000353

Copyright: © 2013 Goldberg A, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Abstract

Background: Preoperative testing for surgery is estimated to cost $30 billion annually. The goal of this study was to determine the influence of access to a guideline reference on the appropriateness of preoperative test ordering by resident physicians in simulated case scenarios.

Methods: At a single teaching hospital, 80 PGY (Post Medical School Graduation Year) 2-5 residents from anesthesiology, surgery, internal medicine, and obstetrics/gynecology were recruited to review simulated case scenarios. Participants within each specialty were randomized with half receiving supplemental ASA preoperative testing guidelines during completion of the questionnaire. Participants indicated which preoperative tests they believed appropriate for each scenario. Correct responses were set by an expert panel and results were reported as relative probabilities and 95% CI.

Results: 66 surveys were analyzed. In the entire cohort, the group receiving supplemental guidelines achieved a greater percentage of correct answers (mean 84.2%) than the group without guidelines (mean 78.6%) (relprob=1.07 [CI 1.01-1.12], p=0.011); this benefit of the guideline held across specialties and experience levels. Without a guideline, correct answer rates were greater for anesthesia vs surgery residents (1.19 [1.08, 1.31]) and for anesthesia vs internal medicine residents (1.16 [1.04, 1.31]); with guidelines, these differences were maintained. Without a guideline, significant differences were noted between PGY 3 and PGY 2 residents (1.12 [1.03, 1.23]) and between PGY 4 and PGY 2 residents (1.11 [1.03, 1.20]), but these differences were not present with guidelines. Surgery residents did not improve with the guideline.

Conclusions: In a set of simulated clinical scenarios, reference to ASA-adapted guidelines improved test ordering by the majority of resident physicians. While anesthesia residents performed better than others independent of the guideline, the guideline negated the effect of experience in non-anesthesia trainees. Given the financial burden of inappropriate preoperative test ordering, further validation of the benefits of guideline implementation is warranted.

Introduction

The evolution of surgical practice over the last two decades has altered the practice of preoperative evaluation such that preoperative assessments occur in multiple, completely different settings. Consequently, many patients are not evaluated by an attending anesthesiologist at all prior to surgery. It has been estimated that greater than 50% of patients do not visit a preadmission testing center, even though these centers have been shown to be efficient and cost-effective [1-9]. Preoperative testing for surgery is estimated to account for approximately $30 billion in health care costs annually in the US, and a majority of these tests may be unnecessary [10-12]. Therefore, deficiencies in knowledge of test ordering have major financial implications. While it is impractical to evaluate all patients in dedicated preoperative centers, there remains a need to curtail the ordering of unnecessary and expensive tests in the preoperative period [13].

In teaching institutions, patients are frequently evaluated preoperatively by resident physicians training in internal medicine, surgery, obstetrics/gynecology, or anesthesiology. Given the varying knowledge domains and experience levels of resident physicians, it has been demonstrated that disparities exist in residents’ general knowledge of preoperative testing, and of specific guidelines, such as that published by the American Society of Anesthesiologists (ASA) [14-16]. Thus, strategies to improve the quality of preoperative evaluation by resident physicians should be developed.

The goal of this preliminary investigation was to measure the effect of access to a preoperative testing guideline on preoperative test and consultation ordering patterns of resident physicians in simulated clinical scenarios developed by a panel of senior attending anesthesiologists. We further sought to investigate the influence of varying specialties and levels of experience by resident physicians.

Materials and Methods

Design

The trial was a prospective, randomized study conducted over a one-month period at The Mount Sinai Medical Center. Following Institutional Review Board approval with a waiver of signed consent, all eighty resident physicians from the Departments of Anesthesiology, Surgery, Internal Medicine, and Obstetrics and Gynecology (Ob/Gyn) at The Mount Sinai Medical Center were recruited to participate voluntarily. The study was limited to resident physicians in the PGY (Post Medical School Graduation Year) 2 through PGY 5 years of training. PGY 1 residents were excluded because, at Mount Sinai, this is the year in which residents learn proper preoperative evaluation; all senior residents (PGY 2-5) were therefore expected to be familiar with the proper preoperative evaluation and work-up. The eighty residents were enrolled through voluntary participation at their respective departmental grand rounds.

The questionnaire containing the six case scenarios was distributed to residents of the four specialties during grand rounds. Each participant received the same questionnaire, consisting of the six unique case scenarios, each followed by the same ten testing items from which to choose in order to prepare the hypothetical patient for surgery. Using an electronic random number generator, one author (SDM) assigned half of the study participants from each specialty to receive a preoperative testing guideline with the questionnaire (Figure 1). Participants were instructed to use this reference (if provided) to help with their answer choices; those without the reference were instructed to base their responses on their existing medical knowledge.
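
To illustrate the stratified 1:1 assignment described above, a minimal sketch in Python follows. The rosters, resident identifiers, and the use of Python's random module are illustrative assumptions only; the study used an electronic random number generator rather than this code.

import random

# Hypothetical roster: specialty -> list of resident identifiers (illustrative only)
residents_by_specialty = {
    "Anesthesiology": [f"A{i}" for i in range(1, 16)],
    "Surgery": [f"S{i}" for i in range(1, 13)],
    "Internal Medicine": [f"M{i}" for i in range(1, 24)],
    "Ob/Gyn": [f"G{i}" for i in range(1, 17)],
}

assignment = {}  # resident id -> True if given the preoperative testing guideline
for specialty, roster in residents_by_specialty.items():
    shuffled = roster[:]
    random.shuffle(shuffled)            # random order within the specialty stratum
    half = len(shuffled) // 2
    for resident in shuffled[:half]:
        assignment[resident] = True     # receives the guideline with the questionnaire
    for resident in shuffled[half:]:
        assignment[resident] = False    # receives the questionnaire only

print(sum(assignment.values()), "of", len(assignment), "residents assigned a guideline")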

Figure 1: Preoperative testing guide.

Questionnaire development

Six simulated case scenarios were written by one of the authors (SDM) to encompass a spectrum of healthy through chronically ill patients (ASA Physical Status 1-4) scheduled to undergo procedures of variable risk (from colonoscopy through major vascular surgery). The scenarios were designed to test knowledge of test-ordering strategies consistent with the ASA Taskforce on Pre-anesthesia Evaluation recommendations regarding laboratory evaluations, electrocardiograms, and chest radiographs [17]. A few of the guidelines were adapted to incorporate institutional policies. For example, the adapted guideline requires an ECG for patients older than 50 years, which the ASA does not (although the ASA does recommend an ECG for patients with cardiac risk factors, and such risk factors are likely more prevalent in elderly patients). Likewise, the pregnancy testing criterion was changed from patients in whom care may be altered to patients in whom pregnancy may complicate surgery. This adaptation was made to accommodate all of the subspecialties now providing preoperative care; the authors reasoned that those specialists might not appreciate how care may be altered from an anesthesiology perspective, so the wording was changed to cue them to the importance of the test. Medical specialist consultations were also included as potential answers.

A panel of five associate or full professor academic anesthesiologists, each with at least ten years of clinical experience, reviewed the case scenarios. The panel defined correct testing choices for each of the scenarios as per the ASA guideline and defined the appropriateness of available medical consultations based on their opinion of what a reasonable clinician would do. An answer was considered correct if the majority considered it reasonable for the hypothetical patient in question. In all circumstances, at least four of five panelists concurred on the correct choices for each scenario. The clinical scenarios and correct answers are detailed in the Appendix.

Statistical analysis

Each preoperative test was scored as appropriately ordered (true positive), inappropriately ordered (false positive), appropriately omitted (true negative), or inappropriately omitted (false negative), as interpreted by the anesthesia panel. Five types of rates were calculated: the correct answer rate (true positives plus true negatives), the true positive rate, the true negative rate, the false positive rate, and the false negative rate. The rates were examined overall, by specialty, and by PGY level. Multiple linear or Poisson regression analyses were performed to determine the influence of the main participant variables (experience, specialty, and guideline reference availability) on the five types of rates. Pairwise two-way interactions and the three-way interaction were tested. A p value <0.05 was considered statistically significant. The GENMOD procedure in SAS v9.1 (SAS Inc., Cary, NC) was used to calculate relative probabilities. Relative probabilities (relprob) are reported as estimates with 95% confidence intervals.
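
For concreteness, the sketch below shows how one participant's selections for a single scenario could be scored against the panel's answer key and converted into the five rates. The test names, data layout, and the use of the ten-item menu as the denominator are assumptions made for illustration; the study's actual analysis was performed with the GENMOD procedure in SAS.

def score_scenario(selected, correct, all_tests):
    """Score one scenario: selected/correct are sets of test names; all_tests is the full ten-item menu."""
    tp = len(selected & correct)                # appropriately ordered
    fp = len(selected - correct)                # inappropriately ordered
    fn = len(correct - selected)                # inappropriately omitted
    tn = len(all_tests - (selected | correct))  # appropriately omitted
    n = len(all_tests)
    return {
        "correct_answer_rate": (tp + tn) / n,
        "true_positive_rate": tp / n,
        "true_negative_rate": tn / n,
        "false_positive_rate": fp / n,
        "false_negative_rate": fn / n,
    }

# Hypothetical example for one scenario with a ten-item test menu
all_tests = {"CBC", "BMP", "Coagulation profile", "LFTs", "ECG", "Chest radiograph",
             "Urinalysis", "Pregnancy test", "Type and screen", "Cardiology consult"}
panel_key = {"ECG", "BMP", "Type and screen"}          # panel-defined correct choices
resident_choices = {"ECG", "CBC", "Type and screen"}   # one resident's selections
print(score_scenario(resident_choices, panel_key, all_tests))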

Results

Of the 80 residents who received a questionnaire, 100% completed the clinical cases and returned it to the study group. However, 14 (17.5%) did not fill in the demographics and were therefore excluded from the final analysis. The data analysis was based on responses from the remaining 66 participants, whose demographic data are presented in Table 1. Table 2 shows the overall correct answer rates as averages across the six scenarios, and the relative probability of having a correct answer, by ASA guide availability, specialty, and years of experience.

Specialty PGY 2 PGY 3 PGY 4 PGY 5 Total
Anesthesiology 0 9 6 0 15
Surgery 3 2 5 2 12
Internal Medicine 12 10 1 0 23
OB/GYN 9 4 3 0 16
Total 24 25 15 2 66

Table 1: Participants by specialty and PGY level.

                                                                 Comparison of correct answer rate
Group                 Subgroup             Overall Correct Answers   Groups Compared     Rel Prob*   95% CI         p-value
Guide                 Without              78.6%                     --                  --          --             --
                      With                 84.2%                     With vs. without    1.07        (1.02, 1.12)   0.011
Specialty             Surgery              73.8%                     Anes vs Surg        1.22        (1.14, 1.30)   <.001
                      Ob/Gyn               84.1%                     Anes vs ObGyn       1.07        (0.99, 1.16)   0.082
                      Internal Medicine    78.9%                     Anes vs IntMed      1.14        (1.06, 1.22)   <.001
                      Anesthesiology       90.0%                     ObGyn vs Surg       1.14        (1.04, 1.26)   0.009
                                                                     IntMed vs Surg      1.07        (0.98, 1.17)   0.154
                                                                     IntMed vs ObGyn     0.94        (0.87, 1.01)   0.093
Level of Experience   PGY 2                79.8%                     PGY 3 vs 2          1.02        (0.96, 1.09)   0.532
                      PGY 3                81.5%                     PGY 4+ vs 2         1.04        (0.98, 1.11)   0.201
                      PGY 4+               83.1%                     PGY 4+ vs 3         1.02        (0.96, 1.08)   0.496

Table 2: Overall probability of correct answer rates by guide, specialty and PGY level.

Overall, correct answer rates were approximately 7% greater (in relative terms) for those with guidelines than for those without. Among respondents not receiving guidelines, anesthesia residents answered correctly more often than surgery or internal medicine residents, and PGY 2 residents answered correctly less often than PGY 3-5 residents. Among those with guidelines, residents in anesthesiology answered correctly more often than those in either surgery or internal medicine, and Ob/Gyn and internal medicine residents answered correctly more often than did surgery residents.

When experience or specialty groups were compared based on guideline availability, several other findings became apparent. Among residents without a guideline, significant differences in correct answer rate probability were noted between PGY 3 and PGY 2 residents (relprob=1.12 [1.03, 1.23], p<.001) and between PGY 4 and PGY 2 residents (relprob=1.11 [1.03, 1.20], p<.001). These differences were not present when residents of different experience levels received guidelines (p=0.1931 and 0.5377 for PGY 3 versus PGY 2 and PGY 4 versus PGY 2, respectively). Among those without a guideline, correct answer rates were greater for anesthesia versus surgery residents (relprob=1.19 [1.08, 1.31], p<.001) and anesthesia versus internal medicine residents (relprob=1.16 [1.04, 1.31], p=0.010). Among those with guidelines, statistically significant differences were maintained between anesthesia versus surgery residents (relprob=1.29 [1.18, 1.40], p<.001) and anesthesia versus internal medicine residents (relprob=1.15 [1.05, 1.25], p=0.002). New differences were found between internal medicine versus surgery residents (relprob=1.12 [1.01, 1.25], p=0.028) and Ob/Gyn versus surgery residents (relprob=1.21 [1.07, 1.37], p=0.002).

Table 3 shows the overall correct answer rates for each group, stratified by guideline availability. When a guide was available, absolute correct answer rates tended to improve by approximately 6% in all specialties except surgery. The rates also improved for PGY 2 residents, but not for PGY 3-5 residents (p=0.031 for the interaction between years of experience and guideline availability). Modeling the calculated rates (true positive, true negative, false positive, false negative) by department or PGY level with the presence or absence of a guide revealed differences between more experienced and less experienced residents for the false positive rate (i.e., ordered but unnecessary) (p=0.044) (Table 4).

Group                 Subgroup             Without Guidelines   With Guidelines   p-value
Specialty             Surgery              73.7%                72.6%             0.816
                      Anesthesiology       87.7%                93.4%             0.099
                      Internal Medicine    75.4%                81.5%             0.164
                      Ob/Gyn               80.1%                87.8%             0.078
Level of Experience   PGY 2                73.4%                85.7%             <.001
                      PGY 3                82.5%                81.2%             0.657
                      PGY 4+               81.5%                83.4%             0.569

Table 3: Overall correct answer rate, stratified by Guide.

                  Without guidelines                              With guidelines
Rate              Department (pr>chisq)   PGY level (pr>chisq)    Department (pr>chisq)   PGY level (pr>chisq)
True Positive     0.069                   0.092                   0.338                   0.305
True Negative     0.079                   0.056                   0.081                   0.957
False Positive    0.052                   0.044                   0.098                   0.897
False Negative    0.065                   0.093                   0.262                   0.160

Table 4: Modeling true positive, false positive, true negative and false negative rate probabilities across all scenarios by specialty and PGY level with or without the use of a guide.

Discussion

The results of the current investigation demonstrated that an inexpensive reference guide improved preoperative test ordering practices. Interestingly, the influence of inexperience (lower PGY level), which was statistically significant among those without a guide, appeared to be eliminated by the use of a guide. This suggests that having an easy-to-use reference enhanced less experienced practitioners' performance.

While differences in the four measured rates (true positive, true negative, false positive, false negative) among residents of different specialties or experience levels were not statistically significant (except for the false positive rate of less experienced residents without a guideline), the trends we observed in these data are worth noting. For those residents without a guideline, specialty and experience level were associated with probabilities approaching but not achieving significance for all four rates. When given a guideline, these probabilities were much further from statistical significance, suggesting that guideline use may neutralize the influence of specialty and/or experience on the preoperative ordering practices of resident physicians.

The major limitation of this study is the relatively small sample size obtained from a single teaching institution. Though statistical significance was reached for many of our measures, the practices of 66 residents are not necessarily generalizable to national resident physician preoperative testing practices. Another limitation of this study is that it is based on the results of hypothetical, paper-based scenarios, which may not translate directly to clinical practice. The fact that a panel of anesthesiologists determined “correct” answers also introduces subjectivity into the calculation of answer rates, even if these practitioners used evidence-based medicine and best practices, and nearly always achieved 100% consensus. Additionally, we could not determine whether those given a guideline actually used it to determine their answers. This could explain why the guideline appeared to have no positive effect in surgery residents (in contrast with the other specialties).

Our results are consistent with previous work demonstrating that preoperative testing patterns vary among physicians of different specialties and levels of experience [12,13]. Roizen stated that a condition considered optimal for daily life may not be optimal for surgery [18]. This, however, does not mean that all patients need the same number or types of tests. According to one study of patients over 70 years of age, "routine" preoperative testing was of little benefit [17]. Guidelines may also be of benefit in private offices and specialty clinics that regularly order "batteries" of tests for preoperative care, as has been shown through the distribution of guidelines to non-anesthesiologists [19].

In light of the significant results of the current preliminary study and the economic impact of appropriate test ordering, further investigation is warranted to replicate these findings in larger cohorts and in actual clinical practice. The national trends towards implementation of electronic health records provide an opportunity to implement evidence-based guidelines. One such effort could be directed towards improving physicians’ preoperative test ordering patterns in a fashion that enhances quality and is economically responsible.

Acknowledgements

The authors wish to acknowledge the residents who volunteered their time to participate in this study and the expert panel of anesthesiologists at Mount Sinai Medical Center: Jeffrey Silverstein, MD; Andrew Leibowitz, MD; Meg Rosenblatt, MD; Adam Levine, MD; and Francine Yudkowitz, MD.

Author Attestations

Andrew Goldberg contributed substantially to all aspects of this manuscript, including conception and design; acquisition, analysis, and interpretation of data and drafting the article. Daniel Katz contributed substantially to all aspects of this manuscript, including conception and design; acquisition, analysis, and interpretation of data and drafting the article. Hung-Mo Lin contributed substantially to the analysis and interpretation of data.

Samuel DeMaria contributed substantially to all aspects of this manuscript, including conception and design; acquisition, analysis, and interpretation of data and drafting the article. All authors agree to publish the data reported in this article.

References
