The Implementation Fidelity Tracker: Development and Dissemination of an Audit-Feedback Tool to Evaluate Implementation-Based Healthcare Efforts

Henry D Anaya1-3*

1Veterans Affairs (VA) Quality Enhancement Research Initiative for HIV and Hepatitis (QUERI-HIV/HEP) and Center for the Study of Healthcare Provider Behavior, VA Greater Los Angeles Health Services Research and Development (HSR&D) Center of Excellence, VA Greater Los Angeles Healthcare System, Los Angeles, CA, USA

2UCLA David Geffen School of Medicine, Division of General Internal Medicine; Los Angeles, CA, USA

3Center for the Management of Complex Chronic Conditions (CM3) and QUERI Spinal Cord Injury (SCI), Chicago, IL, USA

*Corresponding Author:
Henry D Anaya
PhD (Contact for article reprints)
UCLA David Geffen School of Medicine
Division of General Internal Medicine
1301 Wilshire Blvd. 111G Los Angeles CA 90073, USA
Tel: (310) 478-3711 X-48488
Fax: (310) 268-4933
E-mail: [email protected]

Received date: March 24, 2015; Accepted date: May 15, 2015; Published date: May 26, 2015

Citation: Anaya HD (2015) The Implementation Fidelity Tracker: Development and Dissemination of an Audit-Feedback Tool to Evaluate Implementation-Based Healthcare Efforts. J AIDS Clin Res 6:462. doi:10.4172/2155-6113.1000462

Copyright: © 2015 Anaya HD. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


Abstract

Real-time evaluation of implementation activities allows the intervention to be tailored for the best chance at success. Evaluation also acts as an effective audit-feedback mechanism, highlighting barriers and facilitators of the implementation for field staff and key stakeholders, and serves as a measure of fidelity to the implementation effort itself. The development and use of an implementation fidelity tracker is discussed. This type of implementation tool has widespread implications for the evaluation of specific activities pertaining to implementation efforts; its simplicity and versatility allow for use in a variety of domains.

Keywords

Evaluation; Implementation; Fidelity measure

Background

The successful implementation of any initiative is the culmination of a series of smaller, progressive steps toward a goal. The ability to evaluate, in real time, the discrete steps that make up an overall implementation plan is therefore an integral asset to accomplishing the intervention in question. The process of developing and disseminating a healthcare quality improvement tool of this sort is the focus of this paper.

Quality improvement

The International Organization for Standardization (ISO) defines quality improvement as the actions taken throughout an organization to increase the effectiveness of activities and processes in order to provide added benefits to both the organization and its customers [1]. Put simply, quality improvement is anything that causes a beneficial change in performance. Healthcare quality improvement, then, consists of activities that cause beneficial changes in healthcare performance, whether at the organizational level (e.g., through policy changes) or at the staff level (e.g., improvements in workflow).

Audit-feedback

With regard to quality improvement, audit-feedback refers to the process by which information is generated and conveyed back to a study team or research group so that the team can use this information to adjust accordingly [2,3]. The notion behind audit-feedback is that the research team will periodically feed project-specific information back to those charged with implementing a given initiative. The research team can then determine whether something is being implemented as intended, whether requested policy changes happened in a timely manner, and so on. Through audit-feedback, the study team can review progress to date and make adjustments accordingly. Audit and feedback generally leads to small but potentially important improvements in professional practice; its effectiveness depends on baseline performance and on how the feedback is provided [2] (Figure 1).


Figure 1: Visualization of Audit-Feedback loop.
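
To make the loop in Figure 1 concrete, the minimal sketch below (in Python) walks through one possible audit-feedback cycle: audit data are collected, compared against targets, and fed back so the team can adjust before the next round. All function and metric names here (e.g., collect_audit_data, tests_offered) are illustrative assumptions, not part of the study's actual tooling.

```python
# Hypothetical sketch of an audit-feedback cycle; names are illustrative only.

def collect_audit_data(site):
    """Audit step: gather implementation data for a site, e.g., from chart
    review, staff interviews, or electronic medical record queries."""
    return {"tests_offered": 120, "tests_completed": 95, "nurses_trained": 8}


def deliver_feedback(site, audit_data, targets):
    """Feedback step: compare audited values against targets and report the
    gaps back to the project team and local stakeholders."""
    for measure, target in targets.items():
        actual = audit_data.get(measure, 0)
        status = "on track" if actual >= target else "needs adjustment"
        print(f"{site} | {measure}: {actual} of {target} ({status})")


def audit_feedback_cycle(site, targets, rounds=4):
    """Run repeated audit-feedback rounds (e.g., quarterly); between rounds
    the team would revise trainings, policies, or workflow as needed."""
    for round_number in range(1, rounds + 1):
        print(f"--- Audit round {round_number} for {site} ---")
        audit_data = collect_audit_data(site)
        deliver_feedback(site, audit_data, targets)


audit_feedback_cycle("Site 1", {"tests_offered": 150, "nurses_trained": 10})
```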

United States Department of Veterans Affairs healthcare system

The United States has a comprehensive system of healthcare for Veterans. The United States Department of Veterans Affairs healthcare system (VHA) has grown from 54 hospitals in 1930 to 171 medical centers nationwide, with more than 350 outpatient, community, and outreach clinics, 126 nursing home care units, and 35 domiciliaries.

VA QUERI-HIV-Hepatitis

In 1998, the VA created the Quality Enhancement Research Initiative (QUERI), in an attempt to overcome the long delays in integrating research evidence into routine practice. Ten QUERI groups each focus on a different disease or condition selected because of high prevalence or high burden among Veterans, their families, and the VA health care system [4]. The official mission of the HIV-Hepatitis QUERI is to make evidence-based HIV care more accessible, optimize the application of evidence-based HIV therapies, and improve the delivery of collaborative and comprehensive treatment of co-morbid conditions in order to ensure better health for Veterans who live with HIV.

Implementation of HIV rapid testing in VA primary care clinics: The development and use of an implementation tracker tool

As part of our efforts to expand HIV testing within the VA healthcare system, we recently completed a multi-year routine HIV rapid testing effort at two VA primary care clinics with known, high HIV seroprevalence among their respective patient populations. We chose the VA as a model for integrated systems more generally, and because the electronic medical record facilitated evaluation. Moreover, previous studies have shown HIV positivity rates in VA samples to exceed those of the general medical population [5].

Our main implementation challenge was how best to integrate routine HIV testing into primary care, primarily because testing rates in these settings are low [6,7]. The purpose of the study was to evaluate a wider implementation in two primary care (PC) clinics and to assess implementation facilitators, barriers, and overall success. One of the main evaluation tools employed was what we term the Implementation Tracker.

The implementation tracker

An integral part of any quality improvement and/or implementation effort is the ability to adequately gauge fidelity to that effort. As part of our activities with the VA QUERI HIV-Hepatitis, we devised and utilized just such an audit-feedback tool, which we termed the Implementation Tracker. For our purposes, the tracker was employed to assess the process of implementing various HIV testing efforts throughout the VA healthcare system.

Through our numerous HIV testing implementation efforts in the VA, we are aware that 'implementation' (as defined by our previous experience) will necessarily look different at each site, depending on local conditions. It is incumbent on the study/implementation team to assess implementation fidelity in as general a way as feasible. Therefore, when conceptualizing this diagnostic tool, we devised a simple yet effective approach that would allow implementation-based efforts to be evaluated in real time.

Our conceptualization, therefore, consisted of a Likert-scale measure with three response levels:

• Fully implemented

• Moderately implemented

• Not implemented

Although the content of the tracker will necessarily differ depending on the intervention and the outcomes to be evaluated, for our intervention the tracker tool evaluated measures such as the following, each pertaining to launching HIV testing:

• Convening of local leaders/staff;

• Evaluation of local HIV policies;

• Local staff engagement with implementation plan;

• Consistency of local HIV policy with our HIV testing model;

• Effectiveness of Champion/Change Agent role.

We scored each element for evidence of full, partial, or non-implementation. Elements were scored based on the duties associated with completing the aforementioned measures (e.g., have local staff been identified and initial briefing meetings convened? Are project nurses offering rapid testing on a routine basis, on a partial basis, or not at all?). These questions were answered through monitoring of these duties by either project staff or our site champions (Table 1).

Implementation Marker (each element rated on the Likert-scale implementation measure: Fully implemented / Moderately implemented / Not implemented)

Convening local leaders
• Introductory project call with PI/local stakeholders to assess barriers and coordinate meeting(s) between NPS and local stakeholders
• Meeting(s) with local nurse manager to brief on project aims
• Meeting(s) with local chief of ID to brief on project aims
• Meeting(s) with local chief of laboratory service to brief on project aims

Nurse engagement
• In-person meeting with local nurses to brief on project aims
• Participating nurses identified and trained on NRT procedures
• Quarterly audit/feedback to managers/providers

Local HIV policy issues
• HIV policy changed/revised to allow local nurses to administer HIV rapid tests
• Costs of HIV rapid tests absorbed by the laboratory
• Rapid tests readily available for use by nurses
• Consent forms available

IRM support
• Initial calls to local IRM chief to brief on project aims
• Distribution of HIV testing template software
• HIV template mapped, loaded, and activated

Effectiveness of Champion/Change Agent role
• Ability to convene introductory project call with PI/local stakeholders to assess barriers and coordinate meeting(s) between NPS and local stakeholders
• Ability to convene in-person meeting(s) with local nurse manager to brief on project aims
• Ability to convene meeting(s) with local chief of ID to brief on project aims
• Ability to convene meeting(s) with local chief of laboratory service to brief on project aims

Table 1: The Implementation Tracker.
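
As a concrete illustration of how the tracker's structure could be represented, the hedged sketch below models each implementation marker from Table 1 as a set of elements scored on the three-level scale. The class and field names (FidelityScore, TrackerElement, ImplementationMarker) are our own illustrative assumptions and do not come from the original instrument.

```python
from dataclasses import dataclass, field
from enum import Enum


class FidelityScore(Enum):
    """Three-level Likert-style rating used by the tracker."""
    NOT_IMPLEMENTED = 0
    MODERATELY_IMPLEMENTED = 1
    FULLY_IMPLEMENTED = 2


@dataclass
class TrackerElement:
    """A single measurable duty, e.g., a briefing meeting convened."""
    description: str
    score: FidelityScore = FidelityScore.NOT_IMPLEMENTED


@dataclass
class ImplementationMarker:
    """A tracker category such as 'Convening local leaders' or 'Nurse engagement'."""
    name: str
    elements: list = field(default_factory=list)

    def summary(self):
        """Count how many elements sit at each fidelity level, for audit-feedback reporting."""
        counts = {score: 0 for score in FidelityScore}
        for element in self.elements:
            counts[element.score] += 1
        return counts


# Example: one marker from Table 1, scored during a monitoring visit.
convening = ImplementationMarker(
    name="Convening local leaders",
    elements=[
        TrackerElement("Introductory project call with PI and local stakeholders",
                       FidelityScore.FULLY_IMPLEMENTED),
        TrackerElement("Briefing meeting with local nurse manager",
                       FidelityScore.MODERATELY_IMPLEMENTED),
        TrackerElement("Briefing meeting with local chief of laboratory service"),
    ],
)
print(convening.summary())
```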

This allowed project staff to remain flexible, making any changes to the implementation strategy (e.g., reinforcement trainings, in-services) that needed to occur based on early findings of partial or non-implementation gleaned from the tracker tool.

This type of audit-feedback evaluation is intended to gauge how implementation is proceeding, so that barriers are identified early and staff can work toward resolving them. The overall focus of the tracker tool is the extent to which there is fidelity to the implementation plan.

The design of the study was a pre-post quasi-experiment. We chose two study sites in regions with high HIV seroprevalence and with similar numbers of annual unique patient visits. Both sites were located at large urban VA hospitals, one in the Northeast and one in the Southwest. Sites were provided with identical implementation packages but were encouraged to adapt the package to their local needs.

As part of our initial formative work preparing for implementation of the program, we administered formative key informant surveys to ascertain barriers and facilitators to the implementation and sustainability of HIV testing. Using data obtained from these surveys of staff and facility management, we derived salient elements to populate the implementation tracker (Table 1), which was then used as an audit-feedback mechanism for both research and local staff to gauge implementation fidelity [8]. Where elements scored low on implementation fidelity, we were able, as intended, to make the necessary adjustments in almost real time to increase the likelihood of a successful undertaking.
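
One simple way to picture that adjustment step, building on the hypothetical classes sketched after Table 1, is to flag every element rated below "fully implemented" so the team can target it before the next audit round; the threshold and the printed follow-up below are illustrative assumptions rather than the study's actual procedure.

```python
def flag_low_fidelity(markers):
    """Return the elements that need corrective action, i.e., anything
    scored below FULLY_IMPLEMENTED on the tracker."""
    flagged = []
    for marker in markers:
        for element in marker.elements:
            if element.score is not FidelityScore.FULLY_IMPLEMENTED:
                flagged.append((marker.name, element.description, element.score))
    return flagged


# Fed back to project staff and site champions, e.g., on a quarterly cycle:
for marker_name, description, score in flag_low_fidelity([convening]):
    print(f"Adjust: [{marker_name}] {description} (currently {score.name})")
```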

Implementation of a nurse-initiated rapid HIV testing initiative resulted in significant increases in the number of PC patients receiving HIV testing, thereby contributing to the VA's initiative to increase routine HIV testing for all Veterans [8]. In addition, at site 1 we identified 5 previously undetected HIV-positive Veterans during the study period; at site 2 we identified 9 HIV-positive Veterans over the same period.

Conclusion

The development and use of a simple tool to evaluate fidelity to an implementation effort is critical, both for evaluating that effort and for using the resulting data to revise activities accordingly, ensuring the best chance at a successful outcome.

In conceptualizing the use and revision of this tracker tool for their own implementation-based purposes, investigators and staff should give careful thought to their choice of elements, drawing on salient concepts identified through a series of formative key informant interviews with staff and/or management prior to the commencement of any implementation effort.

We have developed and employed this tracker tool successfully at both the beginning of and throughout a variety of HIV testing interventions to assess implementation fidelity to our HIV testing package [8,9].

The specific HIV testing initiative highlighted in this case study was sustained by both study sites and has now become the standard of care at both facilities. The success of this HIV testing campaign was based, in no small measure, on the ability of project staff to evaluate fidelity to the implementation effort through the use of the implementation tracker elements and tool.

Finally, future studies that focus on audit and feedback as a method of evaluating implementation efforts should directly compare different ways of providing feedback, in order to identify the most appropriate methods for conveying information back to project and implementation staff.

Acknowledgements

The author would like to thank Maria Barradas-Rodriguez, MD, Virginia Kan, MD, and Herschel Knapp, PhD, for their assistance on this project. The research reported here was funded by a VA Quality Enhancement Research Initiative (QUERI) Service Directed Project grant awarded to the author and supported by the VA, VHA, and HSR&D. The views and opinions expressed in this article are those of the author and do not necessarily represent the views of the US Department of Veterans Affairs. The VHA supported this study but had no input into its design or reporting, or the decision to submit this paper for publication. The study from which this project originated was reviewed and approved through a US Department of Veterans Affairs Institutional Review Board (IRB) process.

Funding Support

This research was funded by VA Quality Enhancement Research Initiative (QUERI) grant SDP 07-318, awarded to the author, and supported by the Department of Veterans Affairs, Veterans Health Administration (VHA), Health Services Research and Development Service (HSR&D).

References
