
CORRECTION article

Front. Health Serv.

Sec. Implementation Science

Volume 5 - 2025 | doi: 10.3389/frhs.2025.1724907

The guided understanding of implementation, development & education (GUIDE): a tool for implementation science instruction

Provisionally accepted
Frontiers Media SA, Lausanne, Switzerland

The final, formatted version of the article will be published soon.

The use of implementation science (IS) continues to increase, creating a need for the development and deployment of practical teaching tools for widespread use. Ideally, such tools would be simple, a concept championed by Geoffrey Curran in his description of IS centered on "the thing" (1). The subsequent rapid adoption of this plain-language terminology and explanation demonstrates the value of accessible language for learners and others who are new to the field. As instructors, we the authors educate and consult with trainees and investigators who want to integrate IS methods into their research. However, even scientists with robust training in biomedical and health services research struggle to apply IS to a given evidence-practice gap. Additionally, as implementation scientists, we partner with colleagues and scholars new to IS and must orient new staff to our projects. In such partnerships, we find ourselves teaching fundamental IS concepts to our teams to facilitate team-based inquiry.

In recognition of the difficulty we and others encounter in guiding new users to articulate "the thing" (including core and adaptable components) and the evidence-practice gap or problem, we developed and refined the Guided Understanding of Implementation, Development & Education (GUIDE) Tool. This tool combines and builds on implementation mapping (2) and the Implementation Research Logic Model (IRLM) (3), which are useful ways to systematically organize IS information and to guide the articulation of implementation strategies (using ERIC or Behaviour Change Techniques) and relevant outcomes (using the Proctor Outcomes Framework). The resulting GUIDE may help researchers and trainees new to IS align their inquiry with an IS lens and supports understanding of the key aspects of IS.

The approach of taking an existing implementation tool designed for research and evaluation and adapting and simplifying it to introduce learners to IS has been used before with Getting to Implementation (4) and Getting to Implementation-Teach (5). In the same way, we do not seek to strip away the power and contributions of any individual existing implementation science resource or tool (e.g., implementation mapping, the IRLM, ERIC strategies, Proctor's outcomes). Instead, we seek to build on their foundations and popularity to introduce a new group of scientists to IS.

The reasons trainees and investigators come to IS are heterogeneous. We designed the GUIDE Tool to be responsive to these needs (see Figure 1). Supplementary Additional File 1 contains an editable version of the GUIDE Tool. Representative examples from our work include:

• A surgeon has an existing line of research and has been advised to add IS to their portfolio.
• A nurse practitioner has identified a practice of interest with limited evidence and seeks to identify the evidence-practice gap and test the effectiveness and implementation of said practice.
• A psychologist has an existing research question and study design and is considering finding a supplemental way to add IS approaches.

Given this context, we include a research question box in the top left corner of the GUIDE Tool to allow users to anchor their learning and completion of the model within the framing of their existing research questions. There is not a preferred or linear flow to completing each of the sections of the GUIDE with learners' own knowledge or topic area. Instead, each section represents a key aspect of implementation inquiry to be seriously considered by the user.
Learners are introduced to each aspect of IS over time (see Using the GUIDE to teach the key aspects of IS). As knowledge increases, we encourage trainees to complete what portions they can with existing knowledge. We recommend the GUIDE Tool as an educational tool for teaching scholars and trainees new to IS about the key aspects of implementation research inquiry and evaluation over the course of a semester or workshop session (see Using the GUIDE Tool to teach the key aspects of IS for more details). In conjunction with its use as an educational tool, we have also found the GUIDE Tool helpful for scholars and trainees to organize information they already have as they begin to think about designing an implementation study. In our experience, when novice IS users understand the key aspects of IS, they can begin to use IS evaluation tools as they were originally designed, including determinant frameworks [outlined by Nilsen (6)], change objectives (2), implementation strategies (7)(8)(9), mechanisms (10), and outcomes (6,11,12).

The goal of the evidence-practice gap section is to clearly identify the social problem of interest and the corresponding evidence-practice gap. Some questions to help articulate this gap include: What populations are impacted by this gap? In what places (geographic) or service settings does this gap exist? How do people experiencing this gap think about it?

We recognize that evidence-based practice, innovation, practice of interest, and "the thing" are among many synonyms, and that some synonyms have epistemological valences which go beyond the scope of this paper. That said, concretely identifying and specifying "the thing" is critically important to implementation science. This includes identifying the level of evidence required for the specific practice to be considered "evidence-based." We recognize that some fields require multiple large randomized controlled trials (e.g., pharmaceutical therapeutics), while in other fields innovations may be considered evidence-based on less rigorous trial designs due to ethical concerns (e.g., innovations targeting children or pregnant people).

Within the evidence-based practice, we prompt scholars to consider core and adaptable components. This practice helps to identify possible areas for scientific inquiry and recognizes that implementation in the real world requires compromise. We often recommend Figure 6 (p. 58) from the Implementation Facilitation Training Manual as an excellent resource for planning (13). We recognize that core and adaptable components of the evidence-based practice are often not specified in the existing literature. This introduces concerns not only because it can create complexity in implementing with fidelity, but also because the core components of an EBP are often those associated with identifying the mechanisms of change. The core components of an evidence-based practice are the defining characteristics of the innovation without which the innovation would not exist. A common question we ask is, "What has to happen in order for you to consider 'the thing' to be 'the thing'?" The adaptable components of an evidence-based practice are aspects that can be changed in small ways (e.g., delivered in person or via telemedicine) or be skipped altogether. Scholars can offer tailoring and flexibility to implementers by prospectively identifying aspects of the innovation to be adapted (and save themselves headaches when "life" happens).
Determinants are factors that get in the way of, or support, the ability of an individual, group, organization, or community to do "the thing." Determinant frameworks are lists of potential determinants or constructs, often organized into domains (6). Common determinant frameworks are the updated Consolidated Framework for Implementation Research (14) and the Tailored Implementation for Chronic Diseases (TICD) checklist (15). Historically, there has been a binary view of determinants in that they are either barriers or facilitators. However, we take a valence-agnostic approach: over time and in evolving situations, determinants can act as both barriers and facilitators for the implementation of an innovation.

Change objectives were first introduced by Fernandez and colleagues as part of implementation mapping (2). We view change objectives as the incremental steps between determinants and implementation strategies; they provide transparency for why a given implementation strategy was selected and allude to potential mechanisms of change. The addition of change objectives to the original IRLM seeks to strengthen the important link between determinants and implementation strategies.

Implementation strategies are "the stuff" we do to help people, places, groups, organizations, and communities do "the thing" (1). The most common taxonomy is the Expert Recommendations for Implementing Change (ERIC) list of 73 implementation strategies, organized into nine clusters (7,8). We find ERIC to be a helpful starting point for identifying potential implementation strategies and then encourage clear specification of the implementation strategy to promote clarity and transparency (16). Other compendia of implementation strategies, such as the Behaviour Change Techniques list from the Behaviour Change Wheel, could also be used here (9).

Mechanisms, or mechanisms of change, are the processes through which an implementation strategy affects the targeted outcome (10). Mechanisms are a recent addition to the field of IS, with diverse perspectives on their usefulness. Therefore, we encourage trainees to consider potential mechanisms of change, recognizing that implementation mechanisms are an emerging topic in the field.

Outcomes include implementation outcomes in addition to service and patient outcomes (11,17). Evaluation frameworks such as the Reach, Effectiveness, Adoption, Implementation and Maintenance framework (12,18) and the Proctor framework (11) provide useful taxonomies for guiding the identification, specification, and measurement of implementation outcomes. We recognize the importance of implementation, process, and effectiveness outcomes and include distinct boxes within the Outcomes section for operationalizing each outcome of interest.

The social ecological model, first introduced by Urie Bronfenbrenner, identifies levels at which an activity may occur and recognizes the interaction across these levels (19), often visualized as concentric circles. The individual, microsystem, mesosystem, exosystem, and macrosystem can be tailored to an implementation context (e.g., patient, provider team, unit, hospital, health system, policy context). We have found that identifying the social ecological model level across the GUIDE, especially for determinants, change objectives, and strategies, allows for greater focus in evaluation.
For example, when considering the embedded example of chronic pain management (see below), the change objective of "Demonstrate how to measure and diagnose chronic pain in primary care" occurs at both the individual and clinic levels. Therefore, implementation strategies to accomplish this behavior should target individual clinicians and the clinic as a whole.

We encourage users to print a copy of the GUIDE Tool or use Microsoft PowerPoint to complete the GUIDE with all the information they know from preliminary data collection, the peer-reviewed literature, and community knowledge. The completed GUIDE Tool may provide visual cues for existing gaps in knowledge and potential targets for future evaluation (3,20) (see Using the GUIDE to plan an IS study for novice IS users). To further support these goals, we developed a complementary worksheet (see Supplementary Additional File 2), which includes prompts for each corresponding section of the GUIDE to help those new to IS.

We used the GUIDE as the organizing structure for the Foundations of Implementation Science course, the first introduction to IS within the IS Certificate program at the University of Pennsylvania. The goal of this course is to introduce trainees and scholars new to implementation science to the key aspects of the field and provide examples of how they can apply IS principles to their own scholarship. The original course was developed and taught by the senior author (MLF) in Fall 2023. In Fall 2024, the authors (LEA and MLF) co-taught the course and aligned the content with each aspect of the GUIDE Tool (see Figure 2). The first author (LEA) taught a workshop for Doctor of Social Work students, delivered several guest lectures, and is again teaching the Foundations course in Fall 2025, using the GUIDE as an organizing structure and teaching tool. We further operationalize how each component of IS represented in the GUIDE fits into learning through the course objectives listed in Table 1.

Throughout the course, the GUIDE functions as a roadmap for learners to identify how what they are learning fits within the greater picture of IS. For example, in Classes 6, 7, and 8 we discuss implementation and dissemination strategies. By introducing the concept later in the semester, we support learners' understanding of how implementation strategies connect to both change objectives and implementation determinants. Using the GUIDE, learners may visualize how these strategies connect to mechanisms and implementation outcomes.

Figure 2. Example of the GUIDE for teaching: a slide used throughout the semester of the Foundations of Implementation Science course to help orient learners to how the current session fits into the larger picture of IS.

Not all class sessions are represented by the GUIDE, as a typical semester has 12-15 weeks. This allows us to be responsive to the needs of our learners and incorporate other important topics in implementation science, such as implementation mapping (Class 10), community engagement in non-health fields (Class 11), best practices in reporting IS studies (Class 13), and a "Choose your own adventure" session where we discuss IS topics of interest that arise throughout the semester (Class 14). As previously mentioned, some IS trainees come to the field with existing research or a study in mind without designing specifically for IS.
Here, we introduce a case example of how the GUIDE can be applied, based on a previous qualitative study conducted by the first author (LEA) while they were a PhD student and a novice IS user. The study examined the determinants of dissemination and implementation of evidence-based chronic pain management in primary care (21). Briefly, interviews were conducted with primary care providers across multiple health systems to better understand factors that impact their ability to learn about (dissemination) and use (implementation) evidence-based chronic pain management, primarily focusing on non-pharmacologic approaches.

As shown in Figure 3 (below), we used the study results to complete the GUIDE and identified gaps for future work. For example, starting with EBP core components, we recognized that while PCPs gave some examples of evidence-based chronic pain management, they did not fully articulate (or did not know) what duration, dosage, or specific aspects are required for physical therapy and/or cognitive behavioral therapy to meet the needs of people living with chronic pain. This highlights a larger gap in the field of innovation development that impacts implementation science. Additionally, the interviews did not discuss potential mechanisms of change to support a potential causal pathway. Hypothetically, if there existed a body of literature that had already explored mechanisms of change in this setting, we could add it to the GUIDE.

We identified at what level of the social ecological model each of the determinants and change objectives would act. For example, the determinant of the degree to which clinic rurality and the clinic population composition impact the ability to implement evidence-based chronic pain management is a community-level factor. In contrast, the change objective to identify alternative sources of funding for co-pays or treatment payment occurs both at the clinic level and at the health system level (depending on the organizational structure). This example shows how the GUIDE may help to organize existing knowledge about an evidence-practice gap into the language of IS and can identify key gaps in knowledge for future inquiry.

Figure 3. Example of GUIDE use in chronic pain management: an example of how the GUIDE can be applied to an existing research project for a scholar new to implementation science. Each section was completed using information from a qualitative research study in response to the prompts. For this example, the social ecological model was delineated as the individual, clinic, health system, and community. Both determinants and change objectives were aligned with the social ecological model to help guide future data collection and analysis.

Implementation science seeks to close the evidence-practice gap between what is currently happening and what we want to happen in an ideal world. We developed the GUIDE using the existing knowledge base from implementation mapping, the IRLM, and other resources, including ERIC and the Proctor outcomes framework, to create a teaching tool for learners new to IS. We hold that the innumerable lessons and evidence already learned in the field should be accessible both to scholars who spend most of their professional lives thinking about implementation science and to community providers who are looking to address a problem they see.

Keywords: implementation science, education, implementation research logic model, pedagogy, implementation mapping

Received: 14 Oct 2025; Accepted: 17 Oct 2025.

Copyright: © 2025 Production Office. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Frontiers Production Office, production.office@frontiersin.org

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.