
ORIGINAL RESEARCH article

Front. Health Serv., 26 September 2025

Sec. Implementation Science

Volume 5 - 2025 | https://doi.org/10.3389/frhs.2025.1613925

Documenting adaptations across the Accelerating Colorectal Cancer Screening and follow-up through Implementation Science research programs: methods and adaptation examples


Borsika A. Rabin1,2*, Erin S. Kenzie3,4, Jill M. Oliveri5, Aaron J. Kruse-Diehr6, Sonja Hoover7, Usha Menon8, Mark P. Doescher9, Prajakta Adsul10,11, Shiraz I. Mishra12, Kevin English13, Jesse Nodora14, Helen Lam15, Karen Kim15, Jennifer K. Coury3, Melinda M. Davis3,4,16, Teri Malo17, Sarah Kobrin18, Sujha Subramanian7 and Renée M. Ferrari17
  • 1Herbert Wertheim School of Public Health and Human Longevity Science, University of California San Diego, La Jolla, CA, United States
  • 2UC San Diego Altman Clinical and Translational Research Institute Dissemination and Implementation Science Center, University of California San Diego, La Jolla, CA, United States
  • 3Oregon Rural Practice-based Research Network, Oregon Health & Science University, Portland, OR, United States
  • 4OHSU-PSU School of Public Health, Oregon Health & Science University, Portland, OR, United States
  • 5Department of Population Sciences and Community Outreach, The Ohio State University Comprehensive Cancer Center, Columbus, OH, United States
  • 6Department of Family and Community Medicine, University of Kentucky College of Medicine, Lexington, KY, United States
  • 7Implenomics, Dover, DE, United States
  • 8The University of South Florida and Tampa General Hospital Cancer Institute, Tampa, FL, United States
  • 9Department of Family and Preventive Medicine and Stephenson Cancer Center, University of Oklahoma Health Sciences Center, Oklahoma City, OK, United States
  • 10University of New Mexico Comprehensive Cancer Center, University of New Mexico Health Sciences Center, Albuquerque, NM, United States
  • 11Department of Internal Medicine, University of New Mexico Health Sciences Center, Albuquerque, NM, United States
  • 12University of New Mexico Comprehensive Cancer Center and Department of Pediatrics, University of New Mexico Health Sciences Center, Albuquerque, NM, United States
  • 13Albuquerque Area Southwest Tribal Epidemiology Center, Albuquerque Area Indian Health Board, Inc., Albuquerque, NM, United States
  • 14UC San Diego Health Moores Cancer Center, University of California San Diego, La Jolla, CA, United States
  • 15Center for Asian Health Equity, University of Chicago, Chicago, IL, United States
  • 16Department of Family Medicine, Oregon Health & Science University, Portland, OR, United States
  • 17Lineberger Comprehensive Cancer Center, The University of North Carolina at Chapel Hill, Chapel Hill, NC, United States
  • 18Healthcare Delivery Research, National Cancer Institute, Bethesda, MD, United States

Introduction: Adaptations are common, expected, and often imperative for successful uptake and sustained implementation of clinical or public health programs in real-world practice settings. Understanding which adaptations have been made to evidence-based interventions and subsequent implementation strategies throughout the life cycle of a project can contextualize findings and support future scale-up of the program. Systematic documentation of adaptations is rarely conducted or reported, and little guidance exists on approaches to documenting adaptations.

Methods: Accelerating Colorectal Cancer Screening and follow-up through Implementation Science (ACCSIS) is a National Cancer Institute-funded Beau Biden Cancer Moonshot℠ Initiative developed to improve colorectal cancer screening, follow-up, and referral for care among underserved groups, including diverse racial and ethnic populations and people living in rural areas. Using an iterative data gathering approach—a survey, data abstraction, and data validation—we compiled information about adaptation documentation and analytic methods and intervention and implementation strategy adaptations from the eight funded ACCSIS research programs. An analytic team representing multiple ACCSIS programs reviewed, coded, and summarized the data using a rapid qualitative analytic approach.

Results: ACCSIS programs varied substantially in how they defined and documented adaptations. Nine approaches were used to document adaptations; the most common were periodic reflections and review of meeting minutes and agendas. Nine analytic methods were reported to guide adaptation analysis; the most frequently mentioned were rapid qualitative methods, descriptive statistics, and mixed-methods analysis. A total of 96 adaptations were reported by the eight research programs, most of which occurred during the pre-implementation stage (68%) or were made to the program format (71%). Only 36% of the adaptations were due to the COVID-19 pandemic.

Conclusions: Our multi-method, systematic approach allowed us to explore how sites document and analyze adaptations across eight ACCSIS Moonshot programs. Using a systematic approach allowed for comparisons of intervention and strategy adaptations within and across research programs and can inform the science of adaptations, while building a knowledge base of why such adaptations are needed and how they can inform implementation efforts across time. Methods described herein provide a template for similar assessment activities in other large, multi-site research initiatives.

1 Introduction

Adaptations—defined as changes to an intervention or implementation strategy to increase fit to the context—are common, expected, and often necessary for the successful uptake and initial, ongoing, and sustained implementation of a program in a real-world setting (1). Understanding what adaptations have been made to both evidence-based interventions and implementation strategies throughout the life cycle of a study can help with the interpretation of findings, inform needed refinements, and support future scaling of the program in different settings (2–4). Systematic documentation of adaptations has not been conducted broadly and remains a pressing need for the field of implementation science (5, 6). Furthermore, optimal approaches to documenting adaptations in complex studies and/or across multi-site initiatives, and how to assess the impact of these adaptations on key implementation and effectiveness outcomes, are still not well understood (7, 8).

Although comprehensive frameworks exist to guide the process of adaptation in the pre-implementation phase (9, 10) and to provide a nomenclature of the type of adaptations to interventions and implementation strategies (11, 12), guidance is lacking on how to collect and analyze data about adaptations in the context of research studies. There is a growing consensus that more than one method should be used to document adaptations; however, it is unclear what combination of methods for documenting adaptations yields the most meaningful information (13). There is even less guidance on how to assess the impact of adaptations on diverse implementation outcomes.

Finally, using systematic approaches to document adaptations across multiple research programs (RPs) longitudinally would allow for comparison of intervention and strategy adaptations both within each RP and across RPs over time. This approach can also provide a template for adaptation documentation and assessment activities in other large research initiatives.

To address these needs, we aimed to describe (1) the process of developing a cross-RP documentation system for a large consortium, (2) the methods used for tracking intervention and strategy adaptations across RPs, (3) barriers and facilitators to implementing those methods, and (4) potential value to advance implementation science. We review and summarize adaptations made in pre-, early-, and mid-implementation phases and discuss next steps to refine the tracking system and process moving forward.

2 Methods

2.1 The ACCSIS consortium

This study was conducted as part of the National Cancer Institute-funded Beau Biden Cancer Moonshot℠ Initiative (14). The overall aim (15) of ACCSIS is to conduct multisite, coordinated, transdisciplinary research to evaluate and improve colorectal cancer (CRC) screening processes using implementation science.

NCI began funding ACCSIS RPs in 2018 to improve CRC screening, follow-up, and referral for care among populations that have low CRC screening rates (16), with a focus on underserved groups, including diverse racial and ethnic populations and people living in rural areas. The initiative comprises a multisite consortium that includes a Coordinating Center (Research Triangle Institute; RTI) and eight RPs located across the United States, working with diverse populations including American Indian communities, other racial and ethnic minority groups, rural populations, and people who are uninsured. Additional File S1 provides an overview of the key objectives, geographical locations, and populations served for the eight ACCSIS RPs.

2.2 Origins of adaptation documentation in ACCSIS

The objectives and methods for the work presented in this manuscript emerged from a series of Intervention, Strategies, and Adaptations Workgroup meetings conducted between summer 2020 and fall 2021 involving representatives from all eight ACCSIS RPs. Within the larger ACCSIS initiative, the workgroup's purpose is to contribute to the development of effective and sustainable multilevel interventions and strategies across all RPs. The workgroup identified documentation of adaptations as a shared interest and engaged in a series of consensus discussions, reviewed the literature, and invited experts in the field to present about their experience in documenting adaptations. Although there was initial variation in the level of interest and expertise across the RPs regarding documenting and analyzing adaptations, all RPs ultimately agreed to document adaptations. Data collection about adaptations and documentation methodology was conducted cross-sectionally starting in fall 2021.

2.3 Data collection and analysis for methods to document and analyze adaptations

To understand how individual RPs defined adaptations, collected information about adaptations, and intended to analyze the adaptations, we developed an instrument based on prior publications by some of the authors of this manuscript (BAR, ESK, JKC, MMD) along with input from all members of a smaller writing group representing four of the RPs (California, Kentucky, North Carolina, Oregon), NCI, and RTI. The final version of the data collection instrument is available as Additional File S2 and includes 13 questions about how adaptations were defined; who decided about what changes to the intervention, implementation strategies, and RP were considered adaptations; theoretical frameworks guiding adaptation documentation and analysis; processes and methods used to document adaptations (i.e., how, who, from whom, using what type of data, how frequently, what type of information); and methods planned for analysis of adaptations. Data from the surveys were summarized by RP and across programs using descriptive statistics (frequencies and averages).

2.4 Documenting and reviewing adaptations

To operationalize our adaptation documentation approach, we asked each RP to share a list of adaptations made to their intervention or implementation strategies, from program inception until a cut-off date selected by the workgroup to bound data collection (all studies were in the pre- to mid-implementation phase at the time of the request). The intention was not to collect a complete list of adaptations but instead to assess the feasibility of collating adaptations from the eight RPs. Throughout the manuscript, we refer to these sample adaptations simply as “adaptations” for clarity and ease of reporting. Adaptations were classified by each program as occurring during the pre-implementation, early-implementation, or mid-implementation phase. Pre-implementation indicated an adaptation that happened prior to the delivery of the intervention (e.g., prior to the patient list being pulled by the care coordinator). Early implementation was classified as during early implementation activities (e.g., after identification of the eligible patient list but prior to mailing the test kits), while mid-implementation was defined as during the more advanced implementation of the program (i.e., after mailing the test kits).

Adaptation data were collected from RPs and then collated and organized in an Excel database. Each RP sent the following information about their adaptations: (1) what the adaptations were, (2) when they were made, (3) the reasons for the adaptations, (4) what component(s) of the program were adapted, and (5) whether the adaptations were related to the COVID-19 pandemic. The collated adaptations from all RPs were then reviewed for consistency, and additional coding was undertaken by a small analytic team (BAR, ESK, JMO, AK-D, RMF) to identify sub-categories for the adaptation description and the reason the adaptation was made. The workgroup coded the adaptation data using FRAME, an expanded framework for reporting adaptations and modifications to evidence-based interventions (11). We used a subset of the FRAME constructs instead of the full set for our combined database to make the documentation process more pragmatic and accessible to all participating RPs. We used a team-based approach to coding in which at least two coders reviewed each adaptation entry and reached consensus on the appropriate category using a shared codebook. Instead of calculating inter-coder agreement, we used a consensus approach to resolve disagreements in coding. The coded adaptations were then returned to each RP for member-checking (17, 18), and refinements and modifications were incorporated based on feedback from the RPs. Characteristics of adaptations were then summarized using descriptive statistics.
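The collation and summarization workflow described above can be sketched in a few lines of code. This is an illustrative example only: the field names, phase labels, and category values below are simplified assumptions for demonstration, not the actual ACCSIS database schema or FRAME codebook.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical, simplified record mirroring the five fields each RP submitted;
# the real ACCSIS data were maintained in an Excel database with additional
# FRAME-based codes applied by the analytic team.
@dataclass
class Adaptation:
    research_program: str
    description: str      # (1) what the adaptation was
    phase: str            # (2) when it was made: "pre", "early", or "mid"
    reason: str           # (3) why the adaptation was made
    component: str        # (4) which program component was adapted
    covid_related: bool   # (5) whether it was related to the COVID-19 pandemic

def summarize(adaptations):
    """Descriptive statistics over collated, coded adaptation records."""
    n = len(adaptations)
    return {
        "total": n,
        "by_phase": dict(Counter(a.phase for a in adaptations)),
        "by_component": dict(Counter(a.component for a in adaptations)),
        "pct_covid": round(100 * sum(a.covid_related for a in adaptations) / n),
    }

# Example usage with two made-up entries (not real ACCSIS adaptations)
records = [
    Adaptation("RP-A", "Mailed FIT kits in batches", "pre",
               "respond to external pressures", "mailed FIT process", True),
    Adaptation("RP-B", "Added navigator follow-up call", "early",
               "improve fit to context", "care coordination/navigation", False),
]
print(summarize(records))
```

A structured record like this, even in a spreadsheet, is what makes cross-program tallies (by phase, component, or reason) straightforward once entries are harmonized.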

3 Results

Findings are summarized in two sections. First, we summarize information about methods used across the eight RPs to document and analyze adaptations (Tables 1 through 3B). Next, we describe findings from the compilation and coding of the adaptations shared with the workgroup by all RPs (Table 4 and Additional File S3).

Table 1
www.frontiersin.org

Table 1. Adaptation definitions and personnel defining adaptations by ACCSIS research program.

3.1 Approaches to defining and capturing adaptations

ACCSIS RPs varied substantially in how they approached capturing adaptations to their intervention and implementation strategies. Some programs used a comprehensive approach to collect a variety of data, using multiple methods guided by a framework, while others collected less comprehensive adaptation information.

Table 1 provides an overview of each RP's definitions of adaptations, the team members who decided what constituted an adaptation to their program, and what (if any) frameworks were used to guide adaptation documentation and analysis. Adaptation definitions included broad, inclusive definitions such as “Response to circumstances that were not anticipated in pre-implementation planning phase” (ACCSIS Arizona), as well as more specific definitions: “Changes to intervention strategies based on unique population context and settings, as guided by multisector action team which guides the implementation process at each health care facility; includes both COVID and non-COVID changes” (ACCSIS New Mexico). Most definitions included the concept of improving fit to context (setting, population), and multiple definitions included reference to both intervention and implementation strategy. The role of the COVID-19 pandemic was explicitly mentioned in two definitions.

Five of the eight RPs relied on expertise from the research team and either implementation partners or research and implementation partners in deciding what counts as an adaptation; three RPs primarily relied on the research team for these decisions. Over half of the RPs reported the use of one or more frameworks to guide the documentation and analysis of adaptations. Frameworks that were identified included the 7-step adaptation framework (19), the Framework for Reporting Adaptations and Modifications to Evidence-based Interventions (FRAME) (n = 2) (11), the Framework for Reporting Adaptations and Modifications to Evidence-based Implementation Strategies (FRAME-IS) (20), the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance)-expanded FRAME (13), the Integrated Promoting Action on Research Implementation in Health Services (iPARIHS) framework (21), and the Dynamic Sustainability Framework (DSF) (3).

3.2 Methods for documenting and analyzing adaptations

Table 2 displays the diversity in the number and types of methods that RPs used for the documentation, analysis, and operationalization of adaptations, stratified by RP. Tables 3A and 3B provide a summary of the Table 2 data, organized by adaptation characteristic (e.g., method, analysis plan, data collection frequency). Specifically, Table 3A summarizes the methods used and planned analyses, and Table 3B summarizes how the RPs operationalized their methods.

Table 2
www.frontiersin.org

Table 2. Key characteristics of adaptation documentation by ACCSIS research program.

Table 3A
www.frontiersin.org

Table 3A. Summary characteristics of method used to document and analyze adaptations in the 8 ACCSIS research programs.

Table 3B
www.frontiersin.org

Table 3B. Operationalization of methods used to document adaptations in the 8 ACCSIS research programs.

Nine unique approaches were used to document adaptations (Table 3A). RPs used an average of three methods to capture adaptations, with a range of one to seven. The most commonly used approaches were periodic reflections (22) with research teams and implementation partners and review of meeting minutes and agendas (five RPs each). Additional methods included interviews with implementers and adaptation tracking at clinical sites (three RPs each), and review of clinical contact logs and field notes, informal check-ins with clinical or community partners, and member-checking of data for missing adaptations and accuracy (two RPs each). One RP also used a real-time adaptation tracking database, and one used reminder emails about adaptations to the research team and partners.

Programs reported nine different analytic methods used to guide the analysis of adaptations (Table 3A). Individual RPs reported on average two analytic methods with a range of one to six. Most frequently reported methods included rapid qualitative analysis and descriptive statistics (three RPs for each) followed by mixed-methods analysis, standard (in-depth) qualitative analysis, content analysis, and analyses to examine the impact of adaptations on various outcomes (two for each). One RP also reported three additional innovative methods for analysis: causal-loop modeling (visual representation of interconnected relationships among key variables) (23), member-checking with a local advisory board (verifying findings with community members/research participants), and configurational comparative methods and qualitative comparative analysis (24). Two programs were undecided about methods for analysis.

All documentation methods relied on the research team to gather information; two methods also involved the implementation partner or a research and implementation facilitation partner (adaptation tracking at the clinical site, real-time adaptation tracking database). Roles within the research team included implementation specialists and program managers. Most methods allowed for the collection of adaptation information from both the research team and implementation partners (i.e., clinical staff and providers at partner health centers). Only two methods were used to gather information from implementation partners exclusively (adaptation tracking at the clinical site and informal check-ins with clinical or community partners). Most methods involved collecting both qualitative and quantitative data; two involved the collection of only qualitative data (interviews with implementers, reminder emails about adaptations to research teams and partners). Programs varied in how they operationalized each method in terms of the frequency of data collection and the adaptation information collected (Table 3B).

3.3 Characteristics of adaptations

Table 4 provides an overview of adaptations reported by the eight ACCSIS programs. A detailed description of each adaptation is provided in Additional File S3. A total of 96 adaptations were reported by the eight RPs as examples of the types of adaptations implemented during the initial study period. The number of adaptations reported across the RPs varied, with one program reporting as many as 22 adaptations and another reporting only 3.

Table 4
www.frontiersin.org

Table 4. Summary characteristics of the 96 adaptations reported across the 8 ACCSIS research programs.

Most of the adaptations occurred during the pre-implementation stage (68%), followed by early implementation (26%) and mid-implementation (6%). Most adaptations were made to the format or how the intervention is presented (71%); others involved the addition of a component (31%), substitution of a component (19%), or other changes to the intervention (16%). Adaptations most often involved changes to the process for mailing stool-based CRC screening tests to patients (mailed fecal immunochemical test) (43%) or to care coordination/navigation (35%). Adaptations were made to increase the consistency of delivery and fit of the program (implementation) (36%), to enhance the impact or success of the intervention for all or important subgroups (effectiveness) (31%), or to respond to external pressures (31%). Only 36% of the adaptations were attributed to the COVID-19 pandemic.
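Because a single adaptation could be coded to more than one category, the percentages above need not sum to 100%. A minimal sketch of how such multi-label tallies can be computed is shown below; the category labels are illustrative stand-ins, not the actual codebook used in this study.

```python
from collections import Counter

# Illustrative multi-label records: each adaptation carries one or more
# FRAME-style codes, so each category's percentage is computed against the
# total number of adaptations, and the percentages can exceed 100% in sum.
adaptation_codes = [
    {"format", "add_component"},
    {"format"},
    {"substitute_component", "format"},
]

n = len(adaptation_codes)
counts = Counter(code for codes in adaptation_codes for code in codes)
percentages = {code: round(100 * c / n) for code, c in counts.items()}
print(percentages)  # category percentages; totals may exceed 100%
```

Computing denominators against the total adaptation count (rather than the total number of codes) is what makes figures like 71% + 31% + 19% + 16% internally consistent.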

4 Discussion

To date, we have successfully established a shared vision of tracking adaptations across the ACCSIS initiative, guided by a series of workgroup meetings, review of the literature, and presentations by experts in the field. This vision included both agreement on the need for documenting and reporting adaptations across the consortium and defining adaptations for our purposes. RPs reported methods they used to track adaptations and documented and reported their programs' pre-, early-, and mid-implementation adaptations.

We synthesized adaptation definitions and documentation methods used across the eight RPs and described the significant variation in definitions, key personnel, and data collection and analytic methods used. We also tested the feasibility of collating information about adaptations across the RPs using a harmonized instrument identifying a small set of key characteristics of adaptations.

Some adaptations were reported by RPs as complex changes that included more than one modification. We also noted variation in the number of reported adaptations across programs, possibly due to the diversity in definitions: some programs documented more nuanced changes, whereas others documented broader, bigger-picture changes. We kept reported adaptations as submitted (whether they encompassed one or a collection of changes), classifying each as a single adaptation as defined by the RP. However, future work could refine how adaptations are defined (e.g., content vs. delivery adaptations) and apply that definition consistently across programs.

Throughout our process, we found it challenging to determine what constitutes an adaptation. Does scope matter? Does magnitude? For example, does tailoring patient-facing materials with a logo that includes a new community partner's name constitute an adaptation, or is it simply “standard practice” in study start-up? We wrestled with this example and others. In our discussions, we observed that the perceived size or type of an adaptation was often not the best predictor of its importance (as perceived by the RPs), and that understanding the impact of the adaptation on implementation and effectiveness outcomes can be a better indicator of whether an adaptation (regardless of its magnitude) should be documented. In the logo change example, a determination needs to be made as to whether the modification was expected to lead to better implementation or effectiveness outcomes, such as improved reach into the community. If yes, it should be considered an adaptation and documented as such. If no meaningful change in outcomes is expected, the change would not rise to the level of an adaptation. As such, the same change could be seen as an adaptation or not, depending on the study context. This determination took much discussion and time and underscores the need for greater guidance around adaptation documentation, such as criteria that can be applied each time the “is it an adaptation?” question arises. McCreight and colleagues describe a similar approach to determining whether a change should be documented as an adaptation (18). Although not within scope for this project, understanding the impact of adaptations on outcomes will be critical to advancing the study of adaptations within implementation science.

Despite agreement across the consortium to document and report adaptations, several RPs initially expressed some hesitancy to participate in the data collection process. This reluctance was largely related to the time and effort involved in tracking, documenting, and reporting the adaptations in light of competing priorities. To a lesser extent, reluctance stemmed from the novelty of adaptation tracking: awareness that adaptations take place in planned interventions did not translate into a deep understanding of, or agreement with, the importance of capturing them. The limited time available for tracking and the lack of ready tracking systems and tools were barriers to embracing and executing adaptation tracking in practice, and thus are critical considerations when planning for documenting adaptations.

We offer additional considerations for documenting and sharing information about adaptations for groups such as consortia who are considering tracking adaptations across their respective partners. These suggestions highlight the importance of giving attention to adaptations, defining them a priori, and having simplified methods of data collection, tracking, analysis, and reporting.

4.1 Develop a shared approach to adaptations

It will be critical for research teams across sites and/or programs to have a shared vision of the role of adaptations and to agree on the purpose of tracking them for their specific project. At a minimum, this would include group discussions of the relevance of and rationale for documenting, analyzing, and reporting adaptations, as well as consideration of how adaptations are defined and harmonized across programs. The vision could include agreement on a minimum number or type of adaptations to track or a specified period of tracking.

4.2 Clarify key intervention elements

We suggest that each entity define the key elements of both the intervention and the implementation strategies and, when appropriate, harmonize these across RPs. This will serve to develop a shared understanding of both components and strategies as a first step to identifying adaptations to each. Programs aiming to do this can consider applying a framework or process, such as the functions and forms approach (25, 26).

4.3 Utilize harmonized instruments

Likewise, agreement on the use of shared tools (e.g., an Excel matrix) will be critical for comparing adaptations across groups. Development of a harmonized instrument with agreed-upon constructs, sub-constructs, and their operational definitions will aid data collection from, and comparison across, multiple RPs.

4.4 Develop an analysis and dissemination plan

Finally, we recommend that groups give thought ahead of time about what to do with the adaptation information collected, including how to analyze it, how to share it, in what form, and with whom. Will the findings be used to inform scale-up? To promote sustainment? For shared learning, and if so with whom? How will the information get back to clinic and community partners, and how can they be helped to utilize the information? These questions and others are important to consider, plan for, and document, if the collection of adaptations is to serve a greater purpose beyond an academic exercise.

The work herein provides an understanding of how information about adaptations is documented across diverse settings, setting the stage for understanding how tracking and analyzing adaptations can inform implementation and interpretation of study results within and across research studies. Understanding adaptations across study sites can help trans-ACCSIS efforts to interpret findings across the consortium, maximizing learning. Variation in implementation is inevitable, and lack of understanding of the ways in which sites vary can interfere with interpreting results and limit shared learning. Likewise, this work underscores the importance of tracking adaptations within single sites as well—changes to interventions and implementation strategies are inevitable and ignoring them can interfere not only with analysis, but also with understanding impact, replication, or sustainment efforts.

One of the earliest efforts to document adaptations in a larger program was established in two consecutive iterations of the U.S. Department of Veterans Affairs-funded Quality Enhancement Research Initiative (QUERI): the Triple Aim and Quadruple Aim QUERI programs. In these programs, a similar system was set up to document adaptations across the research projects (17, 18, 27).

Parallel to our efforts to document adaptations systematically across RPs in the ACCSIS Program, other National Institutes of Health (NIH)-funded initiatives have also engaged in such efforts, some of which are still in progress. Smith and colleagues (28) developed a novel systematic methodology (Longitudinal Implementation Strategies Tracking System; LISTS) to classify, document, and track the use of implementation strategies across three trials focused on improving symptom management as part of the NCI-funded Improving the Management of symptoms during and following Cancer Treatment (IMPACT) Consortium (28). We are also aware of ongoing efforts through the National Heart, Lung, and Blood Institute-funded Disparities Elimination through Coordinated Interventions to Prevent and Control Heart and Lung Disease (DECIPHeR) Alliance (29, 30) and the NCI-funded Exercise and Nutrition Interventions to Improve Cancer Treatment-Related Outcomes (ENICTO) Consortium (31–33) to establish cross-program adaptation documentation using a combination of the RE-AIM expanded FRAME (13) and the function and form matrices (25). Although beyond the current scope of our project, future endeavors should investigate which combination of methods for documenting adaptations provides the most meaningful information, as well as how to evaluate the impact of adaptations on a wide range of implementation outcomes.

Key strengths of our study included the use of multiple data sources, the development of a systematic, theory-driven approach applicable across multiple RPs, and engagement with diverse partners throughout the documentation process.

Our study has some important limitations. Practical considerations led us to select a subset of constructs from the FRAME framework (11). Applying the full framework, including conceptualizing and operationalizing the constructs and conducting data collection and analysis, would not have been feasible given program staff time constraints. We selected what we believed to be the most pragmatic constructs but may have missed other constructs important to understanding adaptations across our programs. This decision reflected a necessary balance between complete framework application and the feasibility of collecting adaptation information, a trade-off that requires careful consideration in any research study undertaking documentation of adaptations.

To further reduce burden, and in the spirit of piloting this work, we asked each RP to provide examples of adaptations made rather than a comprehensive list. The examples were intended as a first pass at understanding what adaptations were being made, how they were operationalized, and how they were being captured. As such, we cannot make claims or assumptions about the entirety of adaptations made across programs. We cannot, for example, make inferences about the number of adaptations made by intervention stage (e.g., early, mid, late). While the decision to include sample adaptations was intentional (i.e., to demonstrate the feasibility of the methodology), it limits the generalizability of our findings and keeps the primary focus on the methodology.

Although RPs had diverse definitions for adaptations, over time and through group discussions and multiple iterations of developing the tables, there was a natural process of harmonization. Thus, although the definitions provided in the tables reflect some diversity of thought, there is also potential for groupthink; individual program variation specific to context may have been lost. However, programs shared their open and forthright thoughts about adaptations in their context, suggesting that program-specific ideas and practices remained evident throughout the adaptation data collection process. Paradoxically, the process of defining adaptations also helped programs reflect on the meaning and implications of adaptations and develop a shared understanding of adaptations in their specific contexts, suggesting that settling on a definition, even one tending toward the center, also reflects careful consideration by individual programs.

Our analysis cannot address how adaptations to interventions and implementation strategies interact. That exploration is underway in a parallel program and did not inform the current analysis. Furthermore, the current paper is primarily focused on demonstrating the feasibility of documenting adaptations across multiple RPs and did not attempt to analyze the impact of adaptations on diverse implementation and effectiveness outcomes. Although such work is still relatively new to the field, a recent publication by Aschbrenner and colleagues (34) offers guidance on how to consider adaptation impact. Future work should incorporate assessment of the impact of adaptations on key outcomes.

5 Conclusion

Our work provides an important contribution to understanding how adaptations are characterized and monitored in practical settings and illuminates the need for pragmatic approaches and multiple methods of capturing adaptations tailored to the priorities and resources of RPs and partners. This understanding can help implementation scientists and community members alike appreciate the role and power of adaptations in their own work. Consideration should be given to the importance of monitoring adaptations, defining them a priori, and using simplified methods of data collection, tracking, analysis, and reporting. The approach presented in this paper can provide a template for future replication studies and adaptation tracking in other consortia.

Data availability statement

The original contributions presented in the study are included in the article/Supplementary Material; further inquiries can be directed to the corresponding author.

Author contributions

BR: Methodology, Conceptualization, Investigation, Writing – review & editing, Writing – original draft, Formal analysis. EK: Formal analysis, Writing – original draft, Conceptualization, Methodology, Writing – review & editing. JO: Formal analysis, Writing – original draft, Methodology, Writing – review & editing, Conceptualization. AK: Formal analysis, Writing – original draft, Methodology, Conceptualization, Writing – review & editing. SH: Writing – original draft, Writing – review & editing. UM: Writing – review & editing, Writing – original draft. MD: Writing – review & editing, Writing – original draft. PA: Writing – review & editing, Writing – original draft. SM: Writing – review & editing, Writing – original draft. KE: Writing – original draft, Writing – review & editing. JN: Writing – original draft, Writing – review & editing. HL: Writing – original draft, Writing – review & editing. KK: Writing – review & editing, Writing – original draft. JC: Writing – original draft, Writing – review & editing. MD: Writing – original draft, Writing – review & editing. TM: Writing – review & editing, Writing – original draft. SK: Writing – review & editing, Writing – original draft. SS: Writing – review & editing, Writing – original draft. RF: Writing – original draft, Conceptualization, Writing – review & editing, Methodology, Formal analysis.

Funding

The author(s) declare that financial support was received for the research and/or publication of this article. Funding for this manuscript was provided through the following grants: NCI/NIH 5UH3CA233314-03 (Rabin, Nodora); NIH/NCI UH3CA244298 (Kenzie, Coury, Davis); NIH/NCI 4UH3CA233282-02 (Oliveri); NCI/NIH 5UH3CA233282 (Kruse-Diehr); NCI/NIH 1 U24 CA233218-01 (Hoover, Subramanian), NIH/NCI P30 CA023074 (Menon); NIH/NCI P30CA225520 (Doescher); NIH/NCI P30CA118100-16S4 (Adsul, Mishra, English); 5UH3CA233229-04 (Lam, Kim); 1UG3CA233251 and 5UH3CA233251 (Malo and Ferrari).

Acknowledgments

The authors thank and gratefully acknowledge:

• RTI for editing the manuscript and coordinating the workgroup meeting;

• ACCSIS-San Diego program, including the participating community health centers, Health Quality Partners, and all members of the research team; [San Diego]

• ACCSIS-Oregon program clinic, health system, and community partners and all members of the research team; [Oregon]

• ACCSIS-Appalachia program including the participating community health centers, community partners, and all members of the research team; [Appalachia]

• Tribes and Tribal Healthcare facilities; [Arizona]

• Tribes and participating Indian Health Service, Tribal and urban Indian health care facilities; [Oklahoma]

• Tribes and Tribal health care facilities; [New Mexico]

• ACCSIS-Chicago program including the participating federally qualified health centers and all members of the research team; [Chicago]

• ACCSIS-North Carolina (SCORE) program health systems and clinic partners, community partners, and the research staff [North Carolina] for their exceptional work, contributions, and enthusiastic participation in this program.

Conflict of interest

KE was employed by Albuquerque Area Indian Health Board, Inc.

The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declare that no Generative AI was used in the creation of this manuscript.

Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/frhs.2025.1613925/full#supplementary-material

Abbreviations

ACCSIS, Accelerating Colorectal Cancer Screening and follow-up through Implementation Science; AI/AN, American Indian and Alaska native; CCM, configurational comparative methods; CRC, colorectal cancer; DECIPHeR, disparities elimination through coordinated interventions to prevent and control heart and lung disease; DSF, dynamic sustainability framework; ENICTO, exercise and nutrition interventions to improve cancer treatment-related outcomes; FQHC, federally qualified health center; FRAME, framework for reporting adaptations and modifications to evidence-based interventions; IMPACT, improving the management of symPtoms during And following cancer treatment; iPARIHS, integrated promoting action on research implementation in health services; LISTS, longitudinal implementation strategies tracking system; QCA, qualitative comparative analysis; QUAL, qualitative data; QUAN, quantitative data; QUERI, quality enhancement research initiative; RE-AIM, reach, effectiveness, adoption, implementation, maintenance framework; RP, research program; SCORE, scaling colorectal cancer screening through outreach, referral, and engagement.

References

1. Rabin BA, Viglione C, Brownson RC. Terminology for dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Research in Health. 3rd ed. New York, NY: Oxford University Press (2023). p. 27–65.

2. Bauman AA, Stirman SW, Cabassa LJ. Adaptation in dissemination and implementation science. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Science in Health. 3rd ed. New York: Oxford (2023). p. 172–91.

3. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. (2013) 8:117. doi: 10.1186/1748-5908-8-117

4. Moore JE, Bumbarger BK, Cooper BR. Examining adaptations of evidence-based programs in natural contexts. J Prim Prev. (2013) 34(3):147–61. doi: 10.1007/s10935-013-0303-6

5. Copeland L, Littlecott H, Couturiaux D, Hoddinott P, Segrott J, Murphy S, et al. The what, why and when of adapting interventions for new contexts: a qualitative study of researchers, funders, journal editors and practitioners’ understandings. PLoS One. (2021) 16(7):e0254020. doi: 10.1371/journal.pone.0254020

6. Hickam D, Totten A, Berg A, Rader K, Goodman S, Newhouse R. The PCORI Methodology Report. Washington: PCORI (2013).

7. Frontiers in Public Health. Understanding, assessing, and guiding adaptations in public health and health systems interventions: current and future directions. (2023). Available online at: https://www.frontiersin.org/research-topics/29317/understanding-assessing-and-guiding-adaptations-in-public-health-and-health-systems-interventions-current-and-future-directions (Accessed June 28, 2024).

8. Rabin BA, Cain KL, Glasgow RE. Adapting public health and health services interventions in diverse, real-world settings: documentation and iterative guidance of adaptations. Annu Rev Public Health. (2025) 46(1):111–31. doi: 10.1146/annurev-publhealth-071321-04165

9. Escoffery C, Lebow-Skelley E, Udelson H, Boing EA, Wood R, Fernandez ME, et al. A scoping study of frameworks for adapting public health evidence-based interventions. Transl Behav Med. (2019) 9(1):1–10. doi: 10.1093/tbm/ibx067

10. Moore G, Campbell M, Copeland L, Craig P, Movsisyan A, Hoddinott P, et al. Adapting interventions to new contexts-the ADAPT guidance. Br Med J. (2021) 374:n1679. doi: 10.1136/bmj.n1679

11. Wiltsey Stirman S, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. (2019) 14(1):58. doi: 10.1186/s13012-019-0898-y

12. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for reporting implementation studies (StaRI) statement. Br Med J. (2017) 356:i6795. doi: 10.1136/bmj.i6795

13. Rabin BA, McCreight M, Battaglia C, Ayele R, Burke RE, Hess PL, et al. Systematic, multimethod assessment of adaptations across four diverse health systems interventions. Front Public Health. (2018) 6:102. doi: 10.3389/fpubh.2018.00102

14. National Cancer Institute. Cancer Moonshot℠. (n.d). Available online at: https://www.cancer.gov/research/key-initiatives/moonshot-cancer-initiative (Accessed June 28, 2024).

15. ACCSIS. Accelerating colorectal cancer screening and follow-up through implementation science. (n.d). Available online at: https://accsis.rti.org/ (Accessed June 28, 2024).

16. National Institutes of Health, National Cancer Institute, Division of Cancer Control & Population Sciences. Accelerating Colorectal Cancer Screening and follow-up through Implementation Science (ACCSIS). (2024). Available online at: https://healthcaredelivery.cancer.gov/accsis/ (Accessed April 19).

17. McCarthy MS, Ujano-De Motta LL, Nunnery MA, Gilmartin H, Kelley L, Wills A, et al. Understanding adaptations in the veteran health Administration’s transitions nurse program: refining methodology and pragmatic implications for scale-up. Implement Sci. (2021) 16(1):71. doi: 10.1186/s13012-021-01126-y

18. McCreight M, Rohs C, Lee M, Sjoberg H, Ayele R, Battaglia C, et al. Using a longitudinal multi-method approach to document, assess, and understand adaptations in the veterans health administration advanced care coordination program. Front Health Serv. (2022) 2:970409. doi: 10.3389/frhs.2022.970409

19. Card JJ, Solomon J, Cunningham SD. How to adapt effective programs for use in new contexts. Health Promot Pract. (2011) 12(1):25–35. doi: 10.1177/1524839909348592

20. Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implement Sci. (2021) 16(1):36. doi: 10.1186/s13012-021-01105-3

21. Harvey G, Kitson A. PARIHS Revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. (2016) 11:33. doi: 10.1186/s13012-016-0398-2

22. Finley EP, Huynh AK, Farmer MM, Bean-Mayberry B, Moin T, Oishi SM, et al. Periodic reflections: a method of guided discussions for documenting implementation phenomena. BMC Med Res Methodol. (2018) 18(1):153. doi: 10.1186/s12874-018-0610-y

23. Kenzie ES, Patzel M, Nelson E, Lovejoy T, Ono S, Davis MM. Long drives and red tape: mapping rural veteran access to primary care using causal-loop diagramming. BMC Health Serv Res. (2022) 22(1):1075. doi: 10.1186/s12913-022-08318-2

24. Birken SA, Nilsen P, Cragun D. Configurational comparative methods. In: Nilsen P, Birken SA, editors. Handbook on Implementation Science. Cheltenham: Edward Elgar Publishing (2020). p. 569.

25. Perez Jolles M, Lengnick-Hall R, Mittman BS. Core functions and forms of complex health interventions: a patient-centered medical home illustration. J Gen Intern Med. (2019) 34(6):1032–8. doi: 10.1007/s11606-018-4818-7

26. Terrana A, Viglione C, Rhee K, Rabin B, Godino J, Aarons GA, et al. The core functions and forms paradigm throughout EPIS: designing and implementing an evidence-based practice with function fidelity. Front Health Serv. (2023) 3(3):1281690. doi: 10.3389/frhs.2023.1281690

27. Cohen DJ, Crabtree BF, Etz RS, Balasubramanian BA, Donahue KE, Leviton LC, et al. Fidelity versus flexibility: translating evidence-based research into practice. Am J Prev Med. (2008) 35(5):S381–9. doi: 10.1016/j.amepre.2008.08.005

28. Smith JD, Norton WE, Mitchell SA, Cronin C, Hassett MJ, Ridgeway JL, et al. The longitudinal implementation strategy tracking system (LISTS): feasibility, usability, and pilot testing of a novel method. Implement Sci Commun. (2023) 4(1):153. doi: 10.1186/s43058-023-00529-w

29. Kho A, Daumit GL, Truesdale KP, Brown A, Kilbourne AM, Ladapo J, et al. The national heart lung and blood institute disparities elimination through coordinated interventions to prevent and control heart and lung disease alliance. Health Serv Res. (2022) 57 Suppl 1(1):20–31. doi: 10.1111/1475-6773.13983

30. DECIPHeR. Welcome to the DECIPHeR Alliance. (n.d). Available online at: https://decipheralliance.org/ (Accessed June 28, 2024).

31. The ENICTO Consortium. Welcome to ENICTO. (n.d.). Available online at: https://enicto.bsc.gwu.edu/web/enicto (Accessed June 28, 2024).

32. Perna F, Agurs-Collins T. Exercise and Nutrition to Improve Cancer Treatment Outcomes (ENICTO): a place for implementation science in efficacy trials? National Institutes of Health National Cancer Institute: Division of Cancer Control & Population Sciences (2024). Available online at: https://cancercontrol.cancer.gov/is/blog/exercise-and-nutrition-to-improve-cancer-treatment-outcomes-a-place-forimplementation-science-in-efficacy-trials (Accessed June 28, 2024).

33. Schmitz K, Brown J, Irwin M, Robien K, Scott J, Berger N, et al. Exercise and nutrition to improve cancer treatment-related outcomes: the ENICTO consortium. J Natl Cancer Inst. (2025) 117(1):9–19. doi: 10.1093/jnci/djae177

34. Aschbrenner KA, Rabin BA, Bartels SJ, Glasgow RE. Methodological recommendations for assessing the impact of adaptations on outcomes in implementation research. Implement Sci. (2025) 20(1):30. doi: 10.1186/s13012-025-014

Keywords: adaptation, implementation science, implementation strategies, mixed methods, colorectal cancer screening, data harmonization

Citation: Rabin BA, Kenzie ES, Oliveri JM, Kruse-Diehr AJ, Hoover S, Menon U, Doescher MP, Adsul P, Mishra SI, English K, Nodora J, Lam H, Kim K, Coury JK, Davis MM, Malo T, Kobrin S, Subramanian S and Ferrari RM (2025) Documenting adaptations across the Accelerating Colorectal Cancer Screening and follow-up through Implementation Science research programs: methods and adaptation examples. Front. Health Serv. 5:1613925. doi: 10.3389/frhs.2025.1613925

Received: 18 April 2025; Accepted: 5 September 2025;
Published: 26 September 2025.

Edited by:

Ucheoma Catherine Nwaozuru, Wake Forest University, United States

Reviewed by:

Thembekile Shato, Washington University in St. Louis, United States
Snehil Kumar Singh, UNICEF United Nations International Children’s Emergency Fund, United States

Copyright: © 2025 Rabin, Kenzie, Oliveri, Kruse-Diehr, Hoover, Menon, Doescher, Adsul, Mishra, English, Nodora, Lam, Kim, Coury, Davis, Malo, Kobrin, Subramanian and Ferrari. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Borsika A. Rabin, barabin@health.ucsd.edu