Abstract
Background:
Adaptive implementation strategies tailor support to setting needs rather than applying a uniform approach. These strategies improve efficiency and fit, yet practical guidance on identifying decision points and tailoring variables is limited. This study collected end-user and partner input to specify decision points and tailoring variables for an adaptive implementation strategy.
Methods:
This study focused on the evidence-based nutrition program, Together, We Inspire Smart Eating (WISE). End users and implementation partners with prior experience in WISE were recruited in two states to participate in semi-structured interviews or focus groups designed to elicit feedback to specify an adaptive implementation strategy for WISE.
Results:
Qualitative input supported three crucial decisions for an adaptive implementation strategy: (1) low-intensity support, the starting point for all sites, will include leadership commitments, local champions, an implementation blueprint, classroom reminders, and task-focused facilitation at the site level; (2) assessment of response to low-intensity support will occur in October (Month 3) of the school year; and (3) sites not responding by Month 3 will receive holistic facilitation and tailored educational materials at the teacher level. Participants emphasized the universal need for facilitation at all sites, with struggling sites requiring more. They also identified tailoring variables: sites with fewer than 60% of classrooms achieving fidelity would require high-intensity support.
Conclusions:
This study illustrates a process for using feedback from end users and partners to define key elements of an adaptive implementation strategy. Our approach holds significant potential for specifying strategies to scale evidence-based health programs.
Introduction
Implementation science aims to close the gap between evidence generation and evidence use in everyday life (1). Adaptive implementation designs help determine “who needs what” when it comes to implementation support to realize that goal (2). These designs test adaptive implementation strategies that increase or tailor implementation support in a systematic, predefined fashion and support the broader implementation science goal of optimizing strategies to fit diverse contexts and needs (3, 4). In doing so, they reflect that a uniform approach rarely fits all settings (5). Further, providing more support than a setting requires can be costly and, through inefficient allocation of resources, can limit the total reach and penetration of an implementation effort.
To combat the limitations of a “one-size-fits-all” approach that tends to be reflected in early implementation science studies (6), adaptive implementation strategies use crucial decision points and tailoring variables to make the most of available resources. Adaptive implementation strategies allow support to be matched with readiness and responsiveness. Specifically, crucial decision points specify when and how implementation support should be adjusted, while the tailoring variable is the key indicator used to guide those adjustments (2, 3). Through adaptive implementation trials, these elements are tested and refined to create reproducible guides for determining who receives what support, and when, in real-world practice.
To date, little guidance exists on who should make these crucial decisions for adaptive implementation strategies and how these decisions should be made. This lack of practical guidance limits researchers' and implementers' ability to design adaptive strategies that are both justified and replicable. In preparing for our enhanced non-responder trial on obesity prevention practices in early care and education settings (7), our team recognized this gap and engaged in a structured process to incorporate end-user and partner input in defining the key elements of the adaptive strategy. An enhanced non-responder trial is a type of adaptive implementation design that begins with an initial, standard implementation strategy for all sites and then provides additional or intensified support only to those sites that do not respond adequately to the initial strategy (3). Guided by this framework, we worked collaboratively to specify: (1) what level of initial (“low-intensity”) support all sites would receive, (2) how and when a response to those strategies would be assessed, (3) what additional strategies would be offered to non-responders, and (4) what tailoring variable would be used to guide those adaptations (4). This Brief Research Report describes this collaborative process and how end-user and partner feedback informed the crucial decision points and tailoring variables of the adaptive implementation strategy tested in the trial. This work advances implementation science by providing transparent, replicable guidance on how crucial tailoring and adaptation decisions are made, which is an essential step toward greater efficiency and contextual fit in adaptive implementation design.
Methods
The evidence-based innovation
Together, We Inspire Smart Eating (WISE) is an evidence-based intervention designed for early care and education (ECE) settings serving children impacted by poverty. WISE was co-designed with end users through an iterative process involving focus groups, pilot testing, and systematic refinement based on parent and teacher feedback. WISE promotes evidence-based practices (EBPs) in nutrition with the goal of improving children's diets and preventing obesity and obesity-related disease (e.g., cancer, diabetes, heart disease) (8, 9). Evidence supports benefits associated with WISE both overall and for each of its core EBPs: (1) multiple hands-on exposures to fruits and vegetables to support food acceptance (10–16), (2) role modeling by educators to allow children to observe a trusted adult interacting with novel foods (17–19), (3) positive feeding practices to support children's self-regulation and autonomy (19–21), and (4) use of a mascot to provide a familiar character linked with fruits and vegetables (22–27). Compared to usual educational practices, WISE increased fruit and vegetable intake at home for children, according to parent reports (28), as well as objective measures of carotenoid intake (29).
Study design
To explore perceptions of WISE implementation, we drew on data from ECE educators (N = 82) who participated in implementations of WISE in two prior studies across two states; six sites were included in the potential participant pool. In both studies, an enhanced implementation approach (“high intensity”) was used, which is described fully in Swindle et al. (49). Survey data were used to identify educators with high (top quartile) and low (bottom quartile) ratings of WISE feasibility, which informed purposive sampling for semi-structured interviews. We selected feasibility as the target outcome for sampling to capture perceptions of the innovation, recognizing that fidelity can remain high even when attitudes toward implementation are neutral or negative, which may undermine long-term sustainment (30). To broaden perspective, we also interviewed center directors (N = 4, 80% of those eligible) and food service staff (N = 8, 75% of those eligible) across the six sites, creating a multi-stakeholder view of implementation. The study was reviewed and approved by the Institutional Review Board (IRB) at the University of Arkansas for Medical Sciences. Consent for interviews was provided verbally in accordance with IRB-approved procedures.
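The quartile-based purposive sampling described above can be expressed as a minimal sketch. The feasibility scores, identifiers, and function name below are hypothetical illustrations, not the study's actual data or software, and the specific quantile method is an assumption.

```python
# Hypothetical sketch of quartile-based purposive sampling: identify
# educators in the top and bottom quartiles of WISE feasibility ratings.
import statistics

def quartile_pool(scores):
    """Return (bottom-quartile, top-quartile) educator IDs by feasibility."""
    values = sorted(scores.values())
    # quantiles() with n=4 yields the three quartile cut points (Q1, Q2, Q3)
    q1, _, q3 = statistics.quantiles(values, n=4)
    low = [e for e, s in scores.items() if s <= q1]   # low-feasibility group
    high = [e for e, s in scores.items() if s >= q3]  # high-feasibility group
    return low, high

# Illustrative feasibility ratings (invented values on a 1-5 scale)
scores = {"ed01": 2.1, "ed02": 4.8, "ed03": 3.5, "ed04": 1.9,
          "ed05": 4.6, "ed06": 3.0, "ed07": 2.4, "ed08": 4.1}
low, high = quartile_pool(scores)
```

Sampling from both extremes of the feasibility distribution, as here, is what allows interviews to capture both favorable and unfavorable perceptions of the innovation.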
Measures
The integrated-Promoting Action on Research Implementation in Health Services (i-PARiHS) framework guided the development of the interview guide. The i-PARiHS framework posits that core components of successful implementation include innovation (the EBPs), recipients (individual and collective), context (inner and outer), and facilitation (31). In line with these domains, questions about context explored how staff recognized peers who were excelling or struggling with WISE and how the implementation team identified centers that needed additional support. Questions about innovation focused on perceptions of the value and cost-effectiveness of possible implementation strategies. Questions about the recipients examined which WISE resources were used most and why by each end-user group. Finally, questions about facilitation asked participants to prioritize the most crucial implementation supports if only a few could be provided. According to i-PARiHS, successful implementation takes place when facilitation promotes the acceptance and use of an innovation based on the recipients' needs and the nature of the context. Interviews explored how these elements influenced WISE delivery. See example questions in Table 1. After the interview, the interviewer provided a summary of the interview to the participant with the invitation to correct any inaccurate conclusions, provide additional detail, and clarify any comments, i.e., participant validation (32) and member checking (33). All the participants were paid a $50 check in accordance with IRB-approved procedures. Data collection continued until saturation was achieved, defined by consensus among the coders as the point when no new ideas emerged from additional interviews (34).
Table 1
| i-PARiHS construct | Sample interview questions |
|---|---|
| Context | How long was it after WISE training before you knew which of your peers liked WISE? And which were struggling? How would the WISE team know who at your center needed our help? |
| Innovation | I would like to share the cost per classroom of each of the various implementation strategies we used to support you and your centers in WISE. Which seems like a good value? Which are too costly for the value you received? |
| Recipients | Which of the resources provided to you to support WISE implementation did you use most and why? |
| Facilitation | If we could have only given you three implementation supports, what would have been most crucial and why? |
Sample interview questions by i-PARiHS construct.
Participants
In total, we interviewed 11 teachers and 4 administrators and conducted two focus groups with 8 food service staff. All participants were female, and 83% were Black. Of the six sites, two were located in a town with a population of 22,330; the remainder were in a city with a population of 204,774. Further, five of the six sites were Head Start locations serving children impacted by poverty; one site was a public Pre-K serving children across a variety of income levels. Participants were evenly split across the two studies and the states from which they were drawn. This range of participants allowed us to examine perspectives from multiple roles and organizational levels involved in WISE implementation.
Analyses
Interviews were transcribed verbatim using a third-party, Health Insurance Portability and Accountability Act (HIPAA)-compliant transcription service. We used a rapid, directed content analysis with a deductive lens (50), an approach well-suited to the focused objective of our interviews: identifying the most useful aspects of WISE implementation support for specifying the adaptive implementation strategy. Excel-based matrices were used to organize the data, as is common in rapid analytic approaches (35, 36). Coders (N = 3) met to establish a codebook with shared meaning and consensus on code application using four interviews. Afterward, coding was independent with regular consensus meetings to reduce bias, ensure reliability and validity, and expand the codebook as needed. The team then met to review the coded data, identify recurring patterns, and develop consensus on the alignment of codes with the deductive purpose (i.e., defining critical decisions and tailoring variables). Through iterative discussion and data comparisons, preliminary decisions were proposed and then confirmed against the full dataset with consideration of supporting and countering perspectives.
Results
Low-intensity support
Consistent with the conceptualization of an adaptive implementation strategy, input from participants suggested that a “light touch” of support would be sufficient for many classrooms and sites to succeed. These low-intensity strategies addressed the i-PARiHS constructs of context (i.e., a formal commitment from leadership, a local champion) and facilitation (task-focused monthly technical support) to provide practical problem-solving. The commitment from leadership was consistently nominated by participants among the most important strategies to keep, while the local champion was perceived to be a high-value strategy relative to its cost (see Table 1, Facilitation sample question). Incentives were infrequently nominated as a top strategy and were eliminated because participants perceived them as: (1) potentially undermining the relationship between the team and the end users and (2) encouraging extrinsic rather than intrinsic motivation.
Table 2 provides illustrative quotes that support the inclusion of strategies in the low-intensity, foundational package. The quotes show that leadership commitment inspired enthusiasm and collective purpose among teachers; the implementation blueprint offered clarity about expectations; and the local champion served as a trusted liaison who kept communication flowing between the WISE team and centers. Participants also described how visual cues, such as the cutting board, helped keep lessons salient in daily routines and how task-focused facilitation provided timely, hands-on problem-solving when barriers arose. Together, these examples demonstrate that partners valued strategies that were practical, visible, and immediately useful. Participants suggested that these features supported recipients' engagement with the innovation.
Table 2
| Strategy | Partner input |
|---|---|
| Formal commitment from leadership | “And they (leadership) believed that it was making a difference in the lives of the children. So, if you see somebody that's really excited about what they're doing. The, you know, they're energized there. You know, just, ‘Hey, guys. We're gonna do this. We can do this to make the children better. We're going to make them happier. We're gonna make them healthier.’ That's all I needed.”—Teacher |
| Implementation blueprint | “Maybe the blueprint … them telling us exactly what they kind of expected of us and what they needed done …. helping them on the same page. That was where they told us, you know. They gave us the supplies and told us you know what they expect.”—Director |
| Local champion | “For the sites, I would say the number one probably helpful would be the champion because that was the person, the one what would take the information back and forth from the meetings to the sites. So, for the sites, I would say that probably having that champion be the liaison between the two was probably the most helpful.”—Administrator |
| | “The site champions … was most helpful. I'm very comfortable with my champion, and, I mean, it's really easy to get to her because she was just like, right there. So I'm like, rather than like emailing and not being able to have like an actual like face-to-face conversation. It's nice to have a conversation with the site champion and be like, ‘Hey, I don't know how to do this and this.’ … Or like getting ideas from her. You know her telling me what she's doing in the classroom, what's working for her. It was it's nice to have that like actually at the site.”—Teacher |
| Cutting board reminder | “The staff, our staff like to have visuals, and providing visuals works well with them. And to be honest, I have to have a visual, you know, to keep me on-on task.”—Administrator |
| Task-focused facilitation | “I think the thing that was the most helpful was the coaching along the way. Because I think it was real important that when we ran into obstacles, that, you know, we could come into you and say, ‘Hey, this isn't working. You know, we need some-some different ideas.’ I remember, we'd had an issue with some of our teachers wanting to go, like, directly by the book. And they got really upset when we were limited as to what we could order, and so they would get very upset if we couldn't get the exact things that they needed. And they didn't wanna, you know, kinda change things up. I remember we came to you and said, ‘Can you help us?’ And so you guys were always there as a resource, you know.”—Director |
| | “And then, like I said, anytime we had an issue, I mean, it was—didn't take—it was the same day, we would try to have it resolved. You know, when we'd call and say, ‘Help,’ you know, you guys were right there to help us.”—Food Service Staff |
Salient quotes for partner input on low-intensity strategy.
Timing of response assessment
In addition to establishing the foundational package, input from the participants informed the timing of the assessment of response to the low-intensity support, as well as the tailoring variable used to determine response. For the timing of the assessment, suggestions ranged from 0.5 to 4 months into the school year, with most participants stating that October (approximately 2.5 months into the school year) would allow a good sense of which classrooms needed additional support. A center director stated:
“At first it was bumpy because they weren’t getting the materials that they needed…so there was a lot of adjusting that we had to do. But then once you would see the projects happening, the-teachers facilitating it. So I’d say maybe about three months after we kinda got a hang of things.”
A teacher also reflected:
“But after a while, of getting used to it, then I heard some other teachers being like, ‘Okay, it’s actually not so bad. It just takes some time to ease into’. And I say a few months …. maybe, October, November. Because that way, that gives the teacher some time at the beginning of the year to get settled into a routine. But it’s not too late to where you've kind of ruined the whole concept of building up WISE in the machine. So I would think, probably, October, November.”
An administrator supported these perspectives, stating,
“You know, right off the bat, you know, I knew we were struggling a little bit. But, you know, I—that’s why I said I think probably by—within October or November, everybody had a good feel for it.”
Taken together, participants emphasized that assessment of response to the low-intensity support should occur after an initial adjustment period but early enough to identify classrooms that may benefit from additional assistance. Across roles, there was consistent agreement that October or November, approximately 2–3 months into the school year, represents an optimal window for determining whether classrooms are responding adequately or may need to transition to higher-intensity support. This timing reflects a sensitivity to context, recognizing that educators need an initial stabilization period within their organizational and classroom routines before meaningful assessment of implementation can occur.
Tailoring variable to assess response
Input on the tailoring variable emphasized the importance of seeing implementation firsthand. Specifically, participants suggested observing teachers’ behaviors during the lesson (use of inappropriate practice, use of lesson plan, willingness to welcome WISE staff and talk with them), teachers' attitudes during the lesson (e.g., excitement, enthusiasm, comfort), and children's behaviors (e.g., awareness of Windy, excitement based on lesson cues). A center director recommended:
“I think just from observing them actually do one of the lessons, you know, you can typically identify, you know, who’s comfortable, and who’s not, um, with certain topics. And so if someone’s not real comfortable … I would identify them maybe as needing some additional support, or if you had some that just kinda didn’t seem interested. And-and like I said, I think most of it, uh, would be from observation.”
Teachers provided a similar perspective on the approach to identifying who may need additional help:
“See if the teacher was like willing to talk about WISE. … if they were just like, not welcoming. I think that that teacher would need additional support. I'm looking around the room to see if anything WISE related was on the walls or just presented anywhere in the room. If you had conversations with kids and like, ‘What has Windy the the owl brought to you?’ The kids can talk about it.”
An agency administrator also described how witnessing activities firsthand assured her that the implementation of WISE was going well or helped to identify someone who may need extra support:
“Whatever the activity …. That you-you could just see it happen. Even with the WISE puppet, you would see it in there at circle time or group time being pulled out as well as during the activities. The children would be able to speak about it …. you know, whenever I'm out in the sites, I get to be able to see it … and if the teacher, you know, wasn't excited at least while they were introducing the-the activity to the-the children then that would bring some concern to me as well.”
Across roles, participants consistently emphasized that identifying which classrooms may need additional support (i.e., assessing response via a tailoring variable) required seeing implementation firsthand, rather than relying on self-report or documentation. Their feedback directly informed the decision to operationalize the tailoring variable through onsite fidelity observations (51) of the four WISE EBPs and to set a threshold under which a teacher would be deemed to require a shift from low-intensity to high-intensity strategies (i.e., scoring below fidelity for two or more practices). Observing lessons in person was viewed as the most valid and practical way to assess teacher engagement, comfort, and use of the WISE materials and EBPs.
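As a concrete illustration, the tailoring-variable rule described above (a teacher scoring below fidelity on two or more of the four WISE EBPs transitions to high-intensity support) can be sketched as follows. The data structure and names are hypothetical and do not reproduce the study's actual fidelity-observation instrument.

```python
# Hypothetical sketch of the tailoring-variable decision rule; the labels
# mirror the four WISE EBPs, but the identifiers are illustrative.
EBPS = ["hands_on_exposure", "role_modeling", "positive_feeding", "mascot_use"]

def needs_high_intensity(observation):
    """Flag a teacher for high-intensity support when an onsite fidelity
    observation scores them below fidelity on two or more EBPs.

    `observation` maps each EBP to True when fidelity was met.
    """
    below = sum(1 for ebp in EBPS if not observation.get(ebp, False))
    return below >= 2  # non-responder threshold from partner input

# Example: a teacher below fidelity on role modeling and positive feeding
# practices would transition from low- to high-intensity strategies.
obs = {"hands_on_exposure": True, "role_modeling": False,
       "positive_feeding": False, "mascot_use": True}
flagged = needs_high_intensity(obs)
```

The point of encoding the rule this explicitly is reproducibility: the same observation data always yields the same low- vs. high-intensity assignment, which is what makes the adaptive strategy replicable across sites.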
High intensity
The strategies in the high-intensity implementation package reflect input from participants on the best ways to support teachers who do not respond to the low-intensity support. That is, if the direct fidelity assessment suggested inadequate use of evidence-based practices, the addition of high-intensity support strategies would be warranted. These strategies include tailored educational materials and holistic, individualized facilitation. Tailored educational materials provided rationale and examples of how to enact WISE EBPs in the ECE classroom; educators received educational materials on the practices with which they struggled the most. Holistic facilitation differed from the task-focused facilitation of the low-intensity strategy by expanding to build rapport with teachers and to work directly with them on their practices in the classroom (rather than troubleshooting general barriers). This approach reflects the facilitation domain of i-PARiHS, emphasizing individualized coaching to enhance adoption of the innovation within the classroom context when recipient-level barriers are more complex.
Table 3 provides supportive quotes for the inclusion and specification of the high-intensity strategies. Teachers described the tailored materials as concrete reminders that helped them internalize WISE concepts and adapt the language for their classrooms, while administrators emphasized their usefulness in reinforcing priorities within centers. Across roles, participants noted that individualized facilitation became increasingly effective as trust developed; the coaching relationship evolved from initial hesitation to genuine collaboration. Teachers reported feeling encouraged and supported rather than evaluated, and administrators observed that this ongoing, responsive coaching helped maintain motivation and consistency in implementing WISE.
Table 3
| Strategy | Partner Input |
|---|---|
| Tailored educational materials | “Well, one thing that I can't say was challenging for me was making sure I didn't use the terminology and sentences and phrases that I was used to using that I grew up with. You know, I had, I had gotten a little better with it. But things will still come out, you know, like, ‘You need to you need to try this before you get this.’ …. We only have so much time to eat. So if you're going to eat this. I need you to eat it, you know. But then I had to just let them eat whatever they want to eat, whenever they want to eat it, however they want to eat it.”—Teacher |
| | “Visuals are always … I mean, we know that when there's something important that we want teachers to remember, we always have ‘em post those types of things on their, um, bulletin board.”—Director |
| Holistic, individualized facilitation | “I really think that once they got used to the idea of the coaching, it really helped them. But again, like I said, they had to kinda ease into that and understand that you weren't there to get ‘em in trouble or to say, ‘You're not doing this right,’ or anything like that. That you were really there for them, and to help them, you know, get this pulled together. So I think that was probably the most beneficial. At first, I don't know how receptive they were. And then I think as time went on, you know, they saw that this was fun. And the kids really enjoy it. And they kinda bought into it, I guess, as more—I'm sayin' as the year went on. So I think that probably, um, increased with all the coaching that you guys gave them.”—Administrator |
| | “It was still good to know that somebody appreciated what you were doing and that you know they encourage you, ‘Hey, that's a good thing you're doing.’ Sometimes as teachers we don't get that. And that really inspired me to do more, not only doing the WISE sessions, but in in the other areas in the classroom. It helps, you know … Even though it got easier for us to do it, they still never, you know, they never backed up. They were always still there. ‘Can I help you with this; can I help you with that? Do you have questions?’ It was always, ‘Do you have a concern? How do you feel about what you're doing? Do we need to talk about this? Do you need any more materials?’ I mean, it never stopped. It was like from the beginning when we were so nervous and anxious about it until you know right on up until we didn't go back to school anymore.”—Teacher |
Salient quotes for partner input on high-intensity strategy.
All decisions about crucial decision points and tailoring variables are summarized in Table 4. This table consolidates how partner input was translated into the final adaptive design, showing the logic that guided each element of the implementation strategy. It highlights when responses should be assessed, how fidelity observations function as the tailoring variable, and which supports are triggered for classrooms needing additional assistance. Together, these decisions reflect a systematic approach that balances feasibility with responsiveness, ensuring that implementation support is both efficient for most sites and sufficiently intensive for those requiring more hands-on guidance. Table 4, therefore, provides a concise overview of the full adaptive strategy that emerged from end-user and partner input.
Table 4
| Design feature | Definition | Decision |
|---|---|---|
| Crucial decision points | Which strategies to begin the study with (i.e., low-intensity) | Formal commitment with leadership; implementation blueprint; local champion; cutting board with reminders of WISE EBPs; task-focused monthly facilitation |
| | How and when response is measured | October |
| | What strategies are given to non-responders (i.e., high-intensity) | Tailored educational materials; holistic, individualized facilitation |
| Tailoring variables | Measurement to identify non-responders and inform strategy intensity | Observed fidelity to WISE EBPs |
Adaptive implementation strategies: decisions on design features.
EBPs, evidence-based practices.
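The adaptive logic consolidated in Table 4 can be restated as a short sketch. Month indexing, strategy labels, and the non-responder threshold below are illustrative restatements of the decisions described above, not study software.

```python
# Illustrative sketch of the adaptive strategy in Table 4: every site starts
# with low-intensity support; response is assessed via observed fidelity in
# October (~Month 3 of the school year); non-responding teachers receive the
# added high-intensity supports.
LOW_INTENSITY = ["formal leadership commitment", "implementation blueprint",
                 "local champion", "cutting board EBP reminder",
                 "task-focused monthly facilitation"]
HIGH_INTENSITY_ADDITIONS = ["tailored educational materials",
                            "holistic, individualized facilitation"]
ASSESSMENT_MONTH = 3  # ~October, counting from the start of the school year

def support_package(school_month, practices_below_fidelity):
    """Return the strategies a teacher receives in a given school month."""
    if school_month < ASSESSMENT_MONTH:
        return LOW_INTENSITY                    # foundational package for all
    if practices_below_fidelity >= 2:           # non-responder threshold
        return LOW_INTENSITY + HIGH_INTENSITY_ADDITIONS
    return LOW_INTENSITY                        # responders continue as-is
```

Note that the low-intensity package is never withdrawn; high-intensity strategies are layered on top of it, mirroring the paper's framing of facilitation as the foundation of both conditions.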
Discussion
Adaptive implementation designs are increasingly used to evaluate implementation strategies that provide more resource-intensive support to systems and implementers that do not respond to a basic level of support (3). This trend reflects a broader movement in implementation science toward tailoring and resource optimization, matching strategy intensity to site needs rather than applying uniform support (37). Several decisions are key in designing adaptive trials and will have important implications for the use of the adaptive implementation strategy in the real world (2). Engaging implementation partners in defining the decision points of adaptive trials remains uncommon, particularly in early childhood contexts where implementation infrastructure is often limited (38, 39). Within implementation science, participatory and co-design approaches are increasingly recognized as key to improving the relevance, feasibility, and sustainability of strategies (40), yet their integration into adaptive designs remains limited. Prior studies in health and education settings have demonstrated that partner-driven adaptation can improve the feasibility and acceptability of implementation strategies (41, 42). Our findings extend this work by showing how partner input can be systematically embedded at multiple decision points to ensure adaptive implementation designs are contextually grounded and operationally feasible.
Specifically, our team used input from ECE partners with prior experience with our innovation and implementation strategies to define a tailoring variable, determine the timing of the measurement of response, and generate low- and high-intensity strategies. Low-intensity support was specified to support contextual-level logistics and leadership support for implementation, focusing on creating a supportive setting for implementation. High-intensity support added tailored, holistic implementation facilitation and education at the implementer level, adding an explicit focus on directly supporting the people doing the implementation. Input from ECE partners informed the decision to base the tailoring variable on classroom observations and to measure response 3 months into the school year. Our work illustrates that input from relevant implementation partners is a promising approach to the design of adaptive implementation strategies.
Both the low-intensity and high-intensity strategies in our adaptive implementation package have facilitation at their foundation. This reflects that the focus and targets of facilitation can vary (43). Task-oriented facilitation delivers technical and practical support (43). Accordingly, the task-focused facilitation in our low-intensity strategy targeted context-level barriers through monthly contact with leadership and champions. In contrast, the holistic facilitation in the high-intensity strategy was designed to encourage the development of shared understanding, interconnected networks, and personal growth (43). According to the i-PARiHS framework (44), which guided our study, effective implementation requires diverse facilitation strategies based on the nature of the innovation, the context, and the needs of the recipients. This aligns with a central tenet of implementation science: facilitation is a dynamic, context-dependent process, not a static intervention. Thus, although it is not yet common in implementation practice, facilitation as a strategy may need to differ in systematic and replicable ways to support implementation while optimizing resources. Our adaptive implementation strategy illustrates the clear specification of the targets of each type of facilitation. This distinction has important implications for scaling and sustainability. Task-focused facilitation may be more readily scaled, whereas holistic facilitation requires personnel with advanced relational and reflective skills, suggesting the need for differentiated training and resource allocation across contexts. Future efforts could test tiered models of facilitation to determine which elements can be delivered by internal staff vs. those that require external expertise, and which are most predictive of successful scale and sustainment (43, 45).
Building on these insights, we also recognize several limitations that should inform the next phase of research. One limitation is that ECE staff reported they did not feel comfortable or informed to weigh in on cost-related information, frequently citing that it was beyond their role (even center directors). Thus, input on the design of our adaptive strategy was not closely connected to a payer perspective. Future studies that engage end users in the design of adaptive implementation strategies may benefit from the inclusion of potential payers' input. In addition, the qualitative sample was small and drawn from individuals with prior experience implementing WISE and with the research team, which may have contributed to favorable perceptions. Despite these limitations, the inclusion of multiple perspectives (i.e., teachers, kitchen staff, center directors), each with a unique role in the implementation process, strengthened the validity of our findings.
Conclusions
This study advances adaptive implementation research by providing a systematic process for integrating end-user and organizational partner input at multiple decision points in the design of an adaptive strategy, which reflects the increasing recognition that one size does not fit all when it comes to implementation strategies (4, 46). By demonstrating how partner insights can inform the specification of timing, tailoring variables, and intensity of facilitation, this work offers a replicable model for developing adaptive implementation approaches in other community and educational settings. Community contexts broadly, and the ECE setting specifically, often operate under resource constraints where optimizing personnel and financial resources is a crucial requirement (46–48). Our findings illustrate how context-sensitive, partnership-driven adaptation can make implementation supports both resource-efficient and responsive. Our ongoing enhanced non-responder trial will test whether this co-designed adaptive strategy improves implementation outcomes and cost-effectiveness relative to static approaches (7). In so doing, this work contributes to implementation science's broader aim of developing and testing reproducible, contextually grounded methods for improving implementation support.
Statements
Data availability statement
The datasets presented in this article are not readily available because the data cannot be fully anonymized. Requests to access the datasets should be directed to Taren Massey-Swindle at tswindle@uams.edu.
Ethics statement
This study involving human participants was approved by the University of Arkansas for Medical Sciences. The study was conducted in accordance with local legislation and institutional requirements. The ethics committee/institutional review board waived the requirement of written informed consent for participation from the participants or the participants' legal guardians/next of kin because this was minimal-risk research and verbal consent was obtained.
Author contributions
TM-S: Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Writing – original draft. JR: Methodology, Supervision, Writing – review & editing. SJ: Methodology, Writing – review & editing. GC: Methodology, Writing – review & editing.
Funding
The author(s) declare that financial support was received for the research and/or publication of this article. Research reported in this publication was supported by the National Cancer Institute of the National Institutes of Health (NIH) under Award Number NIH NCI R37CA25113. Geoffrey M. Curran and Taren Massey-Swindle are supported by the Translational Research Institute (TRI), UL1TR003107, through the National Center for Advancing Translational Sciences of the NIH. Taren Massey-Swindle and Julie M. Rutledge are supported by the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), R01 DK138153, and the Vitamix Foundation. Taren Massey-Swindle is supported by NIH P20GM109096 and the Arkansas Biosciences Institute. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agencies.
Conflict of interest
TM-S and UAMS have a financial interest in the technology (WISE) discussed in this presentation/publication. These financial interests have been reviewed and approved in accordance with the UAMS conflict of interest policies.
The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The author(s) declare that Generative AI was used in the creation of this manuscript. Portions of the manuscript were revised with assistance from ChatGPT (OpenAI, San Francisco, CA), which supported the author in refining wording, improving organization, and enhancing clarity. The author reviewed and edited all AI-generated text to ensure accuracy, coherence, and alignment with the study's findings and interpretations.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Abbreviations
WISE, Together, We Inspire Smart Eating; ECE, early care and education; EBPs, evidence-based practices; i-PARiHS, integrated-Promoting Action on Research Implementation in Health Services.
References
1.
Bauer MS Kirchner J . Implementation science: what is it and why should I care? J Med Libr Assoc. (2024) 112(3):281. 10.1016/j.psychres.2019.04.025
2.
Kilbourne A Chinman M Rogal S Almirall D . Adaptive designs in implementation science and practice: their promise and the need for greater understanding and improved communication. Annu Rev Public Health. (2023) 45(1):69–88. 10.1146/ANNUREV-PUBLHEALTH-060222-014438
3.
Almirall D Nahum-Shani I Wang L Kasari C . “Experimental designs for research on adaptive interventions: singly and sequentially randomized trials”. In: Collins LM, Kugler KC, editors. Optimization of Behavioral, Biobehavioral, and Biomedical Interventions. Cham: Springer International Publishing (2018). p. 89–120. 10.1007/978-3-319-91776-4_4
4.
Powell BJ Beidas RS Lewis CC Aarons GA McMillen JC Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. (2017) 44(2):177–94. 10.1007/s11414-015-9475-6
5.
Almirall D Chronis-Tuscano A . Adaptive interventions in child and adolescent mental health. J Clin Child Adolesc Psychol. (2016) 45(4):383–95. 10.1080/15374416.2016.1152555
6.
Albers B Rapley T Nilsen P Clack L . Editorial: tailoring in implementation science. Front Health Serv. (2023) 3:1233597. 10.3389/FRHS.2023.1233597/BIBTEX
7.
Swindle T Rutledge JM Selig JP Painter J Zhang D Martin J, et al. Obesity prevention practices in early care and education settings: an adaptive implementation trial. Implement Sci. (2022) 17(1):1–16. 10.1186/S13012-021-01185-1/TABLES/1
8.
Whiteside-Mansell L Swindle TM . Together We Inspire Smart Eating: a preschool curriculum for obesity prevention in low-income families. J Nutr Educ Behav. (2017) 49(9):789–92.e1. 10.1016/j.jneb.2017.05.345
9.
Whiteside-Mansell L Swindle T Selig JP . Together, We Inspire Smart Eating (WISE): an examination of implementation of a WISE curriculum for obesity prevention in children 3 to 7 years. Glob Pediatr Health. (2019) 6:2333794X19869811. 10.1177/2333794X19869811
10.
Mustonen S Rantanen R Tuorila H . Effect of sensory education on school children’s food perception: a 2-year follow-up study. Food Qual Prefer. (2009) 20(3):230–40. 10.1016/j.foodqual.2008.10.003
11.
Anzman-Frasca S Savage JS Marini ME Fisher JO Birch LL . Repeated exposure and associative conditioning promote preschool children’s liking of vegetables. Appetite. (2012) 58(2):543–53. 10.1016/j.appet.2011.11.012
12.
Reverdy C Chesnel F Schlich P Köster EP Lange C . Effect of sensory education on willingness to taste novel food in children. Appetite. (2008) 51(1):156–65. 10.1016/j.appet.2008.01.010
13.
Knai C Pomerleau J Lock K McKee M . Getting children to eat more fruit and vegetables: a systematic review. Prev Med (Baltim). (2006) 42(2):85–95. 10.1016/j.ypmed.2005.11.012
14.
Wardle J Herrera ML Cooke L Gibson EL . Modifying children’s food preferences: the effects of exposure and reward on acceptance of an unfamiliar vegetable. Eur J Clin Nutr. (2003) 57(2):341–8. 10.1038/sj.ejcn.1601541
15.
Wardle J Chida Y Gibson EL Whitaker KL Steptoe A . Stress and adiposity: a meta-analysis of longitudinal studies. Obesity. (2011) 19(4):771–8. 10.1038/oby.2010.241
16.
Schindler JM Corbett D Forestell CA . Assessing the effect of food exposure on children’s identification and acceptance of fruit and vegetables. Eat Behav. (2013) 14(1):53–6. 10.1016/j.eatbeh.2012.10.013
17.
Hendy HM Raudenbush B . Effectiveness of teacher modeling to encourage food acceptance in preschool children. Appetite. (2000) 34(1):61–76. 10.1006/appe.1999.0286
18.
Hendy HM . Comparison of five teacher actions to encourage children’s new food acceptance. Ann Behav Med. (1999) 21(1):20–6. 10.1007/BF02895029
19.
Gibson EL Kreichauf S Wildgruber A Vögele C Summerbell CD Nixon C, et al. A narrative review of psychological and educational strategies applied to young children’s eating behaviours aimed at reducing obesity risk. Obes Rev. (2012) 13:85–95. 10.1111/j.1467-789X.2011.00939.x
20.
Galloway AT Fiorito LM Francis LA Birch LL . “Finish your soup”: counterproductive effects of pressuring children to eat on intake and affect. Appetite. (2006) 46(3):318–23. 10.1016/j.appet.2006.01.019
21.
Birch LL McPhee L Shoba BC Steinberg L Krehbiel R . “Clean up your plate”: effects of child feeding practices on the conditioning of meal size. Learn Motiv. (1987) 18(3):301–17. 10.1016/0023-9690(87)90017-8
22.
Borzekowski DL Robinson TN . The 30-second effect: an experiment revealing the impact of television commercials on food preferences of preschoolers. J Am Diet Assoc. (2001) 101(1):42–6. 10.1016/S0002-8223(01)00012-8
23.
Boyland E Harrold J Kirkham T Halford J . Persuasive techniques used in television advertisements to market foods to UK children. Appetite. (2012) 58(2):658–64. 10.1016/j.appet.2011.11.017
24.
Kraak VI Story M . Influence of food companies’ brand mascots and entertainment companies’ cartoon media characters on children’s diet and health: a systematic review and research needs. Obes Rev. (2015) 16(2):107–26. 10.1111/obr.12237
25.
Keller KL Kuilema LG Lee N Yoon J Mascaro B Combes AL, et al. The impact of food branding on children’s eating behavior and obesity. Physiol Behav. (2012) 106(3):379–8. 10.1016/j.physbeh.2012.03.011
26.
Roberto C Baik J Harris J Brownell K . Influence of licensed characters on children’s taste and snack preferences. Pediatrics. (2010) 126(1):88–93. 10.1542/peds.2009-3433
27.
Weber K Story M Harnack L . Internet food marketing strategies aimed at children and adolescents: a content analysis of food and beverage brand web sites. J Am Diet Assoc. (2006) 106(9):1463–6. 10.1016/j.jada.2006.06.014
28.
Whiteside-Mansell L Swindle TM . Evaluation of together we inspire smart eating: pre-school fruit and vegetable consumption. Health Educ Res. (2019) 34(1):62–71. 10.1093/her/cyy048
29.
Whiteside-Mansell L Swindle T Davenport K . Evaluation of “Together, We Inspire Smart Eating” (WISE) nutrition intervention for young children: assessment of fruit and vegetable consumption with parent reports and measurements of skin carotenoids as biomarkers. J Hunger Environ Nutr. (2019) 16(2):235–45. 10.1080/19320248.2019.1652127
30.
Swindle T Rutledge JM Martin J Curran GM . Implementation fidelity, attitudes, and influence: a novel approach to classifying implementer behavior. Implement Sci Commun. (2022) 3(1):1–14. 10.1186/S43058-022-00307-0
31.
Damschroder LJ . Clarity out of chaos: use of theory in implementation research. Psychiatry Res. (2019) 283:112461. 10.1016/J.PSYCHRES.2019.06.036
32.
Curry L Nembhard I Bradley E . Qualitative and mixed methods provide unique contributions to outcomes research. Circulation. (2009) 119:1442–52. 10.1161/CIRCULATIONAHA.107.742775
33.
Creswell JW Miller DL . Determining validity in qualitative inquiry. Theory Pract. (2000) 39(3):124–30. 10.1207/s15430421tip3903_2
34.
Saunders B Sim J Kingstone T Baker S Waterfield J Bartlam B, et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant. (2018) 52(4):1893–907. 10.1007/S11135-017-0574-8/TABLES/1
35.
Palinkas LA Zatzick D . Rapid assessment procedure informed clinical ethnography (RAPICE) in pragmatic clinical trials of mental health services implementation: methods and applied case study. Adm Policy Ment Health. (2018) 46(2):255–70. 10.1007/S10488-018-0909-3
36.
Gale RC Wu J Erhardt T Bounthavong M Reardon CM Damschroder LJ, et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implement Sci. (2019) 14(1):11. 10.1186/s13012-019-0853-y
37.
McHugh SM Riordan F Curran GM Lewis CC Wolfenden L Presseau J, et al. Conceptual tensions and practical trade-offs in tailoring implementation interventions. Front Health Serv. (2022) 2:974095. 10.3389/FRHS.2022.974095/BIBTEX
38.
Odom S . The tie that binds: evidence-based practice, implementation science, and outcomes for children. Topics Early Child Spec Educ. (2009) 29(1):53–61. 10.1177/0271121408329171
39.
Grady A Seward K Finch M Fielding A Stacey F Jones J, et al. Barriers and enablers to implementation of dietary guidelines in early childhood education centers in Australia: application of the theoretical domains framework. J Nutr Educ Behav. (2018) 50(3):229–37.e1. 10.1016/j.jneb.2017.09.023
40.
Markey K Macfarlane A Manning M . Time to re-envisage culturally responsive care: intersection of participatory health research and implementation science. J Adv Nurs. (2023) 79(11):4228–37. 10.1111/JAN.15821
41.
Metz A Bartley L . Active implementation frameworks for program success: how to use implementation science to improve outcomes for children. Zero Three. (2012) 32(4):11–8. Available online at: https://www.researchgate.net/profile/Leah-Bartley/publication/269949418_Active_Implementation_Frameworks_for_Successful_Service_Delivery_Catawba_County_Child_Wellbeing_Project/links/60e714110fbf460db8f23972/Active-Implementation-Frameworks-for-Successful-Service-Delivery-Catawba-County-Child-Wellbeing-Project.pdf
42.
Lyon AR Dopp AR Brewer SK Kientz JA Munson SA . Designing the future of children’s mental health services. Adm Policy Ment Health. (2020) 47(5):735–51. 10.1007/S10488-020-01038-X/TABLES/2
43.
Harvey G Kitson A . Facilitation as the active ingredient. In: Harvey G, Kitson A, editors. Implementing Evidence-Based Practice in Healthcare: A Facilitation Guide. Oxon: Routledge/Taylor & Francis (2015). p. 169–84. Available online at: https://books.google.com/books?hl=en&lr=&id=x96TBwAAQBAJ&oi=fnd&pg=PP1&dq=harvey+%26+kitson+2015&ots=TiainCwmlP&sig=omDrSguLC9lO- (Accessed December 10, 2025).
44.
Harvey G Kitson A . PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. (2016) 11(1):33. 10.1186/s13012-016-0398-2
45.
Ritchie MJ Parker LE Kirchner JE . From novice to expert: a qualitative study of implementation facilitation skills. Implement Sci Commun. (2020) 1(1):1–12. 10.1186/S43058-020-00006-8
46.
Kidwell KM Hyde LW . Adaptive interventions and SMART designs. Am J Eval. (2016) 37(3):344–63. 10.1177/1098214015617013
47.
Kilbourne AM Smith SN Choi SY Koschmann E Liebrecht C Rusch A, et al. Adaptive school-based implementation of CBT (ASIC): clustered-SMART for building an optimized adaptive implementation intervention to improve uptake of mental health interventions in schools. Implement Sci. (2018) 13(1):119. 10.1186/s13012-018-0808-8
48.
Kilbourne A Almirall D . Enhancing outreach for persons with serious mental illness: 12-month results from a cluster randomized trial of an adaptive implementation strategy. Implement Sci. (2014) 9(1):163. 10.1186/s13012-014-0163-3
49.
Swindle T McBride NM Selig JP Johnson SL Whiteside-Mansell L Martin J, et al. Stakeholder selected strategies for obesity prevention in childcare: results from a small-scale cluster randomized hybrid type III trial. Implement Sci. (2021) 16(1):48. 10.1186/s13012-021-01119-x
50.
Taylor B Henshall C Kenyon S Litchfield I Greenfield S . Can rapid approaches to qualitative analysis deliver timely, valid findings to clinical leaders? A mixed methods study comparing rapid and thematic analysis. BMJ Open. (2018) 8(10):e019993. 10.1136/bmjopen-2017-019993
51.
Swindle T Selig JP Rutledge JM Whiteside-Mansell L Curran G . Fidelity monitoring in complex interventions: a case study of the WISE intervention. Arch Public Health. (2018) 76(1):53. 10.1186/s13690-018-0292-2
Keywords
adaptive implementation, end-user feedback, community-engaged dissemination and implementation, early care and education, implementation science
Citation
Massey-Swindle T, Rutledge JM, Johnson SL and Curran GM (2025) Using end user feedback to specify an adaptive implementation strategy. Front. Health Serv. 5:1702190. doi: 10.3389/frhs.2025.1702190
Received
09 September 2025
Revised
14 November 2025
Accepted
14 November 2025
Published
17 December 2025
Volume
5 - 2025
Edited by
Sabi Redwood, University of Bristol, United Kingdom
Reviewed by
Elizabeth Chen, University of North Carolina at Chapel Hill, United States
Samuel Cumber, University of the Free State, South Africa
Copyright
© 2025 Massey-Swindle, Rutledge, Johnson and Curran.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Taren Massey-Swindle tswindle@uams.edu