OPINION article
Front. Health Serv.
Sec. Implementation Science
Volume 5 - 2025 | doi: 10.3389/frhs.2025.1704368
Does the "17-year gap" tell the right story about implementation science?
Provisionally accepted
1 University of Alberta, Edmonton, Canada
2 University of Calgary, Calgary, Canada
The challenges of implementing and sustaining evidence-based organizational and system changes have been well documented. Over the past three decades, the field of implementation science (IS) - the scientific study of methods and strategies that facilitate the uptake of evidence-based practice and research into regular use by practitioners and policymakers - has emerged to support this work (1). IS is now a rich field with specialized journals that publish increasingly sophisticated, multi-disciplinary research (2). Scholars have developed theories, models and frameworks (TMFs) based on insights from multiple disciplines to address the complex range of factors affecting the uptake of interventions. The value of these TMFs has been recognized in implementation planning related to pressing issues such as adapting to the health impacts of climate change, eradicating polio and addressing global health inequity (3)(4)(5).

The three authors of this commentary are IS researchers and practitioners. In these roles, we observe IS applied across a wide range of public health and healthcare implementation processes in our home province of Alberta, Canada. We have noticed that funders, research teams and health-system teams often justify the importance of IS-based planning by stating that there is a "17-year gap" between the generation of research evidence and its application in practice and policy (see, for example, (6)). This figure originates from a 2000 article by E. A. Balas and S. A. Boren (7), and the concept of the gap is pervasive in the IS literature. While writing this commentary, we checked PubMed and found that the Balas and Boren article had been cited 2,237 times as of April 2025. We retrieved the 135 English-language articles published in 2024 and early 2025 that cited this paper and classified them according to how the article was cited. Over half (n=77) of the author teams cited the article with specific reference to the 17-year gap, using language such as: the lag time for adoption of new evidence-based treatments is "currently estimated to be approximately 17 years" (8); "It takes about 17 years for research evidence to get to clinical practice" (9); or "Successful translation from scientific discoveries to implementation in clinical practice and public health takes on average 17 years" (10). A further 28 author teams cited the paper in support of the premise that implementation takes an excessively long time, without explicitly citing the 17-year figure.

The often-unstated assumptions behind quoting this article are, first, that 17 years is too long a timeframe for realizing the benefits of research and, second, that "putting evidence into practice" is a straightforward concept requiring no more nuanced conceptualization. In this commentary, we argue that incorporating references to the 17-year gap in articles and presentations, rather than illuminating the need for IS, actually obscures the present state of the field and the challenges of co-creating and sustaining change in complex systems (11).

Arguments

Argument 1: The citation doesn't support the 17-year figure.

The Balas and Boren article cited in support of the 17-year gap actually presents a much more nuanced picture of the time it takes to move research into practice - and of how moving research into practice is defined and measured.
The authors first summarize several studies of the time taken to move evidence into practice; the average was 17 years, but the figures in the included studies varied by clinical specialty and by how "moving evidence into practice" was defined. The second half of the paper presents the authors' own analysis of the time required across nine clinical disciplines to achieve a 50% rate of clinical use of findings from a landmark study. They found an average annual increase in use of 3.2 percent across the nine clinical areas, giving an average of 15.6 years from publication of the landmark study to a 50% utilization rate; this was broken down into 6.3 years for evidence to reach reviews, papers and textbooks and a further 9.3 years to implement the findings in practice. In short, what this paper supports is not the blanket statement "we know it takes 17 years" but a much more nuanced, context-sensitive picture in which the time required to move evidence into practice varies according to clinical specialty, implementation fidelity, local context and the chosen benchmark.

Argument 2: Twenty-five-year-old evidence is not relevant in today's world.

British novelist L.P. Hartley famously wrote, "The past is a foreign country; they do things differently there" (12). Implementation scientists think long and hard about the importance of context, and the context of 2025, in terms of the factors influencing diffusion and implementation, is vastly different from that of the year 2000. When Balas and Boren's article was published, the first iPhone had yet to come on the market, Netflix still made its money snail-mailing DVDs to customers, and social media as we currently know it, with its power to spread messages far and wide, didn't exist - Facebook was launched in 2004 and Twitter in 2006. Jonathan Lomas' landmark editorial on what was then called knowledge transfer had only just been published (13), the concept of health-related knowledge brokerage was in its infancy, and important implementation science frameworks such as the Consolidated Framework for Implementation Research (CFIR) and the Exploration, Preparation, Implementation and Sustainment framework were years from publication (2009 and 2011 respectively). In 2025, implementers have dozens of IS TMFs to choose from (14). Furthermore, we are benefiting from the rise of numerous implementation support structures and specialists. Two Canadian examples, both funded by the Canadian Institutes of Health Research, are the Health System Impact Program, which provides embedded research opportunities for PhD students, postdoctoral fellows and early-career researchers, and the provincial and territorial SPOR SUPPORT Units, which provide local decision-makers and health-system staff with supports for learning health systems, such as improved data access and implementation science expertise. Many countries are supporting initiatives such as partnerships between academic and healthcare organizations, embedded researcher positions, and intermediary organizations and programs, all of which create preconditions for accelerating implementation.

Argument 3: We haven't established the optimal pace of implementation.

If 17 years is too long to implement change, what is the right timeframe? Somewhat surprisingly, until recently there has been little research on the optimal pace of implementation and the factors that affect its speed, including changing political environments, organizational readiness, and the capacity of systems (15).
Intervention characteristics and implementation context matter. Speedy implementation is more likely for low-risk interventions supported by strong evidence, and/or in contexts with high readiness where important relational work has been done (15). Key components such as building trust and credibility between partners, attending to health equity, and identifying local champions - work that helps sustain interventions and programs in practice - all take time (16). In addition, "strategic delay" of implementation may sometimes allow space for clarification and necessary adaptations that improve quality and sustainment in the long run (17). And faster may not always be better; while the COVID-19 pandemic saw accelerated production and uptake of new evidence, it also highlighted the dangers of implementing unproven cures, such as ivermectin, without a mature evidence base. The IS field will benefit from more work on the challenges and uncertainties associated with accelerating implementation efforts, with attention paid to the fact that implementation and sustainment are often slower in disadvantaged, security-challenged or mistrustful communities, or in settings with under-resourced health systems (15,18).

Argument 4: We have a much more interesting story to tell now.

In a resource-constrained world with pressing social ills, every failed implementation is a missed opportunity to benefit people and communities. Implementation science grew out of the recognition that successful implementation and sustainment are hard work. There is no straight line between designing a policy or program and seeing it taken up by the people, organizations and systems that will make the necessary changes happen. As Rapport and colleagues note, "what appears to be a linear process is contested, challenging, tortuous, and political - governed more by the laws of complexity and chaos than those inherent in straight-line, formulaic models" (19). With this understanding, justifying our work with statements about 17 years to "move evidence into practice" masks the complexity of the research-practice ecosystem.

We have presented our case for why IS researchers and practitioners may wish to reconsider citing the 17-year gap as a justification for the field. The article cited in support of this timeframe is now 25 years old and reflects a world that no longer exists. Supporting the adoption of evidence-based interventions is not a linear process: what accelerates or inhibits implementation of any given initiative depends on a wide range of contextual factors within complex research-practice ecosystems. IS theories, models, and frameworks largely originated in affluent Western settings; in recent years, however, innovative scholarship has pushed the field to be more reflective of global health. To take just two of many possible examples, Harding et al. have developed the He Pikinga Waiora implementation framework to support the analysis of implementation effectiveness in Indigenous communities in Aotearoa New Zealand (20), and Means et al. have suggested a modification of CFIR - adding a domain of characteristics of systems - to better reflect the decentralized nature of health systems in many low- and middle-income countries (LMICs) (21). Given this evolution, citing the supposed 17-year delay between the generation and application of evidence, based on work conducted many years ago, no longer reflects the complexity, nuance, or evolution of our field.
We stated earlier that authors cite the 17-year figure as a pithy and readily understood justification for IS - a statement easily incorporated into articles, reports and presentations. Readers of this article, convinced (we hope) by our argument, may justifiably wonder whether we have an alternative to propose. We suggest an approach that underscores the potential of IS to improve people's lives in all parts of the globe: quoting Article 27 of the United Nations' Universal Declaration of Human Rights, which states that "Everyone has the right…to share in scientific advancement and its benefits" (22). We read this not only as a call for equitable access to the benefits of science, but also as a call for more equitable recognition of the voices and knowledges that constitute science around the world. In other words, the ultimate aim of implementation is not speed alone, but ensuring that scientific knowledge is mobilized to advance health and wellbeing for all. Meeting the imperative that all people benefit from research requires careful design of implementation plans that include strategies and mechanisms robust enough to support the promise of data-informed learning health systems (23).

If, after twenty-five years, the 17-year figure is still routinely cited, then we must ask ourselves whether implementation science is truly bridging the evidence-practice gap or simply re-inscribing it. We call on scholars, practitioners, and educators to reconsider the 17-year trope and instead illuminate the real and diverse challenges of implementation - challenges that require thoughtful, context-sensitive, and time-responsive approaches, drawing on current scholarship that reflects global perspectives.
Keywords: implementation, implementation science, Implementation practice, speed, Evidence-Based Practice
Received: 12 Sep 2025; Accepted: 16 Oct 2025.
Copyright: © 2025 Thomson, Zimmermann and Montesanti. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Denise Thomson, dthomson@ualberta.ca
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.