PERSPECTIVE article

Front. Commun., 23 January 2020

Sec. Science and Environmental Communication

Volume 4 - 2019 | https://doi.org/10.3389/fcomm.2019.00078

Evidence-Based Science Communication

Eric A. Jensen (1) and Alexander Gerber (2)

  • 1. Department of Sociology, University of Warwick, Coventry, United Kingdom

  • 2. Department of Science Communication, Rhine-Waal University of Applied Sciences, Kleve, Germany


Abstract

Effective science communication can empower research and innovation systems to address global challenges and put public interests at the heart of how knowledge is produced, shared, and applied. For science communication to play this mediating role effectively, we propose a more integrated and “evidence-based” approach. This commentary identifies key issues facing the science communication field. It suggests a series of prescriptions, inspired by the impact of “evidence-based medicine” over the past decades. In practice, evidence-based science communication should combine professional expertise and skills with the best available evidence from systematic research. Steps required to achieve this outcome include more quality assurance in science communication research, significant changes in teaching and training, and improved interfaces between science communication research and practice.

At its best, science communication can empower research and innovation systems to address global challenges by improving relationships with stakeholders in policy, industry, and civil society (see "Quadruple Helix"; e.g., Carayannis and Campbell, 2009, and subsequent work). Science communication can put public interests at the heart of how knowledge is produced, shared, and applied today, thereby enhancing the benefits of science and technology and mitigating their limitations or risks. Moreover, effective science communication can facilitate the role of research and innovation in developing a more sustainable world. Therefore, it is imperative that science communication plays its mediating role effectively. This view of science communication's value inspires our call in this essay to open a dialogue about integrating science communication research and practice within a new vision for "evidence-based science communication."

It has now been decades since the notion of “evidence-based medicine” gained a foothold in scholarly discourse. In this commentary, we argue that the field of science communication faces challenges that would benefit from some of the prescriptions that evidence-based medicine offers, in particular, with the aim of helping research and practice take each other's experiences and insights fully into account. This evolution is essential to drive real progress in science communication as a field of practice.

Key Challenges

Science communication today is expected to go far beyond making scientific knowledge more accessible to lay audiences. For example, ambitious notions about science communication's potential role can be identified in the European policy prescription of "Responsible Research and Innovation" (RRI) and in efforts to include stakeholders earlier in technology assessment and regulatory processes to establish a more "social" innovation (Phills et al., 2008, p. 39ff). With the growing expectations placed on 21st-century science communication, it becomes increasingly important for the field to be more self-reflective and demonstrably effective. This commentary presents our view of these challenges across both science communication research and practice, based on our experience in this field.

Key challenges underpinning this commentary are identified in the first empirical gap analysis for the field of science communication research (Gerber et al., 2020, p. 61ff), in particular the following: (i) building a research corpus with effective transfer mechanisms, so that science communication practitioners can apply research in their work, and perhaps even investigate, in collaboration with scholars, the applicability of potentially useful strategies; and (ii) widening the spectrum of science communication research topics and methods, in particular by extending the existing methodological toolkit to include more longitudinal and experimental research. Experts contributing to a Delphi study in this field analysis emphasized that neither scholarship nor practice adequately takes account of the other side's priorities, needs, and possible solutions; this can be understood as a double disconnect between research and practice (Gerber et al., 2020, p. 4).

Both authors of this essay have worked for many years in science communication practice and research, and especially at the interface between the two domains. In this time, we have seen many challenges that trouble the research/practice interface in science communication (e.g., Fischhoff, 2013, p. 14038). Many of these challenges have been raised in one form or another in empirical studies of science communication research and practice (e.g., Holliman and Jensen, 2009; Gerber, 2014; Jamieson et al., 2017; Gerber et al., 2020). Ironically, the challenges begin with communication about science communication evidence (see Table 1). The framework suggested here, based on our experience, addresses four usually sequential steps of a "Knowledge Cascade": Relevance, Accessibility, Transferability, and Quality assurance.

Table 1. The science communication knowledge cascade: key challenges at the interfaces between research and practice (columns: Research | Challenge | Practice).

Challenge 1: Determining the relevance of evidence

Research:
  • Few scholarly publications in science communication either attempt or succeed in conveying clearly why and to whom the results matter in practice.

  • There are hardly any systematic reviews for specific topics/challenges within science communication to distill the best available evidence in a methodologically robust way.

  • Research would also benefit from more direct input from practice about challenges and needs.

Practice:
  • Most practitioners are neither aware of the existing science communication evidence, nor do they consider whatever research they know about to be relevant enough to be worth the investment of time in seeking more information.

  • Using evidence in science communication practice, for example, by integrating impact evaluation, requires reflexivity and a willingness to reconsider established practices in light of the best available evidence.

Once acknowledged…

Challenge 2: Making relevant evidence accessible

Research:
  • Publishing results behind journal paywalls and predominantly only in English disadvantages both researchers and practitioners (especially in non-English-speaking and low-income countries).

  • Open data and open methodology are still rarely applied in science communication research. Journal articles and evaluation reports lack relevant methodological details, such as measurement instruments and how analyses were conducted.

  • The multidisciplinary field suffers from inconsistent terminology, making literature reviews and identification of relevant evidence unnecessarily difficult.

Practice:
  • Knowledge is dispersed across hundreds of journals, many of which are closed access.

  • Developing an understanding of relevant evidence, and producing new evidence through evaluation, requires know-how that is often inadequately developed in science communication teaching/training for practitioners.

  • Time constraints and differing institutional priorities may require more top-down prescriptions, for instance making systematic impact assessment a funding requirement and providing standardized, methodologically sound, freely accessible evaluation tools.

Once accessed…

Challenge 3: Enhancing the transferability of accessible evidence

Research:
  • Few research funding schemes incentivize collaborative research between researchers and practitioners. More broadly, there are limited funding opportunities for transdisciplinary research to directly apply/test research results in practice.

  • Science communication research is often driven by academic concerns within the philosophy/sociology/history of science, and does not necessarily treat practical applicability as a priority.

Practice:
  • Even if aware and interested, practitioners are discouraged from exploring the world of science communication research any further by (social) scientific jargon.

  • Even if the research clearly specifies practice implications, practitioners are largely left alone to implement those findings and to understand their relative importance for their work.

Once transferred…

Challenge 4: Relying on quality-assured transferable knowledge

Research:
  • Because many science communication researchers come to the field after completing higher education qualifications in the natural and physical sciences rather than the social sciences, some have not yet developed the methodological expertise necessary to design robust social research (Martin, 2019).

  • Funders rarely insist on methodological quality in science communication evaluation, which means most external evaluation reports conducted by consultants offer questionable or limited value as evidence.

  • Researchers in the social sciences are not required to register their studies or evaluations, which is a common standard in the medical sciences, or to follow other practices at the leading edge of transparent "open science."

Practice:
  • Depending on the journal and peer reviewers' backgrounds, there are major differences in the level of methodological rigor expected during peer review. This means practitioners may be expected to sift through or use unreliable evidence if they are left to assess individual articles without the benefit of systematic reviews or meta-analyses.

  • Science communicators are rarely provided with support and guidance in how to access and use the best available evidence during their training. Moreover, many practice-oriented conferences disregard quality of evidence as a priority, treating personal impressions and anecdotes at the same level as robust evidence.

  • Science communicators are often left to their own devices to design and conduct empirical evaluations, with limited training and support from their institutions or funders, and often without in-house experts to call upon for advice.

It is both self-evident and revealing that there is limited empirical evidence speaking to the generalizations and truth claims presented in the table above, which are based on our practical experience across the research-practice divide in science communication. We think the sparse research available on these topics highlights the need for more evidence-based integration and mutual learning to clarify the state of play more systematically.

Beyond strengthening the links between research and practice and establishing additional opportunities for knowledge exchange and collaboration, there are numerous challenges at a practical level to implementing evidence-based approaches. These challenges run deep, with barriers embedded in science communication training, norms and values that drive practice (e.g., see Jensen and Holliman, 2016).

Evidence-Based Science Communication (EBSC): Pathways Forward

A classic editorial in the British Medical Journal set out to clarify the direction being advocated for the field of medicine, in an article entitled "Evidence based medicine: what it is and what it isn't." We adopt a similar approach to defining "evidence-based science communication" as a viable pathway forward. To adapt the language of Sackett et al. (1996, p. 71), we are advocating the "conscientious, explicit, and judicious use of current best evidence in making decisions" about science communication. In practice, evidence-based science communication involves combining professional expertise and skills with the best available evidence from systematic research, underpinned by established theory. By professional expertise we mean the "proficiency and judgment" that individual science communication practitioners acquire through experience and practice, refined over time through empirical evaluation (cf. Sackett et al., 1996, p. 71). There are numerous indicators of such professional expertise in science communication, including:

  • Applying social science research and theory when designing science communication activities to avoid well-known pitfalls and improve the odds of success.

  • Planning, developing, and applying objectives in a logical way to address the needs of specific stakeholders or audiences.

  • Following good ethical principles including informed consent for participation and responsible data protection and management.

  • Being open and transparent about the nature of the funding, the organizations involved, and influences on the design of science communication activities.

  • Ensuring that appropriate and relevant communication skills are developed and applied for a given science communication challenge.

  • Being inclusive and welcoming of those who are often marginalized or excluded, both in the development and delivery of science communication activities.

  • Willingness and capability to reflect on limitations in one's own communication objectives and strategies despite institutional constraints and agendas, even if this may invalidate previously accepted practices.

  • Committing to continually improve practice based on ongoing collection and analysis of evaluation evidence (Jensen, 2014, 2015a).

  • Being learning-oriented, focusing on continual professional improvement and sharing of new findings to aid others.

  • Working to make any given science communication activity as resource efficient as possible to ensure that opportunities for positive impact are not squandered.

It will be clear from the points above that we believe that “using robust social scientific evidence […] to ensure success should be viewed as a basic necessity across the sector” (Jensen, 2015b, p. 13). Applying well-established principles of good communication (e.g., Spitzberg, 1983) should be a basic expectation of science communication practice for professionals and their funders.

Just as in evidence-based medicine, EBSC must be expected to “invalidate previously accepted” practices and “replace them with new ones that are more powerful, more accurate, more efficacious” (Sackett et al., 1996, p. 71). What counts as effective science communication practice depends on the institutional, local and cultural context. The nature of the science communication evidence base and how to define satisfactory evidence is a matter that requires elaboration aimed at the research community in science communication, which we will develop in a separate essay. Here, we wish to emphasize that science communication research should be providing relevant, accurate, and timely insights that practitioners can use. Indeed, the issues we wish to raise are not only about a deficit of evidence in practice, but also a lack of sufficient applicability, mutual appreciation and collaboration, explained in more detail below (inspired by Heneghan et al., 2017).

Evidence-Based Science Communication (EBSC)
  • Evidence-based practice: Increase the systematic use of evidence in science communication practice to maximize effectiveness and forestall negative impacts.

  • Evidence-based research: Reduce questionable science communication research practices, avoid preventable methodological shortcomings and increase transparency.

  • Assessing impact: Make impact evaluation of science communication a standard expectation in communication and engagement funding with the aim of refining practices based on findings.

  • Bridging the chasm: Address the divides between research and practice in science communication along the entire Knowledge Cascade (see above) to enable an integrated evidence-based practice.

  • Mutual appreciation and collaboration: Develop initiatives to encourage both researchers and practitioners to develop mutual understanding about their needs, experiences and unique capabilities and forms of expertise.

  • Effective exchange: Establish mechanisms for exchange that work for both practitioners and researchers and that transcend the limitations of scholarly publishing.

  • Recognizing applicability: Where research results and theory can be tested in real world situations, both research and practice need incentives to engage and collaborate. More applied, or at least practice-relevant, research also requires more systematic analysis of the needs for research from the perspective of science communication practice.

  • Collaboration: Instead of trying to merely transfer abstract expert knowledge into practice, the science communication field needs more transdisciplinary means of collaboratively investigating and optimizing science communication from within, using real-world data to develop both research and practice through the same initiatives without compromising quality standards on either side.

  • Revisit the raison d'être for science communication: Promote important societal values such as social inclusion, good ethical practices and democratic participation through the design of science communication initiatives.

  • Systematic reviews: Produce practical guidelines to effectively inform and orient practice by distilling the best available evidence in a methodologically robust way. This should also foster replicability and replication for key topics by making methodological transparency the norm.

  • Systemic change: Encourage informed decision-making in the selection of science communication approaches for particular settings and circumstances, backed up by funding review processes that insist on evidence-informed approaches.

  • Certification: Encourage the next generation of leaders in evidence-based science communication through certification processes and standards in teaching and training.

Conclusion

We fully recognize that our diagnosis of the problem and perspective on pathways forward will face criticism. Some of that criticism may fall along the lines of prior critiques of evidence-based medicine, including the idea that evidence-based science communication is "old hat," a "dangerous innovation," "perpetrated by the arrogant," and a move to "suppress" science communicators' or researchers' professional "freedom" (Sackett et al., 1996, p. 73). Clearly "evidence" in science communication and beyond will always be contested and provisional, but it nevertheless provides the strongest pragmatic basis for making improvements in practice.

We need to have this debate as a field, including practitioners, researchers, and those, like the two of us, who work across both domains. This commentary is meant to cultivate reflexivity in our community by initiating a discussion about the value, quality, and effectiveness of what we are practicing and researching. Many of the questions posed in, and even resulting from, this commentary are expected to trigger a discussion about fundamental principles and practices in our field. At the same time, however, we hope that general issues, such as querying how relevant research should be expected to be for practice, will not overshadow the very concrete issues we are raising: how to use existing evidence and experience on both sides to empower science communication to live up to its potential, in the interest of a world that needs it more than ever. This is also why this commentary does not attempt to provide easy solutions but instead welcomes and explicitly invites dialogue about the pathways forward for our field.

Statements

Author contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Acknowledgments

The authors are deeply grateful for the reflexivity provoked in the long process of developing this commentary by numerous inspiring discussions with friends and colleagues working in science communication research and practice around the world.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The reviewers, JR and BW, declared a past collaboration with one of the authors, EJ, to the handling editor.

References

  • Carayannis, E. G., and Campbell, D. F. J. (2009). 'Mode 3' and 'Quadruple Helix': toward a 21st century fractal innovation ecosystem. Int. J. Technol. Manage. 46, 201–234. 10.1504/IJTM.2009.023374

  • Fischhoff, B. (2013). The science of science communication. Proc. Natl. Acad. Sci. U.S.A. 110, 14031–14032. 10.1073/pnas.1312080110

  • Gerber, A. (2014). Science caught flat-footed: how academia struggles with open science communication, in Opening Science – The Evolving Guide on How the Internet is Changing Research, Collaboration and Scholarly Publishing, eds S. Bartling and S. Friesike (Wiesbaden: Springer), 73–80. 10.1007/978-3-319-00026-8_4

  • Gerber, A., Metcalfe, J., Broks, P., Lorke, J., Gabriel, M., and Lorenz, L. (2020). Science Communication Research: An Empirical Field Analysis (Government Report). German Federal Ministry of Education and Research.

  • Heneghan, C., Mahtani, K. R., Goldacre, B., Godlee, F., Macdonald, H., and Jarvies, D. (2017). Evidence based medicine manifesto for better healthcare: a response to systematic bias, wastage, error and fraud in research underpinning patient care. Evid. Based Med. 22, 120–122. 10.1136/ebmed-2017-j2973rep

  • Holliman, R., and Jensen, E. (2009). (In)authentic science and (im)partial publics: (re)constructing the science outreach and public engagement agenda, in Investigating Science Communication in the Information Age: Implications for Public Engagement and Popular Media, eds R. Holliman, E. Whitelegg, E. Scanlon, S. Smidt, and J. Thomas (Oxford: Oxford University Press), 35–52.

  • Jamieson, K. H., Kahan, D., and Scheufele, D. A. (eds.). (2017). The Oxford Handbook of the Science of Science Communication. Oxford: Oxford University Press.

  • Jensen, E. (2014). The problems with science communication evaluation. J. Sci. Commun. 13:C04. 10.22323/2.13010304

  • Jensen, E. (2015a). Evaluating impact and quality of experience in the 21st century: using technology to narrow the gap between science communication research and practice. J. Sci. Commun. 14:C05. 10.22323/2.14030305

  • Jensen, E. (2015b). Highlighting the value of impact evaluation: enhancing informal science learning and public engagement theory and practice. J. Sci. Commun. 14:Y05. 10.22323/2.14030405

  • Jensen, E., and Holliman, R. (2016). Norms and values in UK science engagement practice. Int. J. Sci. Educ. B Commun. Public Engage. 6, 68–88. 10.1080/21548455.2014.995743

  • Martin, V. Y. (2019). Four common problems in environmental social research undertaken by natural scientists. BioScience. [Epub ahead of print]. 10.1093/biosci/biz128

  • Phills, J. A. Jr., Deiglmeier, K., and Miller, D. T. (2008). Rediscovering social innovation. Stanford Social Innovation Review 6, 34–43.

  • Sackett, D., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., and Richardson, W. S. (1996). Evidence based medicine: what it is and what it isn't. BMJ 312, 71–73. 10.1136/bmj.312.7023.71

  • Spitzberg, B. H. (1983). Communication competence as knowledge, skill, and impression. Commun. Educ. 32, 323–329. 10.1080/03634528309378550


Keywords

public engagement with research, public understanding of science (PUS), public communication of science and technology, divulgación científica, divulgação científica, science communication

Citation

Jensen EA and Gerber A (2020) Evidence-Based Science Communication. Front. Commun. 4:78. doi: 10.3389/fcomm.2019.00078

Received

21 November 2019

Accepted

31 December 2019

Published

23 January 2020

Volume

4 - 2019

Edited by

Tarla Rai Peterson, The University of Texas at El Paso, United States

Reviewed by

Jessica Norberto Rocha, Fundação CECIERJ, Brazil; Brady Wagoner, Aalborg University, Denmark



*Correspondence: Eric A. Jensen

This article was submitted to Science and Environmental Communication, a section of the journal Frontiers in Communication

†These authors have contributed equally to this work

Disclaimer

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
