About this Research Topic
For five decades, researchers have studied how humans reason under uncertainty and make decisions about risk. When people must make a rapid, intuitive inference about a familiar situation – for instance, a firefighter deciding whether to enter a burning building based on environmental cues – they demonstrate good risk reasoning. However, when asked to make abstract, data-based decisions, they struggle. Probabilistic reasoning is challenging because it involves high-level cognitive processes, but it is essential for dealing with rare or previously unseen situations. The discrepancy between intuitive risk reasoning and conscious, top-down probabilistic reasoning – and how the former might support the latter – requires further investigation. A particularly important area for both research and application is media communication of research data: how policymakers explain the basis of their decisions, and how individuals use such information to inform their own behaviour.
Risk perception is influenced by numerous potential biases, including our emotional response to an event, our knowledge about the event or its consequences, and the scale of those consequences (e.g. how many people would be affected if the event actually occurred). Intuitive risk reasoning based on expertise (such as the firefighter's) may override cognitive biases by tapping into knowledge representations formed through repeated exposure. However, when an event is rare or falls outside our area of expertise (such as evaluating the risk of a pandemic), reasoning may be overwhelmed by biases and the actual evidence undervalued.
The way the media communicate information about uncertainty and risk can greatly affect how people use it to form judgements. In the era of fake news and conspiracy theories, where controversial or inaccurate information about current topics (such as COVID-19, 5G and vaccination) spreads rapidly via social media, the ability to correctly reason under uncertainty is becoming crucial. The goal of this Research Topic is to bring together evidence that will elucidate how information about risk can be effectively communicated in the public realm and help counteract the spread of misinformation.
This Research Topic will examine why the public reasons inconsistently with the information provided by experts and authorities, relying instead on fake news and conspiracy theories.
Key questions include:
• Is misinformation preferred because it is simplistic and therefore easier to understand than complex expert opinions?
• How is risk communication affected by the visual or verbal presentation of information?
• What is the role of trust and familiarity?
• How can the “structure” of risk information be broken down so that one can “correct” its communication?
Tackling these issues requires bringing together different approaches, and we therefore welcome diverse perspectives on risk perception and reasoning under uncertainty. We particularly encourage studies of how visualization and infographics can support cognitive processing, studies that link neuroscience data with reasoning processes, and studies conducted with real-world data, including data from social media.
Keywords: risk perception, probabilistic reasoning, uncertainty, decision making, social media, communication, information visualization, policy, fake news
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.