- 1Institute for Psychology in Education and Instruction, Department of Psychology and Sport Studies, University of Münster, Münster, Germany
- 2Institute of Educational Studies, Faculty of Humanities and Social Sciences, Humboldt University of Berlin, Einstein Center Digital Future, Berlin, Germany
- 3Department of Teacher Education, Lurie College of Education, San Jose State University, San Jose, CA, United States
- 4School of Sciences, University of Central Lancashire, Larnaka, Cyprus
Many urgent problems that societies currently face—from climate change to a global pandemic—require citizens to engage with scientific information as members of democratic societies as well as to solve problems in their personal lives. Most often, to achieve their epistemic aims (aims directed at achieving knowledge and understanding) regarding such socio-scientific issues, individuals search for information online, where there exists a multitude of possibly relevant and highly interconnected sources offering different perspectives and sometimes providing conflicting information. This paper provides a review of the literature aimed at identifying (a) constraints and affordances that scientific knowledge and the online information environment entail and (b) individuals' cognitive and motivational processes that have been found to hinder or, conversely, support practices of engagement (such as critical information evaluation or two-sided dialogue). In doing so, we introduce a conceptual framework for understanding and fostering what we call online engagement with scientific information, conceived as consisting of individual engagement (engaging on one's own in the search, selection, evaluation, and integration of information) and dialogic engagement (engaging in discourse with others to interpret, articulate, and critically examine scientific information). In turn, this paper identifies individual and contextual conditions for individuals' goal-directed and effortful online engagement with scientific information.
Introduction
Socio-scientific issues—from climate change to the ongoing COVID-19 pandemic (we will use the latter issue as an example in this article)—hold many consequences for personal, social, and civic life (Feinstein and Waddington, 2020). For such issues, defining the problem as well as coming up with possible solutions often rests on knowledge and evidence from the natural but also the social sciences, which are well beyond most citizens' expertise (Zeidler, 2014). Nonetheless, most citizens want and need to stay informed and will likely seek information online, as searching for information on specific science-related issues is usually done on the Internet (National Science Board, 2018). In recent years, the percentage of people who use the Internet to learn about science has substantially increased, and there, they encounter a wide variety of digital media formats, including social media (Pavelle and Wilkinson, 2020). In this article, we review literature on the cognitive and motivational processes underlying online engagement with scientific information (OESI) that individuals employ in order to utilize the affordances and overcome the challenges of searching for and dealing with scientific information in online information environments.
“Engagement” is an elusive concept but has been conceptualized as a behavioral manifestation of motivation or productive participation in a learning activity (e.g., Eccles and Wang, 2012; Bråten et al., 2018). Similar to previous models of engagement (Guthrie and Klauda, 2016), we understand OESI as goal-directed (that is, directed at achieving epistemic aims) and effortful activity in dealing with scientific information in online information environments, where this activity can be both individual and dialogic; is supported by cognitive, but also motivational processes; and leads to the individual arriving at epistemic ends (the target of epistemic aims). In the following, we describe our heuristic model in more detail (see Figure 1 for a graphical representation).
Central to our understanding of OESI is individuals' adoption of epistemic aims. In their AIR model of epistemic cognition, Chinn et al. (2014; see also, Chinn et al., 2011) identify epistemic aims, ideals, and reliable processes that individuals apply to achieve epistemic ends. We describe all three components here briefly, before spelling out their relation to our notion of OESI. First, epistemic aims are "a subset of the goals people adopt, specifically those goals related to inquiry and finding things out" (Chinn et al., 2011, p. 142), and they are directed at achieving epistemic ends, for example, gathering "true" facts about a topic, avoiding misinformation on the topic, or acquiring a deeper understanding; how much an epistemic end is valued will affect which epistemic aims are adopted. Second, an information seeker will review the success of an information search against her epistemic ideals, which can be described as the standards that determine whether a person has achieved her epistemic end; such a standard might be whether the information comes from a highly authoritative source or whether it is based on empirical evidence (Chinn et al., 2014; see also section Epistemic (meta-)cognition). And, third, to achieve epistemic ends, reliable processes are applied, which specify the conditions and cognitive operations needed to arrive at reliable knowledge. Importantly, which processes are deemed reliable depends on the context and the individual's knowledge about the processes. For example, while observation is usually a reliable process for finding things out about the (natural) world, individuals may overestimate the reliability of this process, which may lead to misconceptions (Chinn et al., 2014).
Epistemic aims underlie OESI and moderate transitions from stage to stage in our heuristic model (see Figure 1). First, when an individual is confronted with a socio-scientific topic in online media environments, which harbor specific constraints and affordances (see section Constraints and Affordances Entailed in the Context of OESI), this elicits cognitive and motivational processes, possibly leading the individual to form (an) epistemic aim(s). If so, these processes become more goal-directed (as the individual strives to arrive at an epistemic end). For example, if the individual adopts the epistemic aim of avoiding misinformation, she might consider more reliable processes in her search for information, such as referring to fact-check websites, which allow her to compare her achievements with her epistemic ideals (e.g., that accepted information must be evidence based). However, to adequately deal with context constraints and affordances (e.g., the amount of misinformation present in social media), the employed (reliable) processes must also be effortful. Such goal-directed and effortful engagement is what we describe as OESI, and we further differentiate individual engagement (engaging on one's own in the search, selection, evaluation, and integration of information) and dialogic engagement (engaging in discourse with others to interpret, articulate, and critically examine scientific information). We assume that individuals will not follow a specific sequential order when engaging in these two types of engagement and their associated processes. Instead, depending on the situation and the individual's epistemic aim, any process could be the beginning of an episode of engagement and could lead to any other process—within and between the two parts—whereby the individual may even switch back and forth, commit to two processes at the same time, or skip a process.
Finally, it is also possible for individuals to move back to previous stages: Practices of engagement may, in turn, motivate cognitive and motivational processes (e.g., if the individual feels self-efficacious during critical information evaluation, she might be more motivated to achieve her epistemic aims). Furthermore, when the individual arrives at her epistemic ends—or, instead, is partially or entirely unsuccessful in achieving her aim—she might reconsider her initial epistemic aims and enter another episode of engagement.
However, OESI may not lead to similar (and similarly measurable) achievements as engagement in formal education settings does. By defining outcomes as arriving at one's epistemic ends, we aim to highlight a central dilemma: Defining a successful outcome largely depends on which standards define achievement: personal (e.g., being content with a personal decision; relieving anxiety) or normative (e.g., achieving full understanding of a concept in alignment with the current scientific state of knowledge). We are aware that these aims require very different cognitive and motivational processes; consequently, we focus on engagement that is moderated by individuals' epistemic aims, and we review research to find out which reliable processes are beneficial for achieving such aims and for dealing with context constraints and affordances in the process [in contrast, Greene et al. (2020) recently focused on incidental learning in online environments]. Thus, the purpose of this article is to review the literature in several related fields in educational science and educational psychology to identify aspects of the context, and of individuals' cognitive and motivational prerequisites, that are especially beneficial or detrimental to effortful and productive OESI. Only when it is goal-directed and effortful can OESI lead to individuals successfully arriving at their respective epistemic ends.
Educational researchers and educational psychology researchers have long investigated individuals' reasoning and engagement with scientific and online information, and have posited educational implications; these researchers have delved much deeper into specific aspects relevant to our heuristic framework (e.g., Alexander and The Disciplined Reading and Learning Research Laboratory, 2012; Leu et al., 2013; Fischer et al., 2014; Tabak, 2015; Cho and Afflerbach, 2017; Breakstone et al., 2018; Britt et al., 2019; Coiro, 2020). Taking past conceptualizations into account, we use the term "online engagement with scientific information" not to introduce an entirely new concept or to replace any related concept; instead, here we review this literature, specifically to provide a comprehensive overview of OESI—focusing on its context and on the cognitive and motivational processes that support it—to derive implications for education and instruction.
Constraints and Affordances Entailed in the Context of OESI
Information that we consider relevant for OESI is acquired in online environments and (a) contains an elaborate claim on a socio-scientific issue, or (b) is detailed enough to serve as evidence, or (c) both. For example, we would consider as relevant any text, audio, and video sources, as well as images and graphical representations (e.g., a tweet featuring a graph, a YouTube video, an open access scientific article), but we would not consider as relevant a meme consisting only of a photograph and some text, which is only meant to entertain. For individuals to deal with such information to achieve their epistemic aims, they must overcome the constraints and utilize the affordances that are entailed in the respective contexts (Barzilai and Chinn, 2019). We will briefly outline these in Table 1.
Table 1. Some context constraints and affordances of Online Engagement with Scientific Information (OESI).
Two characteristics of scientific knowledge are especially challenging for laypeople to deal with (Bromme and Goldman, 2014; Hendriks and Kienhues, 2019). First, scientific knowledge is characterized by complexity (Keil, 2008) as scientific theories vary in depth (deep causal complexity) and breadth (interrelatedness with other theories or concepts) (Bromme and Goldman, 2014). Consequently, full understanding of scientific phenomena requires both highly specialized knowledge in one field (e.g., virology) and related background knowledge from many other disciplines (e.g., biology, chemistry). For many questions in socio-scientific issues, the complexity of (natural) scientific knowledge is further amplified by manifold interrelations with the social sciences. This is especially the case when issues entail risk, which can exist both on a personal level (e.g., health risks) and on a societal level (e.g., economic risks). Second, scientific knowledge is intrinsically uncertain (Friedman et al., 1999), whereby uncertainty arises not only during evidence gathering processes (e.g., measurement error, inadequacies of measurement), but also from lack of knowledge or expert disagreement (van der Bles et al., 2019). Scientific uncertainty is becoming increasingly apparent to a larger public as the COVID-19 pandemic progresses, because evidence is rapidly accumulated and published online (sometimes before peer-review), such that public debates often involve highly uncertain scientific knowledge.
Both the complexity and uncertainty of scientific knowledge are amplified in online information environments. Online, there are many possibly relevant information sources that vary in format (e.g., text, video), in genre (e.g., scientific, journalistic, opinion, entertainment), and in explanatory power (e.g., relevant to the topic and founded in evidence). Moreover, sources are highly interconnected; that is, online documents not only embed and interlink diverse formats and genres (Alexander and The Disciplined Reading and Learning Research Laboratory, 2012; Goldman and Scardamalia, 2013), but interconnectedness is also established when sources cite and embed sources of different quality (e.g., when a scientist is interviewed by conspiracy-affiliated news sites), or when scientific arguments are disputed by industry stakeholders. To the individual, this amplifies the complexity of an already complex scientific topic. But scientific uncertainty can also be amplified, especially as new and still uncertain results are highly accessible online. In particular, (digital) media pieces often display disagreement between experts (Boykoff and Boykoff, 2004), such as when scientists openly disagreed with statements by the WHO about the effectiveness of wearing face masks to protect against COVID-19 (Howard, 2020). Furthermore, around publicly contested issues like climate change and vaccination, skeptics have been especially strategic about utilizing uncertainty to manufacture doubt around scientific knowledge on the issue (Oreskes and Conway, 2011) and attack scientific evidence, especially in digital media (e.g., Elgesem et al., 2015; Mercer, 2018).
As a result of these constraints, laypeople find it challenging to engage with scientific knowledge online to achieve epistemic aims; yet, the context of OESI also entails affordances that individuals can utilize. Socio-scientific issues may motivate individuals to purposefully engage with scientific information, because the scientific questions are highly relevant and are often contextualized in everyday life and societal questions (Feinstein and Waddington, 2020). Science fundamentally rests on the active dialogue about and the critique of scientific claims (Osborne, 2010), and members of the public can now contribute more to this dialogue through efforts such as the movement toward Public Engagement with Science (Leshner, 2003). Furthermore, increased access to scientific information via digital media creates even more opportunities for individuals to connect with science (Brossard and Scheufele, 2013). Especially because scientific knowledge is often communicated in very formalized ways in terms of formats and language use, digital media platforms grant laypeople the opportunity to learn about science in a variety of formats and in much more accessible and engaging language; for example, YouTube videos often use an entertaining and narrative style to communicate quality informational content. However, because individuals can access such a wide variety of sources, they must be able to identify not only trustworthy sources but also communicative intentions, to distinguish, for example, institutional public relations information from critical science journalism, and even from science-related entertainment. Moreover, online, individuals must be especially aware of messages that are deliberately posted to disseminate false information, called disinformation or "fake news" (a term that has also been weaponized in political contexts; Molina et al., 2019). In contrast, misinformation is spread without malevolent intentions (Molina et al., 2019; Scheufele and Krause, 2019), but it is still a threat to an individual's engagement with scientific claims and evidence.
The requirement to effortfully seek out credible information represents the downside of individuals' ability to be active agents in using and interacting with online digital media platforms (Evans et al., 2016), where they can deliberately choose to engage with certain technologies, media, and content. Furthermore, individuals may even create their own content and—utilizing digital media's social affordances (Hopkins, 2016)—interact and engage in dialogue with other users.
In this article, we refer to research that describes which cognitive and motivational processes people employ to deal with these context constraints and affordances. While we do differentiate some constraints and affordances for the two contexts, some individuals may perceive an aspect that we introduced as a constraint to be more of an affordance, and vice versa. For example, a comment section of a blog entry might initially be an affordance, but dealing with a high number of reader comments may hinder individuals' evaluation of information, thus making it a constraint.
Individual Engagement
Searching for information to achieve epistemic aims is an iterative and dynamic process. To make sense of scientific information in order to achieve their epistemic aims on their own—to form "true" beliefs or understanding—individuals must employ reliable processes. To describe the necessary cognitive processes during an information search, we will first describe the MD-TRACE (Multiple-Document Task-based Relevance Assessment and Content Extraction) model (Rouet and Britt, 2011). According to this model, a search is initialized by an individual's mental representation of the search task in a task model (see also, Rouet et al., 2017). Her task model also involves considering available knowledge and resources, such as prior topic knowledge and knowledge about search strategies (Rouet et al., 2017). As a result of these processes, the individual determines whether further information is needed to fulfill task demands and against what standard the search result should be compared. Having initiated the search process, she tests whether the sought information is relevant to her task model and selects documents accordingly. To process and evaluate the selected documents, the individual mentally represents them in an intertext model, which links the contents of the documents to their meta-information (information about, e.g., the source, date, or rank of the search result) and includes intertext predicates (e.g., possible conflicts). Integrating information into the mental model allows the individual to coherently represent her acquired understanding of the issue. Finally, she may compare this integrated mental model against her initial task model to decide whether to redo certain steps of the search task or to go ahead with creating a search product (e.g., write an essay or make notes next to search results to further concretize the search task). However, at each step, individuals face several challenges (Rouet and Britt, 2011).
In this section, we will summarize research on how searching, selecting, processing, and integrating scientific information are supported or hindered by aspects of the context and the individual's cognitive and motivational processes.
Constraints and Affordances of the Online Information Environment to Individual Engagement
When searching for information, media affordances determine how specific technologies are used. That is, while users may deliberately choose to use technologies or digital media for the potential features they offer, at the same time, such features also determine the ways in which users can engage with the technology. For example, when acquiring (scientific) information, individuals tend to use only one type of search engine, which might be reinforced by the default use of digital assistants commonly installed on smartphones and computers (Kammerer et al., 2018). Additionally, characteristics of a search engine result page (SERP), such as the algorithm it uses to present search results, the interface it offers for users to manually filter search results, or the sparsity of information it displays (i.e., a title, a short excerpt of the web page, and the URL), may influence whether an individual selects any of the search results and whether they perform any further search queries. Research indicates that individuals tend to view the highest-ranked search results within a SERP (e.g., Salmerón et al., 2013; Haas and Unkel, 2017), even if those results are less relevant (Pan et al., 2007). Further, younger users in particular might select search results based on superficial cues like the search result's title (Lai and Farbrot, 2014), or boldface or capitalization (Rouet et al., 2011). Also, the number of documents that individuals select seems to vary by task: When individuals are asked to find a discrete answer to a question (instead of answering in an open-ended way), they select more documents (List et al., 2016a). Furthermore, individuals do not use all features of a search engine that would perhaps allow them to conduct more appropriate search queries.
Kammerer and Gerjets (2014) found that interfaces displaying the results in a three-by-three grid more often led users to select and view search results according to their trustworthiness than according to their search rank. Similarly, Salmerón et al. (2010) found that individuals had more efficient reading times and displayed more explorative search behavior when using a graphical-overview interface (i.e., indicating the semantic relationships between the search results) instead of a standard list interface. Prior knowledge about the search topic may further benefit an individual during an information search when the search engine interface allows it: Experts performed faster and more accurate searches than laypersons when the interface was semantically structured (Salmerón et al., 2005).
Second, the interconnectivity and embeddedness of information sources—both hierarchical (documents that are interlinked) and horizontal (one document embedded within another)—may be challenging for information seekers to deal with (Goldman and Scardamalia, 2013; Cho and Afflerbach, 2017). These features call for flexibility in how individuals access information (Shapiro and Niederhauser, 2004); namely, they have to access information in a non-sequential, non-linear way. This might require a specific aspect of digital literacy: Although expert searchers (fact checkers) were found to perform lateral reading, that is, opening several browser tabs during a search to check the reliability of a search result, topic experts (historians) and students did not (Wineburg and McGrew, 2019).
The goal-directed and effortful evaluation of online information may further be constrained by several context features of scientific information in digital media environments (Breakstone et al., 2018; Forzani, 2019), such as genre, presentation of information (such as the use of distracting imagery), or other users' endorsements. Unfortunately, individuals often use only superficial or unreliable indicators for determining the credibility of online information (Coiro et al., 2015; McGrew et al., 2018). For example, individuals may not be able to distinguish sponsored news content from unbiased news stories or to identify the verified social media accounts of public organizations (McGrew et al., 2018). Furthermore, the extent to which adolescents use social media sites for entertainment purposes can be negatively related to their ability to discriminate reliable from unreliable online information (Macedo-Rouet et al., 2019b). Some online platforms, especially social media, seem not to be regarded as trustworthy by individuals in general. Wikipedia is sometimes dismissed as an information source without consideration of its inherent quality control (Breakstone et al., 2018). Evidence suggests that individuals deem Twitter and blog entries less trustworthy than (for example) newspaper articles and refrain from citing them, even if they contain relevant first-hand information about an issue (List et al., 2017).
Further, the communicative design of scientific information appears to affect its evaluation. Using a more “scientific” language style, such as including descriptions of scientific methods and in-text citations, leads readers to judge the information as more “scientific” and believable overall (Thomm and Bromme, 2012). Over a series of studies, Scharrer and colleagues (e.g., Scharrer et al., 2012, 2017) found that when a scientific text was written in a comprehensible fashion (compared to when the text contained technical terms and was, thus, incomprehensible for laypeople), readers were more easily persuaded by the text's arguments and less inclined to consult further expert advice. Furthermore, when individuals are engaged online in argumentation, presenting a piece of information in the form of question and answer rather than in the context of a traditional text may be a more effective way to promote the acquisition of factual knowledge (Iordanou et al., 2019a). The question-and-answer format appears to have facilitated learning, possibly by highlighting the potential use of a particular piece of information.
Another feature of online environments is that not only social media and blogs but also many online news sites allow for user comments, which might influence how users evaluate the content of the main article. For example, attitudes about a scientific issue may be influenced by the perceived consensus among other readers expressed through blog comments (Anderson et al., 2014; Lewandowsky et al., 2019). Furthermore, in some instances, recommendations and social endorsements might influence evaluations of the credibility of health messages and of the expertise of the author (Jucks and Thon, 2017). In one study, when Facebook posts were shared by a close friend, this raised the credibility of otherwise distrusted news sources (participants rated their trust in several news sources prior to reading the posts), but not of trusted sources (Oeldorf-Hirsch and DeVoss, 2020).
To sum up, during the first steps of searching for and selecting relevant information, characteristics of the online environment [e.g., (social) affordances of SERPs and digital media, communicative habits in digital media] may constrain, but also inspire, effortful cognitive processes when searching for, selecting, and evaluating information. Dual-process theories propose that—unless task or person characteristics require it—individuals will default to heuristic processing instead of effortful and systematic processing (Salmerón et al., 2013). In an online information search, a variety of heuristic cues determine whether a search result is deemed credible or relevant to the task at hand (Hilligoss and Rieh, 2008; Sundar, 2008; Metzger and Flanagin, 2013). Taraborelli (2008) noted that research has mainly focused on predictive judgments of credibility rather than evaluative judgments; this means that individuals may often engage in a first selection phase, in which superficial cues guide information selection to sort out low-quality information, before engaging in more effortful evaluation in a second step (Hilligoss and Rieh, 2008). In fact, in one study, individuals' first selection of search results relied on the order of appearance in a SERP, but they bookmarked more relevant pages to examine further (Salmerón et al., 2013). In another study, individuals initially selected links by their titles, but on second glance considered cues more indicative of information quality, like URLs and snippets with brief descriptions (Hautala et al., 2018).
However, the activity of online searching itself may lead to a feeling of knowing—the perception of possessing knowledge that one cannot actually retrieve from memory (Pintrich, 2000; Koriat, 2012). Such an overestimation of acquired knowledge (Fisher et al., 2015) may result from representing the Internet as transactive memory (an external, collective memory system), which leads one to remember where a previously learned item is stored better than the item itself (Sparrow et al., 2011). Similarly, searchers might experience a "feeling of findability," whereby they overestimate the availability of information online (Risko et al., 2016). These problematic assumptions may stem from a failure to distinguish "what is known" from "how this knowledge was acquired" (Kuhn, 1999). Such knowledge illusions may bias the integrated mental model of search results and thus may negatively influence the integration of information into a coherent representation of the issue. When misrepresenting knowledge acquired through an online search in this way, the individual might prematurely give up on an epistemic aim under the assumption that it has already been achieved.
Emotion and Motivation
Central to our understanding of OESI is identifying when individuals process information more effortfully instead of heuristically; importantly, the process of formulating epistemic aims and following through to resolve them might be strongly influenced by emotion and motivation. Referring back to dual-process theories, Griffin et al. (1999) identified several motivators for more systematic processing of information about risk. They found that the central motivators of information seeking were information insufficiency—when a person experiences a large gap between current knowledge and her personal sufficiency threshold (Griffin et al., 1999)—and a perceived normative pressure to be informed. Information insufficiency can follow affective responses to perceived risks (Dunwoody and Griffin, 2015). In fact, Yang and Kahlor (2013) found that while positive affect about climate change (e.g., hope) was related to information avoidance, negative affect (e.g., worry) was related to higher information insufficiency and the intention to seek information. Further, feeling personally threatened could bias how search terms are generated in an online search: Participants who were asked to reflect about a threat in their personal life generated more positive search terms in an unrelated Internet search than participants who were not instructed to think about personal problems (Greving and Sassenberg, 2015).
Similar notions and empirical evidence can be found in the literature on epistemic emotions, which are emotions directed at achieving epistemic ends (Muis et al., 2015). For example, enjoyment and curiosity may be positively related to the belief that justifying a knowledge claim requires critical evaluation, and anxiety and frustration may be lower when individuals believe that knowledge is uncertain (Muis et al., 2015). As such, different epistemic emotions may follow an experience of inconsistent or conflicting information. In fact, when individuals were surprised by incorrect answers in a trivia task (especially when their answers had been given with high confidence), they had—as mediated by curiosity—more motivation to seek out explanations for these answers and request further information (Vogl et al., 2020).
In even more fundamental ways, the Cognitive Affective Engagement Model (CAEM) of multiple source use (List and Alexander, 2017b) addresses “learners' affective, cognitive, and behavioral involvement in multiple text use” (List and Alexander, 2017b, p. 184). Both situational and individual interest (Schiefele, 2009) have been found to promote learning and behavior (see also, Deci, 1992). Situational interest is a state that might be triggered by a single text (for example, when it is very easily comprehensible or coherent), while individual interest in a domain or topic is a trait-like personal characteristic (Schiefele, 2009). In consequence, the CAEM specifies an affective engagement dimension, which refers to an information seeker's interest and motivational involvement in the task at hand (also affected by topic-specific attitudes and prior beliefs), whereas the second dimension, behavioral dispositions, refers to the skills and strategies necessary for selecting, evaluating, and integrating information and documents at hand. By crossing these two dimensions, the CAEM states that learners fall into one of four default stances that guide their multiple-document comprehension: A “disengaged learner” selects and uses information without engaging much in evaluating and integrating. An “affectively engaged learner” accumulates information while engaging only in limited integration of multiple documents. An “evaluative learner” scrutinizes documents for relevance and credibility, but, due to limited motivational engagement, is less willing or able to fully integrate selected documents. 
A “critical analytic learner” possesses similar skills as the “evaluative learner” regarding source evaluation and verification, but since the critical analytical learner is highly motivated to engage in effortful and elaborate processing, they are able to succeed in integrating information into a coherent representation of the issue and, thus, might produce the most successful search result.
In sum, central motivators of goal-directed and effortful OESI are both personal relevance and topic specific risk perceptions (both affordances of socio-scientific topics). Furthermore, experienced information insufficiency may not be the only motivator to formulate epistemic aims; this may also be motivated by situational interest and epistemic emotions such as curiosity. Beyond individuals' skills to engage in reliable processes in dealing with scientific information, effortful evaluation and integration of information may also be fostered or constrained by emotions (both topic specific, e.g., hope or worry; and epistemic, i.e., directed at learning and understanding) and motivational involvement in the task.
Epistemic (Meta-)Cognition
Epistemic beliefs have long been investigated as part of reasoning and arguing about scientific information. Such beliefs about the nature of knowledge and knowing (e.g., holding beliefs about scientific knowledge being uncertain, complex, or needing expert justification) may incite the use of reliable processes and strategies during OESI. Several studies in which students were asked to think aloud during an online search have demonstrated that students use their epistemic beliefs to define standards for learning and accordingly select their strategies (Hofer, 2004; Mason et al., 2010a,b, 2011; Barzilai and Zohar, 2012). For example, beliefs about the complexity of an issue led individuals to reflect on the need to compare several documents and collect contrasting views (Mason et al., 2011), and the belief that knowledge is given and stable co-occurred with less use of strategies to actively construct knowledge from texts (Bråten and Strømsø, 2006). A person's epistemological understanding ties in with her metacognitive processes and strategies (Kuhn, 1999; Muis, 2007; Barzilai and Zohar, 2016), as it may directly influence the standards she sets for acquiring knowledge and understanding (Muis, 2007). As such, Barzilai and Zohar (2016) have argued that epistemic metacognitive knowledge (as a specific part of metacognition) may “guide the execution of cognitive-level epistemic strategies as well as their selection, monitoring, and evaluation” (Barzilai and Zohar, 2016, p. 414).
Furthermore, epistemic beliefs may also affect how effortfully individuals execute practices of OESI. Evidence from studies using the think-aloud technique shows that epistemic beliefs influence individuals' abilities to engage in evaluating information both while navigating the web—e.g., identifying argumentative fallacies (Mason et al., 2010b)—and while reading (Ferguson et al., 2012; Iordanou et al., 2019b). Further, viewing knowledge as tentative enhances meaning-making as one deals with multiple documents (Bråten and Strømsø, 2010) and supports credibility assessment of newspaper articles, for example when they present simplified accounts of an issue (Strømsø et al., 2011). Individuals with evaluativist epistemic beliefs engage more often in evaluating the credibility of evidence presented in texts and use scientific research as their standard for judgment; for example, they might consider the number of scientific studies supporting a particular piece of evidence (Iordanou et al., 2019b). Besides supporting the evaluation of single pieces of information, adequate epistemic beliefs also support the evaluation and integration of multiple pieces of information presented in different sources (Bråten et al., 2011; Barzilai and Eshet-Alkalai, 2015). Empirical evidence shows that adequate epistemic beliefs support the integration of information during online learning (Barzilai and Zohar, 2012) and during reading of multiple texts (Ferguson and Bråten, 2013), where comprehension mediates the relationship between epistemic perspectives and information-source integration (Barzilai and Eshet-Alkalai, 2015).
In sum, beliefs about the nature of scientific knowledge may directly influence which strategies and practices are employed during OESI (Muis, 2007; Barzilai and Zohar, 2016), and may also affect the epistemic ideals by which epistemic ends are evaluated (Chinn et al., 2014). That is, in addition to an individual's scientific literacy (see section Evidence Evaluation and Scientific Literacy), her epistemic beliefs may inform how she assesses the uncertainty and complexity of scientific information, and these beliefs may also guide the selection and metacognitive regulation of reliable processes for achieving her epistemic aims.
Source Evaluation
Due to limited gatekeeping of scientific information online (vs. editorial gatekeeping in scientific journals or traditional media), evaluating the source of scientific information is an especially important process within OESI, as it underlies the selection, evaluation, and integration of credible information. When retrospectively justifying document selection, students used epistemic criteria (e.g., source type, author) less often than non-epistemic criteria (e.g., order in the search list, relevance), but the more epistemic justifications were made, the more arguments and citations they presented in an open-ended search result (List et al., 2016b). However, individuals prefer authors of information to have good reputations (Rieh, 2002; Hilligoss and Rieh, 2008; Winter and Krämer, 2012); more specifically, readers tend to select blog posts by experts who possess relevant expertise on the topic in question (Winter and Krämer, 2014) and prefer disciplinary relatedness of search results to mere lexical similarity with search terms (Keil and Kominsky, 2013).
Diverse research findings suggest a variety of cues that individuals consider during source evaluation. First, an expert's language use seems to affect how trustworthy she is perceived to be. Individuals develop expectations about what constitutes appropriate language in different social and cultural contexts, and, thus, language accommodation or non-accommodation by speakers (reflecting their intentions and motives) may influence how individuals evaluate a speaker (Dragojevic et al., 2016). For example, an expert's use of technical language in scientific information may lead to her being ascribed higher expertise (Thon and Jucks, 2017) as well as higher integrity and benevolence when her use of (technical) language is appropriate to the context, e.g., when she uses less technical language when addressing laypeople (vs. experts) in online health forums (Zimmermann and Jucks, 2018), or less aggressive language in an online video (König and Jucks, 2019). Furthermore, the perception of a communicator in an online video as being comprehensible and entertaining also led to higher ascriptions of trustworthiness (Reif et al., 2020). Individuals also take an expert's motives into account when evaluating trustworthiness; for example, readers were more inclined to trust a scientist when they believed the scientist intended to inform rather than persuade them (Rabinovich et al., 2012), when the scientist provided a two-sided stance (instead of a one-sided stance) (Mayweg-Paus and Jucks, 2018) or mentioned the ethical aspects of a scientific issue (Hendriks et al., 2016). Furthermore, people perceived a source to be less trustworthy when the source had a vested interest in a claim (König and Jucks, 2019; Gierth and Bromme, 2020); this even sometimes motivated people to engage in effortful processing of complex evidence (Gierth and Bromme, 2020).
While these findings suggest that individuals are often able to adequately judge the trustworthiness of sources, research on “sourcing” (referring to when individuals pay attention to and use source features, such as the author, but also publication date) in multiple-document comprehension has found that students often fail to pay attention to source information (Britt and Aglinskas, 2002; Sandoval et al., 2016; for a review see, Brante and Strømsø, 2017). In fact, when evaluating multiple documents, individuals may not attend to author competence at all, and younger individuals (in elementary and middle-school) even failed to do so when explicitly prompted to evaluate sources (Macedo-Rouet et al., 2019a; Paul et al., 2019).
However, interacting with online information might not hinder successful sourcing per se. For example, reading an online document (instead of its printed-out version) increased memory for sources, which helped readers construct coherent interpretations of the issue at hand (Salmerón et al., 2017); that is, it helped them integrate information. Further, interacting with multiple sources is more effective than reading a single source for text comprehension and establishing source and content integration (e.g., Le Bigot and Rouet, 2007; Stadtler et al., 2013; Stang Lund et al., 2019); that is, individuals seem to have increased awareness about source information and create stronger content-source links when a conflict cannot be resolved by content information alone (Braasch et al., 2012; Strømsø et al., 2013; Stadtler and Bromme, 2014) or when information conflicts with prior beliefs about a topic (Bråten et al., 2016). As such, conflicts within single or multiple texts, as well as conflicts between newly acquired information and prior knowledge, might promote more effortful and strategic evaluation of sources (Braasch and Bråten, 2017). Further, relevant prior topic knowledge seems to benefit individuals' sourcing abilities (Stang Lund et al., 2019), whereas individuals with low prior knowledge may even prefer untrustworthy information sources (Bråten et al., 2011). In sum, while individuals use many different cues to determine source trustworthiness, encountering conflicting information about socio-scientific issues online seems to motivate individuals to engage in more effortful (source) evaluation and integration of information.
Evidence Evaluation and Scientific Literacy
Evaluating the strength of evidence (or even its inner-scientific significance) should be central to individuals' consideration of information from a normative standpoint, but this is challenging for laypeople considering the uncertainty and complexity of scientific information and their own bounded understanding of science (Bromme and Goldman, 2014). One way to assess the credibility of scientific claims is to evaluate argument strength and structure, for example, whether a claim is backed by evidence. While laypeople adequately assess argument strength to be greater when it is supported by a greater amount of evidence (Corner and Hahn, 2009; Hendriks et al., 2020), they may sometimes fail to take prior studies into account when assessing the probability that an effect is true (Thompson et al., 2020). Individuals might assume that the tentativeness included in scientific information means that the scientific results have limited credibility (Flemming et al., 2015); however, in one study, a refutation text alerting readers that this assumption is wrong successfully reduced it (Flemming et al., 2020). Similarly, a stronger epistemic belief regarding the uncertainty of science might alleviate the adverse effects of scientific tentativeness on the credibility of information (Rabinovich and Morton, 2012; Kimmerle et al., 2015). However, when making inferences from evidence, people may follow a causality bias, such as when interpreting correlational data (Shah et al., 2017). That is, new evidence may be rejected if it does not fit within a broader single causal framework (Koslowski et al., 2008). Further, it is unclear which type of evidence individuals consider to be informative.
Although some studies have indicated that statistical evidence (citing a study), expert statements, and causal evidence are perceived to be more persuasive than anecdotal evidence (Hornikx, 2008), adding anecdotal stories into scientific news articles decreased the extent to which participants engaged in scientific reasoning about the evidence (Rodriguez et al., 2016). Moreover, individuals often do not take multivariate causality into consideration (Kuhn, 2020). Thus, successful online information behavior on complex topics is constrained by individuals' tendencies to think simplistically about complex issues instead of understanding that most phenomena are caused by multiple contributing factors or, for judgments of a non-causal nature, taking multiple considerations into account (Kuhn and Iordanou, 2020).
Basic scientific literacy will also likely help individuals successfully evaluate and integrate scientific evidence. Internationally, educational frameworks for scientific literacy (e.g., OECD, 2017; National Research Council, 2012) have emphasized that a central aim of science education should be to familiarize students with processes of scientific inquiry, evidence evaluation, and argumentation. Scientific literacy has been ascribed three core dimensions: content knowledge (about a few core scientific concepts), procedural knowledge, and epistemic knowledge (Kind and Osborne, 2017). As such, it is important to consider how individuals understand not only the processes of doing science but also the modes by which it achieves reliable knowledge, such as expert epistemic practices (Golan Duncan et al., 2018). Kienhues et al. (2018) recently argued that “science-based arguments can be understood and judged by criteria on three layers of scientific knowledge: (1) the ontology, (2) the methods and sources, and (3) the social practices required for the generation and justification of the argument” (Kienhues et al., 2018, p. 253). They argue that everyday evaluation of scientific arguments may benefit from switching between these layers. For example, when it is not feasible to come to a conclusion about a scientific issue based on reliable evidence (maybe due to conflicting pieces of evidence), the individual may switch to investigating which scientific processes were used, which will help them identify which argument is backed by stronger evidence. If that is not feasible, the individual might judge whether the conflicting positions might be partly due to the complexity of the topic or the motivations of the involved experts behind the conflicting positions (Dieckmann et al., 2017; Thomm et al., 2017). 
Even if someone has limited content knowledge, they can still be successful in assessing a scientific issue online by determining, for example, whether there is consensus among scientists about an issue (a social practice of science; Oreskes, 2007; van der Linden et al., 2015) and then adopting the consensus view.
To summarize the two previous sections, individuals themselves often cannot adequately evaluate the credibility of a provided scientific claim, and some have argued that in such a case it is instead more feasible to evaluate the trustworthiness of the information source (Bromme and Goldman, 2014; Hendriks and Kienhues, 2019). That is, holding epistemic ideals regarding the justification of knowledge in consensus, or by a highly trustworthy source might be more beneficial for deciding whether to accept online information as provisionally true. Hence, instead of asking “What is true?,” individuals can rather solve the problem by asking “Whom do I believe?” (Bromme et al., 2010; Stadtler and Bromme, 2014). Hendriks et al. (2015) define epistemic trust as the willingness of a person to depend on an information source for knowledge; this trust is not blind, however, but relies on a person's epistemic vigilance toward cues that indicate whether an information source might be deceptive or ignorant (Sperber et al., 2010). In digital settings, evaluations of epistemic trustworthiness of expert sources rely on considering an expert's expertise (possessing relevant knowledge), integrity (adhering to the rules of their profession), and benevolence (having the interest of others at heart) (Hendriks et al., 2015).
(Prior) Attitudes and Beliefs
Prior topic knowledge and attitudes can affect processes of individual engagement from setting up a task model for the search to the (internal) formulation of a solution. On the one hand, prior topic knowledge and attitudes can result in individuals using more appropriate keywords and selecting more relevant information (e.g., MaKinster et al., 2002); on the other hand, they may also bias the information search. Selective exposure to information (sometimes referred to as confirmation bias) means that an individual is more likely to select attitude-consistent information (Fischer et al., 2005; Rothmund et al., 2015; Knobloch-Westerwick et al., 2020), and also evaluate that information more favorably (van Strien et al., 2016; Strømsø et al., 2017). An explanation for selective exposure during an information search might be defense goals, whereby an individual ignores or dismisses counter-attitudinal information to preserve their own worldview (Cappella et al., 2015; Winter et al., 2016). Nevertheless, those information seekers with high need for cognition are more likely to select two-sided information (e.g., suggested by the link title) for further reading (Winter and Krämer, 2012). Prior knowledge and attitudes may also detrimentally affect the evaluation and integration of scientific information online. Arguably, prior beliefs are internal representations with which newly acquired information has to be integrated. Richter (2015) assumes a “text-belief consistency effect” for integrating information into mental (situation) models. In fact, research shows that prior beliefs and attitudes might affect the way a person evaluates information and integrates new evidence into their internal representation of an issue.
Chinn and Brewer (1998) showed that only in very few cases did anomalous evidence (evidence inconsistent with individuals' already established theories) result in careful consideration and adaptation of individuals' theories; often, such evidence was just ignored or discounted.
Motivated reasoning is also an important driver of rejecting information that is not consistent with the dominant belief in an individual's social group (Kahan, 2013). For example, group identity may cause individuals to apply defensive motivations when reading about scientific issues and, in consequence, might further strengthen the text-belief consistency effect (Maier et al., 2018). In one study, Nauroth et al. (2015) showed that people who self-identified with the social group of gamers devalued identity-threatening scientific information (e.g., playing video games increases violence in youth) that was presented in a science blog, and, when allowed to post a comment, they criticized the methodology of the scientific study. Further, in another study identity-threatening information affected how reputable and competent participants perceived the scientist authors to be (Nauroth et al., 2017). However, biased evaluation of scientific evidence may not only arise from an identity threat but also from a threat to one's general values. For example, the more central a person held non-violence to be in their self-concept, the more positively they evaluated a scientific study that claimed video games promote violence (Bender et al., 2016). Also, expert sources may be considered more credible when the ethical stance of the reader aligns with that of the source, leading to higher agreement with the source's claims (Scharrer et al., 2019).
In sum, prior beliefs and attitudes may play a central—and often detrimental—role in establishing a task model for searching for scientific information, as well as evaluating and integrating information. However, sometimes, prior beliefs may motivate effortful processing and evaluation of documents (Rouet and Britt, 2011; List and Alexander, 2017b; Rouet et al., 2017)—for example by eliciting curiosity by being unexpected (see section Emotion and Motivation) or evoking situational interest—allowing individuals to switch from belief protection to belief reflection (List and Alexander, 2017b). By judging plausibility (“the potential truthfulness of a claim”; Sinatra and Lombardi, 2020, p. 5), individuals may utilize their prior knowledge to select the most likely alternative, especially when an issue is contradictory and uncertain. Lombardi et al. (2016) provided a theoretical framework for plausibility judgments, which entail (a) alignment with prior knowledge and beliefs, (b) complexity of and (c) perceived conjecture within novel information, (d) judgments of source trustworthiness, and (e) the individual's heuristic processing and possible biases. Plausibility judgments may be guided by different degrees of evaluation. While most judgments are implicit (due to a preference for heuristic processing, see above), individuals' epistemic dispositions and motives (e.g., need for cognition) may lead to more effortful processing. Further, if motivated (e.g., if they are interested and feel self-efficacious), individuals may also reappraise their original judgements, guided by more explicit processing and increased effort in reasoning.
In consequence, Sinatra and Lombardi (2020) suggested that fostering individuals' capabilities to quickly make plausibility judgments about information—by efficiently employing their prior beliefs and knowledge—may be more fruitful in “post-truth” contexts (similar to the contexts we previously described for OESI) than training effortful strategies to evaluate information and its sources.
Dialogic Engagement
Besides seeking and evaluating information independently to form beliefs, OESI includes engaging in discourse with others to share, interpret and critically examine scientific information. In this sense, social media platforms have emerged not only as an important source of information (Head and Eisenberg, 2010; Kim et al., 2014), but also as a public forum for engaging with science (Baram-Tsabari and Schejter, 2019). In fact, we perceive individual and dialogic engagement as reciprocal processes. For example, individually forming an understanding of an issue is immediately beneficial for constructing arguments when engaging in dialogue with others, and, conversely, dialogue and deliberation with others might lead one to revise their original understanding (see section Reciprocity of Dialogic and Individual Engagement).
When we consider OESI as a social process, it involves the overlapping processes of interpreting information, building arguments from that information and contrasting those arguments with competing arguments. Berland and Reiser (2009) propose that these processes, which they refer to as sensemaking, articulation and persuasion, respectively, form the foundation of scientific argumentation. Although scientific argumentation can be an individual process, as a dialogic process it presents a unique set of affordances and constraints. In the following sections, we explore these affordances and constraints and propose ways in which scientific argumentation as a social process can be leveraged to focus the epistemic aims and outcomes of OESI.
Constraints and Affordances of the Online Information Environment for Dialogic Engagement
Many different social media platforms exist, and their functions range from social networking and community building to collaborative knowledge construction and sharing (Leonardi, 2015; Krancher et al., 2018). Building on these potential functions, social media platforms may benefit the motivations and outcomes of OESI (Gao et al., 2012). However, to understand and exploit the full potential of communicating about online information with others on social media, we need to consider the role that a social media platform's characteristics play in users' abilities to select and establish network connections and to interact with other users (DeNardis, 2014). Following Ariel and Avidar (2015), the degree of interactivity is thereby not primarily determined by the technical features of a platform (interactivity as a medium characteristic) but rather by the actual aims and behaviors of its users (interactivity as a process-related variable). In other words, social networks such as Facebook, Twitter and Instagram do not necessarily produce interactive communication behavior per se, but rather they provide opportunities for different ways of communicative exchange.
In this regard, Rafaeli's (1988) interactivity model suggests three possible types of messages in communication. The first type refers to one-way communication between a sender and a receiver, and messages are characterized by low responsiveness. The second type allows for two-way directional communication, as the receiver of a message becomes a sender and is, therefore, responsive to the information provided (or posted). However, only the third type enables real interactivity in a two-way flow of messages between users and is, therefore, highly responsive. Here, such interactive messages encourage the interaction to continue back and forth. Consequently, the transmission of information can be seen as the center of interaction, and interactivity seems to be a central attribute of the process of communication itself (Rafaeli and Ariel, 2007; Ariel and Avidar, 2015).
Types and Goals of Dialogue
When we think about using information to communicate with others online, we should also think about the purpose of such communication. Two-way communication, or dialogue, can be divided into different types, each with a particular set of epistemic aims (Rapanta and Christodoulou, 2019). Walton (2010) identifies seven dialogue types that apply to communication in both face-to-face and online settings. These are information-seeking, discovery, inquiry, deliberation, negotiation, persuasion, and eristic dialogue (or “irrational dispute”). All are argumentative insofar as speakers posit how information can be brought to bear on claims, but they differ in their initial state and intended outcomes. For example, both inquiry and persuasion involve making claims with evidence, but inquiry focuses on collecting evidence to test claims, while persuasion focuses on citing claims and evidence to defend a conclusion. Dialogue types can also be distinguished by their social-emotional goals. To get at the role that personal stakes can play in dialogue, Asterhan (2013) proposes a distinction between competitive interpersonal goals and collaborative interpersonal goals. The former are competitive in that speakers take an adversarial stance on what they perceive to be zero-sum outcomes, and the latter are collaborative in that speakers take a cooperative stance on what they see as a shared enterprise. It is important to note that these interpersonal goal states are distinguishable from dialogue types. Some dialogue types may be more likely than others to trigger competitive interpersonal goals (persuasion, negotiation, and eristic, for example), while others may tend toward collaborative interpersonal goals (information-seeking, inquiry, and deliberation). 
However, interpersonal goals represent social-emotional outcomes that are distinct from the competitive or collaborative epistemic aims used to define dialogue types (except perhaps for eristic argument, which is primarily driven by interpersonal conflict). For example, negotiations can be conducted either collaboratively or competitively, depending on the stance, strategies, and dialogic moves chosen by each party (Lewicki et al., 2001). Likewise, although deliberations aim at group consensus, they may unfold as either collaborative or adversarial exchanges depending on the ways in which interpersonal dynamics emerge and are negotiated during dialogue (Tuler, 2000).
Based upon these considerations, we now focus on the potential benefits of argumentative dialogue as a two-way communication mode for addressing OESI in the context of dealing with (conflicting) scientific knowledge and socio-scientific issues. Numerous studies point out that dealing with complex content within argumentative dialogue has a positive effect on reasoning about information in online contexts [an overview is given in a meta-analysis by Noroozi et al. (2012)]. In order to successfully co-construct an elaborated understanding of an issue (e.g., Teasley, 1997; Chi, 2009), users need to apply “reasoning that operates on the reasoning of another” (transactivity, Berkowitz and Gibbs, 1983, p. 402). In this sense, transactive dialogue, as a specific form of two-way argumentative dialogue, requires coherent reference and mutual elaboration of each other's contributions by aiming at the integration of different knowledge backgrounds and perspectives (Asterhan and Schwarz, 2016). However, before well-elaborated consensus building is achieved, each contribution needs to be scrutinized critically (conflict-oriented consensus building; Fischer et al., 2002). Accordingly, an important feature of this type of consensus building is that individuals do not accept contributions of their partners as they are. In this context, efficient communication comprises strategies that directly address and challenge the argumentative structure and content (e.g., scientific evidence) of the other's contributions (Mayweg-Paus and Macagno, 2016; Mayweg-Paus et al., 2016a). In particular, critical questioning seems to be a strong argumentative strategy given its capacity to address deeper grounds of disagreement, bringing to light background knowledge and knowledge beliefs that might otherwise escape attention. In such cases, a goal is to avoid pseudo-agreements or pseudo-disagreements (Jucks and Paus, 2013) and to focus the discussion on the true source of differences in opinion.
Consequently, asking critical questions seems to play a pivotal role in the context of knowledge construction (Chinn and Osborne, 2008) and for developing insights into not only science-related issues (Mayweg-Paus et al., 2016b; Thiebach et al., 2016) but also history (Wissinger and De La Paz, 2016) and public policy (Song and Ferretti, 2013).
When individuals hold one another accountable to standards for accurately collecting and interpreting information and validly using information as evidence, two-way communication offers distinct advantages over one-way communication. However, a two-way discussion can also undermine the quality of reasoning about evidence. The same set of forces that drive motivated reasoning when individuals think alone [see section (Prior) Attitudes and Beliefs] can also compromise reasoning when individuals engage in dialogue. Critical discussions, particularly those that polarize views on a topic (Kuhn and Lao, 1996), can prompt individuals to both overvalue confirming evidence and discount disconfirming evidence (Schulz-Hardt et al., 2002). This phenomenon is particularly concerning in Internet forums that attract users with polarized views on public issues [Baek et al., 2012; see also section (Prior) Attitudes and Beliefs]. Dialogue can also elicit adversarial behaviors that undermine the potential benefits of two-way communication. Thus, under some conditions, the competitive epistemic goals of persuasion can trigger competitive interpersonal goals that foreclose transactive dialogue (Asterhan, 2013; Felton et al., 2019). When speakers conflate the two goals, they tend to repeat themselves without elaborating their arguments, disagree without explaining why, and advance a barrage of arguments without addressing each other's counterarguments (Felton et al., 2015b). On the other hand, two-way communication can also trigger collaborative interpersonal goals that undermine dialogue. Several studies suggest that face threat can lead speakers to avoid critical discussion (see, e.g., Asterhan, 2013; Felton et al., 2019). The phenomenon may be particularly problematic when speakers encounter disagreement unexpectedly during in-group dialogue.
In these circumstances, speakers are more likely to prioritize group or interpersonal cohesion over engaging in critical discussion and transactive dialogue (Concannon et al., 2015).
Diverging Opinions and Dialogic Engagement
Collaboratively achieving epistemic aims in dialogic argument depends substantially on the discourse partners' efforts to deeply elaborate on and challenge their partner's knowledge and arguments (e.g., Kuhn and Udell, 2003). In this context, the dialogic character (or two-way mode) of argumentation can support OESI through (a) enhancing the quality of argumentation and the use of evidence (Crowell and Kuhn, 2014; Mayweg-Paus and Macagno, 2016) and (b) the evaluation and reconciliation of diverging claims (Nussbaum and Edwards, 2011; Felton et al., 2015a). In an argumentative dialogue, a person is subject to the interlocutor's scrutiny of her own position, which enhances her need to be more critical not only toward her own position but also toward the opposing one. In such dialogues, the reasons for preferring one point of view or one piece of evidence over another must be analyzed by taking a critical stance toward the presented evidence (Osborne et al., 2004). This challenge can only be addressed by drawing on sufficient evidence and elaborating in greater depth on the different viewpoints and their backings (burden of proof, Walton and Macagno, 2007; Macagno and Walton, 2012).
There are several ways to address these potential challenges to effortful two-way, critical discussion. One effective strategy is to mitigate or de-emphasize competitive interpersonal goals by focusing attention on the epistemic aims of discourse. In the context of one-way communication, giving individuals specific instructions to generate reasons (Ferretti et al., 2000) or counter-arguments and rebuttals (Nussbaum and Kardash, 2005) has reduced my-side bias in writing when compared with instructions to persuade the audience. In two-way communication, focusing on collaborative epistemic aims (deliberation) as opposed to competitive epistemic aims (persuasion) in dialogue can lead to decreased interpersonal competitive behaviors and an increase in transactive dialogue (Felton et al., 2015b). These same conditions can also mitigate confirmation bias (Villarroel et al., 2016). However, it is important to note that in these studies, speakers were paired with someone who disagreed with them on the topic of discussion, and, therefore, the studies were designed to elicit critical dialogue. Moreover, explicit expressions of disagreement in YouTube comment sections have the potential to foster collaborative interaction (Dubovi and Tabak, 2020). What emerges in studies that compare competitive and collaborative epistemic aims is an optimization problem. Dissent is a valuable component in overcoming motivated reasoning, particularly when measures are taken to reduce the risk of losing face (Schulz-Hardt et al., 2006). Thus, dialogue can be structured to explicitly make room for dissent (Schulz-Hardt et al., 2006). However, cognitive engagement is an important ingredient in such conversations (Kuhn and Lao, 1996), and focusing transactive dialogue on epistemic aims enhances the quality of reasoning.
Individuals must hold themselves accountable for expressing disagreement when it arises, to avoid quick consensus, while simultaneously focusing on collaborative interpersonal goals to promote transactive dialogue (Asterhan, 2013; Thiebach et al., 2016). Ultimately, collaboratively achieving epistemic aims involves focusing dialogue on epistemic aims while threading the needle of interpersonal goals to produce a social-emotional context that fosters critical discussion.
Reciprocity of Dialogic and Individual Engagement
Collaboratively dealing with diverging (or even conflicting) claims might hold potential for the development of individual epistemological understanding, as it brings to light the existence of multiple perspectives and can promote a more balanced integration of pro and counter arguments in one's line of reasoning. Empirical evidence shows that individuals who, in intervention studies, engaged both in computer-mediated argumentation with peers and in reflective activities for an extended period of time showed improvements in their ability to evaluate others' arguments and the evidence supporting those arguments (Iordanou and Constantinou, 2015; Mayweg-Paus et al., 2016b). Further, engaging in an online discourse with peers who held an opposing view (vs. the same view) led to different inquiry behavior during online discussions and to different gains in terms of argument skills. In particular, in Iordanou and Kuhn's (2020) study, individuals who engaged online in discussions with peers holding an opposing view chose to search for information regarding the opposing alternative first when given the opportunity. In contrast, individuals who engaged in online discussions with same-side individuals preferred to seek information related to their own position. Differences were also observed in the prevalence and types of functional evidence-based argumentive idea units in individual final essays, and they favored the students who engaged online in discussions with peers holding an opposing view. Here, engagement in online discussions with individuals holding opposing or same-side views may have fostered an epistemological understanding of recognizing that the other is reasoning from a perspective different from one's own, but that this perspective is still worth examining (Iordanou and Kuhn, 2020), or that it is important to take a step back and re-evaluate one's own understanding (Forzani, 2019).
However, most people typically show difficulties with being able “to construct fully justified dual-position arguments and to explain and reconcile differences between accounts” (Barzilai and Ka'adan, 2017, p. 223). Apparently, recognizing multiperspectivity does not automatically mean one can apply sophisticated strategies when evaluating opposing views or arguments. Based on several empirical findings, Kuhn (2019) addresses this point by suggesting that understanding multivariable causality is a link toward evaluating and integrating multiple perspectives. Following this approach, OESI should include information- (or knowledge-) seeking activities for identifying and negotiating the multiple factors that can cause a phenomenon and to bring them into ongoing discussion.
In sum, dialogic engagement can take a number of forms depending on interactivity (one-way, two-way bounded, two-way unbounded), epistemic purposes (information seeking, discovery, inquiry, deliberation, negotiation, persuasion, eristic), and interpersonal dynamics of communication (collaborative, competitive). When we combine these variables, a complex array of permutations results. When individuals engage with others online about information, they gain access to critical dialogue that can enhance reasoning by focusing attention on the epistemic aims, ideals, and reliable processes governing the use of information (Chinn et al., 2014). These epistemic concerns, when combined with critical dialogue, enhance reasoning about information and may even promote growth that transfers to individual engagement. However, to be successful in this endeavor, individuals must work collaboratively with others to examine their reasoning, even when epistemic aims are competitive.
Conclusion and Implications
In this paper, we have addressed conditions that may benefit, but also hinder effortful online engagement with scientific information (OESI). The Internet offers users immediate access to a wide variety of information on socio-scientific issues, and also allows for user agency and interactivity. Coiro (2015) argued that, in theory, the Internet is an ideal place to engage with (scientific) information to achieve deeper learning and understanding and—from a reading perspective—she presents strategies learners need to achieve such epistemic aims. However, it is not realistic to assume that readers will allocate their cognitive and motivational resources to the systematic processing of all information they find online regarding an issue of interest (Stadtler, 2017). As such, our review collects literature on cognitive and motivational processes that may help individuals overcome constraints and utilize affordances of scientific information in the online environment. Before concluding this literature review with a discussion of our heuristic model of OESI, we elaborate on how to foster individual and dialogic OESI in (higher) education.
We have discussed several context factors that may both constrain and motivate effortful OESI. Dealing with the uncertainty and complexity of scientific knowledge (emphasized in online information environments) is a challenge that might be hard to overcome (Kienhues et al., 2020). In consequence, individuals might often encounter conflicts—of newly acquired information with their own beliefs, between information sources, or between their beliefs and those of their dialogue partner. As this review has shown, critical and deliberative scrutiny of information is central to OESI. That is, engagement should be directed at achieving epistemic aims while holding oneself and others accountable to appropriate epistemic ideals, at applying reliable processes in information search, selection, evaluation, integration, and in engaging in dialogue with others (in line with apt epistemic performance, Barzilai and Chinn, 2018). However, as we have discussed, cognitive biases (such as confirmation bias, motivated cognition, and competitive interpersonal goals) constrain otherwise reliable processes and may sometimes emerge under the guise of “critical thinking” (e.g., being critical toward experts' claims has also become a rhetorical device of science skeptics). As such, in order to counter one-sided reasoning and argumentation, open-minded thinking is directly beneficial to effortful OESI, because it entails the willingness to hold all views (including one's own) up to critical scrutiny, even at the risk of identity threats, in order to follow through with epistemic aims (Taylor, 2016). Open-mindedness has been shown to benefit not only individual engagement with scientific information, such as knowledge about scientific issues and argument evaluation (Sinatra et al., 2003; Southerland and Sinatra, 2006; West et al., 2008), but also dialogic engagement with others (Kuhn and Udell, 2003, 2007).
We argue that it is through a balance of (individual or dialogic) critical examination of information and open-minded thinking that goal-directed and effortful OESI emerges. Sharon and Baram-Tsabari (2020) provide examples of several educational approaches to foster open-minded thinking, such as exposure to exemplars of virtues and practicing virtuous behaviors.
One environment that holds high potential for directly instructing critical and open-minded thinking by employing authentic search tasks is the higher education classroom. This environment is suitable for two reasons: First, students are already instructed in dealing successfully with theories, models, evidence, and arguments within their discipline. Golan Duncan et al. (2018) identify that understanding experts' evidentiary practices (how experts analyze, evaluate, interpret, and integrate evidence to derive and inform theories) and being able to rely on scientific evidence even though one's own understanding of science is bounded (lay epistemic practices) are central for laypeople's ability to deal with scientific evidence. Searching for information on socio-scientific topics (related somewhat to a learner's area of expertise) is an ill-structured but solvable task, and it may also allow for reflection on the boundaries of students' expertise, especially when they are granted the opportunity to engage in dialogue with students from different disciplinary backgrounds or with diverging views on the issue. Second, while scientific inquiry tasks, such as lab work, are important for achieving procedural knowledge in students' own discipline, there is limited opportunity for learners to develop an understanding of the social processes used to create reliable knowledge; however, both scientific knowledge and digital media entail social affordances allowing for dialogic engagement in authentic search tasks.
We have previously argued that the two parts—individual and dialogic engagement—are reciprocal rather than separate or sequential. While individual engagement might prepare the individual to engage in dialogue with others, such dialogic engagement might not only induce more individual engagement, but it may also foster skills and strategies needed for practices in individual engagement. Engaging learners in collaborative reasoning and argumentation about scientific information fosters individuals' epistemic cognition (e.g., Iordanou, 2016; Fisher et al., 2017), but it also creates a space to collaboratively reflect and elaborate on online scientific information in two ways: First, individuals may discuss the quality of online information, and, second, they may critically reflect and reason collaboratively on the criteria that guided their evaluations (Barzilai and Chinn, 2018). Thus, dialogue with others entails the potential to reflect on both one's own and others' individual engagement practices (Mayweg-Paus et al., 2020). In particular, to promote the development of epistemological understanding in their students, educational scholars need to address searching to learn as an information-seeking activity within the process of argumentation as well as learning to search in the context of argumentative dialogue (Redfors et al., 2014), which works as a mechanism for critical reflection on sourcing strategies, information providers, and media, and may also serve knowledge co-construction (Dubovi and Tabak, 2020). In this way, online dialogue becomes not only a medium for the transfer of information but also a means by which we gain epistemological insight into the nature of information and its many uses in our communication with others.
In our heuristic model, two aspects are not discussed in further depth. First, we decided not to define the cognitive and behavioral manifestations of the practices of engagement. Several descriptive models and literature reviews exist that describe one or several of these practices and their interrelations in more detail (in the context of multiple documents comprehension: e.g., Rouet and Britt, 2011; List and Alexander, 2017a; 2017b; epistemic cognition: e.g., Chinn et al., 2011, 2014; digital literacy: e.g., Cho and Afflerbach, 2017; Coiro, 2020; functional scientific literacy: e.g., Tabak, 2015). Second, we have not described how individuals would achieve their epistemic aims (the outcome of engagement), and whether it is feasible to assume that individuals would always achieve these through goal-directed and effortful OESI. While there are models outlining knowledge integration with prior information (Richter, 2015), integration of diverging sources (Braasch and Bråten, 2017), and knowledge co-construction through collaborative dialogue and argumentation (Asterhan and Schwarz, 2016; Iordanou et al., 2019a), further research should investigate how knowledge construction takes place in authentic online information search (in contrast to dealing with provided information—often in text form—in a research or classroom setting), especially taking into account online-specific constraints and affordances. Newer studies have increasingly included combinations of process and outcome variables to more comprehensively examine online engagement (e.g., Bråten et al., 2014a,b; List and Alexander, 2018; Kammerer et al., 2020), or even tested theoretical models linking cognition, motivation, and learning (e.g., Muis et al., 2015). Furthermore, goal-directed and effortful OESI requires metacognitive knowledge and skills, such as ongoing updating of the search task and monitoring of one's process (Barzilai and Chinn, 2018).
A few studies have used think-aloud methods to investigate individuals' (epistemic) meta-cognition during online engagement (e.g., Mason et al., 2011; Barzilai and Zohar, 2012). While we think that such approaches should guide future empirical investigations into practices within OESI, our literature review also shows that there is ample research and evidence that future studies may build on.
Furthermore, our heuristic model of OESI could be extended in the future to include a larger variety of online information. Information technologies are constantly changing and with them users' access to information (e.g., on different devices, in different apps), information formats (e.g., interactive representations and video), information design (e.g., the use of nudges), and distribution (e.g., by algorithms, artificial intelligence). Hence, engagement with online information (and dealing with new and unique constraints and affordances) might already, or will in the future, involve even more steps, strategies, or skills (as well as many more variables mediating their effortful execution) than we have discussed in this review. Research on users' cognition and behavior in dealing with online scientific information—and especially on communication formats beyond informational text—is still sparse, but is growing in different disciplines (e.g., psychology, educational sciences, communication science, information sciences). We hope that future research will strive toward further integration of theoretical ideas and models within and across disciplinary bounds.
Author Contributions
FH and EM-P conceived and presented the idea and structure of the article. FH took the lead in writing the manuscript. EM-P, MF, KI, and MZ each engaged in writing sections of the article. All authors provided feedback and ideas.
Conflict of Interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Acknowledgments
The idea for this article is the result of a workshop in June 2019 that received funding from two special interest groups (SIG) of the European Association for Research on Learning and Instruction (EARLI). We would like to thank the coordinators of the EARLI SIGs Inquiry Learning and Argumentation, Dialogue, and Reasoning, and the participants of the workshop. We furthermore thank Celeste Brennecka for language editing. We acknowledge support from the Open Access Publication Fund of the University of Muenster.
References
Alexander, P. A., and The Disciplined Reading and Learning Research Laboratory (2012). Reading into the future: competence for the 21st century. Educ. Psychol. 47, 259–280. doi: 10.1080/00461520.2012.722511
Anderson, A. A., Brossard, D., Scheufele, D. A., Xenos, M. A., and Ladwig, P. (2014). The “nasty effect:” online incivility and risk perceptions of emerging technologies. J. Comput. Commun. 19, 373–387. doi: 10.1111/jcc4.12009
Ariel, Y., and Avidar, R. (2015). Information, interactivity, and social media. Atlan. J. Commun. 23, 19–30. doi: 10.1080/15456870.2015.972404
Asterhan, C. S. C. (2013). “Epistemic and interpersonal dimensions of peer argumentation: conceptualization and quantitative assessment,” in Affective Learning Together, eds M. Baker, J. Andriessen and S. Jarvela (New York, NY: Routledge, Advances in Learning and Instruction series), 251–271
Asterhan, C. S. C., and Schwarz, B. B. (2016). Argumentation for learning: well-trodden paths and unexplored territories. Educ. Psychol. 51, 164–187. doi: 10.1080/00461520.2016.1155458
Baek, Y. M., Wojcieszak, M., and Delli Carpini, M. X. (2012). Online versus face-to-face deliberation: Who? Why? What? With what effects? New Media Soc. 14, 363–383. doi: 10.1177/1461444811413191
Baram-Tsabari, A., and Schejter, A. M. (2019). “New media: a double-edged sword in support of Public Engagement with Science,” in Learning In a Networked Society, eds Y. Kali, A. Baram-Tsabari, and A. M. Schejter (Cham: Springer), 79–95. doi: 10.1007/978-3-030-14610-8_5
Barzilai, S., and Chinn, C. A. (2018). On the goals of epistemic education: promoting apt epistemic performance. J. Learn. Sci. 27, 353–389. doi: 10.1080/10508406.2017.1392968
Barzilai, S., and Chinn, C. A. (2019). “Epistemic thinking in a networked society: contemporary challenges and educational responses,” in Learning in a Networked Society, eds Y. Kali, A. Baram-Tsabari, and A. M. Schejter (Cham: Springer). doi: 10.1007/978-3-030-14610-8_4
Barzilai, S., and Eshet-Alkalai, Y. (2015). The role of epistemic perspectives in comprehension of multiple author viewpoints. Learn Instr. 36, 86–103. doi: 10.1016/j.learninstruc.2014.12.003
Barzilai, S., and Ka'adan, I. (2017). Learning to integrate divergent information sources: the interplay of epistemic cognition and epistemic metacognition. Metacogn. Learn. 12, 193–232. doi: 10.1007/s11409-016-9165-7
Barzilai, S., and Zohar, A. (2012). Epistemic thinking in action: evaluating and integrating online sources. Cogn. Instr. 30, 39–85. doi: 10.1080/07370008.2011.636495
Barzilai, S., and Zohar, A. (2016). “Epistemic (meta)cognition: ways of thinking about knowledge and knowing,” in Handbook of Epistemic Cognition, eds J. A. Greene, W. A. Sandoval, and I. Bråten (London: Routledge), 409–424
Bender, J., Rothmund, T., Nauroth, P., and Gollwitzer, M. (2016). How moral threat shapes laypersons' engagement with science. Pers. Soc. Psychol. Bull. 42, 1723–1735. doi: 10.1177/0146167216671518
Berkowitz, M. W., and Gibbs, J. C. (1983). Measuring the developmental features of moral discussion. Merrill-Palmer Q. 29, 399–410
Berland, L. K., and Reiser, B. J. (2009). Making sense of argumentation and explanation. Sci. Educ. 93, 26–55. doi: 10.1002/sce.20286
Boykoff, M. T., and Boykoff, J. M. (2004). Balance as bias: global warming and the US prestige press. Glob. Environ. Chang. 14, 125–136. doi: 10.1016/j.gloenvcha.2003.10.001
Bråten, I., Anmarkrud, Ø., Brandmo, C., and Strømsø, H. I. (2014a). Developing and testing a model of direct and indirect relationships between individual differences, processing, and multiple-text comprehension. Learn. Instr. 30, 9–24. doi: 10.1016/j.learninstruc.2013.11.002
Bråten, I., Brante, E. W., and Strømsø, H. I. (2018). What really matters: the role of behavioural engagement in multiple document literacy tasks. J. Res. Read. 41, 680–699. doi: 10.1111/1467-9817.12247
Bråten, I., Ferguson, L. E., Anmarkrud, Ø., Strømsø, H. I., and Brandmo, C. (2014b). Modeling relations between students' justification for knowing beliefs in science, motivation for understanding what they read in science, and science achievement. Int. J. Educ. Res. 66, 1–12. doi: 10.1016/j.ijer.2014.01.004
Bråten, I., Salmerón, L., and Strømsø, H. I. (2016). Who said that? Investigating the plausibility-induced source focusing assumption with Norwegian undergraduate readers. Contemp. Educ. Psychol. 46, 253–262. doi: 10.1016/j.cedpsych.2016.07.004
Bråten, I., and Strømsø, H. I. (2006). Epistemological beliefs, interest, and gender as predictors of Internet-based learning activities. Comput. Hum. Behav. 22, 1027–1042. doi: 10.1016/j.chb.2004.03.026
Bråten, I., and Strømsø, H. I. (2010). Effects of task instruction and personal epistemology on the understanding of multiple texts about climate change. Disc. Proc. 47, 1–31. doi: 10.1080/01638530902959646
Bråten, I., Strømsø, H. I., and Salmerón, L. (2011). Trust and mistrust when students read multiple information sources about climate change. Learn. Instr. 21, 180–192. doi: 10.1016/j.learninstruc.2010.02.002
Braasch, J. L. G., and Bråten, I. (2017). The discrepancy-induced source comprehension (D-ISC) model: basic assumptions and preliminary evidence. Educ. Psychol. 52, 167–181. doi: 10.1080/00461520.2017.1323219
Braasch, J. L. G., Rouet, J.-F., Vibert, N., and Britt, M. A. (2012). Readers' use of source information in text comprehension. Mem. Cognit. 40, 450–465. doi: 10.3758/s13421-011-0160-6
Brante, E. W., and Strømsø, H. I. (2017). Sourcing in text comprehension: a review of interventions targeting sourcing skills. Educ. Psychol. Rev. 30, 773–799. doi: 10.1007/s10648-017-9421-7
Breakstone, J., McGrew, S., Smith, M., Ortega, T., and Wineburg, S. (2018). Teaching students to navigate the online landscape. Soc. Educ. 82, 219–221. Available online at: https://www.ingentaconnect.com/content/ncss/se/2018/00000082/00000004/art00010#expand/collapse
Britt, M. A., and Aglinskas, C. (2002). Improving students' ability to identify and use source information. Cogn. Instr. 20, 485–522. doi: 10.1207/S1532690XCI2004_2
Britt, M. A., Rouet, J. F., Blaum, D., and Millis, K. (2019). A reasoned approach to dealing with fake news. Policy Insights Behav. Brain Sci. 6, 94–101. doi: 10.1177/2372732218814855
Bromme, R., and Goldman, S. R. (2014). The public's bounded understanding of science. Educ. Psychol. 49, 59–69. doi: 10.1080/00461520.2014.921572
Bromme, R., Kienhues, D., and Porsch, T. (2010). “Who knows what and who can we believe? Epistemological beliefs are beliefs about knowledge (mostly) attained from others,” in Personal Epistemology in the Classroom: Theory, Research, and Implications for Practice, eds L. D. Bendixen and F. C. Feucht (Cambridge: Cambridge University Press), 163–194. doi: 10.1017/CBO9780511691904.006
Brossard, D., and Scheufele, D. A. (2013). Science, new media, and the public. Science 339, 40–41. doi: 10.1126/science.1232329
Cappella, J. N., Kim, H. S., and Albarracín, D. (2015). Selection and transmission processes for information in the emerging media environment: psychological motives and message characteristics. Media Psychol. 18, 396–424. doi: 10.1080/15213269.2014.941112
Chi, M. T. H. (2009). Active-constructive-interactive: a conceptual framework for differentiating learning activities. Top. Cogn. Sci. 1, 73–105. doi: 10.1111/j.1756-8765.2008.01005.x
Chinn, C. A., and Brewer, W. F. (1998). An empirical test of a taxonomy of responses to anomalous data in science. J. Res. Sci. Teach. 35, 623–654. doi: 10.1002/(SICI)1098-2736(199808)35:6<623::AID-TEA3>3.0.CO;2-0
Chinn, C. A., Buckland, L. A., and Samarapungavan, A. (2011). Expanding the dimensions of epistemic cognition: arguments from philosophy and psychology. Educ. Psychol. 46, 141–167. doi: 10.1080/00461520.2011.587722
Chinn, C. A., and Osborne, J. (2008). Students' questions: a potential resource for teaching and learning science. Stud. Sci. Educ. 44, 1–39. doi: 10.1080/03057260701828101
Chinn, C. A., Rinehart, R. W., and Buckland, L. A. (2014). “Epistemic cognition and evaluating information: applying the AIR model of epistemic cognition,” in Processing Inaccurate Information: Theoretical and Applied Perspectives from Cognitive Science and the Educational Sciences, eds D. N. Rapp and J. L. G. Braasch (Cambridge, MA: MIT Press), 425–453
Cho, B.-Y., and Afflerbach, P. (2017). “An evolving perspective of constructively responsive reading comprehension strategies in multilayered digital text environments,” in Handbook of Research on Reading Comprehension, eds S. E. Israel (New York, NY: Guilford Press), 109–134.
Coiro, J. (2015). “Purposeful, critical, and flexible: vital dimensions of online reading and learning,” in Reading at a Crossroads? Disjunctures and Continuities in Current Conceptions and Practice, eds R. J. Spiro, M. DeSchrvyer, M. S. Morsink, P. M. Hagerman, and P. Thompson (New York, NY: Routledge), 53–64.
Coiro, J. (2020). Toward a multifaceted heuristic of digital reading to inform assessment, research, practice, and policy. Read. Res. Q. doi: 10.1002/rrq.302. [Epub ahead of print].
Coiro, J., Coscarelli, C., Maykel, C., and Forzani, E. (2015). Investigating criteria that seventh graders use to evaluate the quality of online information. J. Adolesc. Adult Lit. 59, 287–297. doi: 10.1002/jaal.448
Concannon, S., Healey, P. G., and Purver, M. (2015). Taking a Stance: A Corpus Study of Reported Speech. Available online at: https://qmro.qmul.ac.uk/xmlui/bitstream/handle/123456789/9041/CONCANNONShiftingopinions2015Published.pdf?sequence=2 (accessed April 19, 2020).
Corner, A., and Hahn, U. (2009). Evaluating science arguments: evidence, uncertainty, and argument strength. J. Exp. Psychol. Appl. 15, 199–212. doi: 10.1037/a0016533
Crowell, A., and Kuhn, D. (2014). Developing dialogic argumentation skills: a 3-year intervention study. J. Cogn. Dev. 15, 363–381. doi: 10.1080/15248372.2012.725187
Deci, E. L. (1992). “The relation of interest to the motivation of behavior: a self-determination theory perspective,” in The Role of Interest in Learning and Development, eds K. A. Renninger, S. Hidi, and A. Krapp (Mahwah, NJ: Lawrence Erlbaum Associates, Inc), 43–70.
Dieckmann, N. F., Johnson, B. B., Gregory, R., Mayorga, M., Han, P. K. J., and Slovic, P. (2017). Public perceptions of expert disagreement: bias and incompetence or a complex and random world? Public Underst. Sci. 26, 325–338. doi: 10.1177/0963662515603271
Dragojevic, M., Gasiorek, J., and Giles, H. (2016). “Accommodative Strategies as Core of the Theory,” in Communication Accommodation Theory, ed. H. Giles (Cambridge: Cambridge University Press), 36–59. doi: 10.1017/CBO9781316226537.003
Dubovi, I., and Tabak, I. (2020). An empirical analysis of knowledge co-construction in YouTube comments. Comput. Educ. 156:103939. doi: 10.1016/j.compedu.2020.103939
Dunwoody, S., and Griffin, R. J. (2015). “Risk Information Seeking and Processing Model,” in The SAGE Handbook of Risk Communication, eds H. Cho, T. Reimer, and K. A. McComas (Thousand Oaks, CA: SAGE Publications, Inc.), 102–116. doi: 10.4135/9781483387918.n14
Eccles, J., and Wang, M.-T. (2012). “Part I Commentary: So what is student engagement anyway?,” in Handbook of Research on Student Engagement, eds S. L. Christenson, A. L. Reschly, and C. Wylie (New York, NY: Springer US) 133–145. doi: 10.1007/978-1-4614-2018-7_6
Elgesem, D., Steskal, L., and Diakopoulos, N. (2015). Structure and content of the discourse on climate change in the blogosphere: the big picture. Environ. Commun. 9, 169–188. doi: 10.1080/17524032.2014.983536
Evans, S. K., Pearce, K. E., Vitak, J., and Treem, J. W. (2016). Explicating affordances: a conceptual framework for understanding affordances in communication research. J. Comput. Mediat. Commun. 22, 35–52. doi: 10.1111/jcc4.12180
Feinstein, N. W., and Waddington, D. I. (2020). Individual truth judgments or purposeful, collective sensemaking? Rethinking science education's response to the post-truth era. Educ. Psychol. 55, 155–166. doi: 10.1080/00461520.2020.1780130
Felton, M., Crowell, A., Garcia-Mila, M., and Villarroel, C. (2019). Capturing deliberative argument: an analytic coding scheme for studying argumentative dialogue and its benefits for learning. Learn. Cult. Soc. Interact. doi: 10.1016/j.lcsi.2019.100350. [Epub ahead of print].
Felton, M., Crowell, A., and Liu, T. (2015a). Arguing to agree: mitigating my-side bias through consensus-seeking dialogue. Writ. Commun. 32, 317–331. doi: 10.1177/0741088315590788
Felton, M., Garcia-Mila, M., Villarroel, C., and Gilabert, S. (2015b). Arguing collaboratively: argumentative discourse types and their potential for knowledge building. Br. J. Educ. Psychol. 85, 372–386. doi: 10.1111/bjep.12078
Ferguson, L. E., and Bråten, I. (2013). Student profiles of knowledge and epistemic beliefs: changes and relations to multiple-text comprehension. Learn. Instr. 25, 49–61. doi: 10.1016/j.learninstruc.2012.11.003
Ferguson, L. E., Bråten, I., and Strømsø, H. I. (2012). Epistemic cognition when students read multiple documents containing conflicting scientific evidence: a think-aloud study. Learn. Instr. 22, 103–120. doi: 10.1016/j.learninstruc.2011.08.002
Ferretti, R. P., MacArthur, C. A., and Dowdy, N. S. (2000). The effects of an elaborated goal on the persuasive writing of students with learning disabilities and their normally achieving peers. J. Educ. Psychol. 92, 694–702. doi: 10.1037/0022-0663.92.4.694
Fischer, F., Bruhn, J., Gräsel, C., and Mandl, H. (2002). Fostering collaborative knowledge construction with visualization tools. Learn. Instr. 12, 213–232. doi: 10.1016/S0959-4752(01)00005-6
Fischer, F., Kollar, I., Ufer, S., Sodian, B., Hussmann, H., Pekrun, R., et al. (2014). Scientific reasoning and argumentation: advancing an interdisciplinary research agenda in education. Front. Learn. Res. 2, 28–45. doi: 10.14786/flr.v2i2.96
Fischer, P., Jonas, E., Frey, D., and Schulz-Hardt, S. (2005). Selective exposure to information: the impact of information limits. Eur. J. Soc. Psychol. 35, 469–492. doi: 10.1002/ejsp.264
Fisher, M., Goddu, M. K., and Keil, F. C. (2015). Searching for explanations: how the Internet inflates estimates of internal knowledge. J. Exp. Psychol. Gen. 144, 674–687. doi: 10.1037/xge0000070
Fisher, M., Knobe, J., Strickland, B., and Keil, F. C. (2017). The influence of social interaction on intuitions of objectivity and subjectivity. Cogn. Sci. 41, 1119–1134. doi: 10.1111/cogs.12380
Flemming, D., Feinkohl, I., Cress, U., and Kimmerle, J. (2015). Individual uncertainty and the uncertainty of science: the impact of perceived conflict and general self-efficacy on the perception of tentativeness and credibility of scientific information. Front. Psychol. 6:1859. doi: 10.3389/fpsyg.2015.01859
Flemming, D., Kimmerle, J., Cress, U., and Sinatra, G. M. (2020). Research is tentative, but that's okay: overcoming misconceptions about scientific tentativeness through refutation texts. Discourse Process. 57, 17–35. doi: 10.1080/0163853X.2019.1629805
Forzani, E. (2019). A three-tiered framework for proactive critical evaluation during online inquiry. J. Adolesc. Adult Lit. 63, 401–414. doi: 10.1002/jaal.1004
Friedman, S. M., Dunwoody, S., and Rogers, C. L. (1999). Communicating Uncertainty: Media Coverage of New and Controversial Science. New York, NY: Routledge. doi: 10.4324/9781410601360
Gao, F., Luo, T., and Zhang, K. (2012). Tweeting for learning: a critical analysis of research on microblogging in education published in 2008–2011. Br. J. Educ. Technol. 43, 783–801. doi: 10.1111/j.1467-8535.2012.01357.x
Gierth, L., and Bromme, R. (2020). Beware of vested interests: epistemic vigilance improves reasoning about scientific evidence (for some people). PLoS ONE 15:e0231387. doi: 10.1371/journal.pone.0231387
Golan Duncan, R., Chinn, C. A., and Barzilai, S. (2018). Grasp of evidence: problematizing and expanding the next generation science standards' conceptualization of evidence. J. Res. Sci. Teach. 55, 907–937. doi: 10.1002/tea.21468
Goldman, S. R., and Scardamalia, M. (2013). Managing, understanding, applying, and creating knowledge in the information age: next-generation challenges and opportunities. Cogn. Instr. 31, 255–269. doi: 10.1080/07370008.2013.773217
Greene, J. A., Copeland, D. Z., and Deekens, V. M. (2020). A model of technology incidental learning effects. Educ. Psychol. Rev. doi: 10.1007/s10648-020-09575-5. [Epub ahead of print].
Greving, H., and Sassenberg, K. (2015). Counter-regulation online: threat biases retrieval of information during Internet search. Comput. Hum. Behav. 50, 291–298. doi: 10.1016/j.chb.2015.03.077
Griffin, R. J., Dunwoody, S., and Neuwirth, K. (1999). Proposed model of the relationship of risk information seeking and processing to the development of preventive behaviors. Environ. Res. 80, 230–245. doi: 10.1006/enrs.1998.3940
Guthrie, J. T., and Klauda, S. L. (2016). “Engagement and Motivational Processes in Reading,” in Handbook of Individual Differences in Reading: Reader, Text, and Context, ed. P. Afflerbach (New York, NY: Routledge), 41–53.
Haas, A., and Unkel, J. (2017). Ranking versus reputation: perception and effects of search result credibility. Behav. Inf. Technol. 36, 1285–1298. doi: 10.1080/0144929X.2017.1381166
Hautala, J., Kiili, C., Kammerer, Y., Loberg, O., Hokkanen, S., and Leppänen, P. H. T. (2018). Sixth graders' evaluation strategies when reading Internet search results: an eye-tracking study. Behav. Inf. Technol. 37, 761–773. doi: 10.1080/0144929X.2018.1477992
Head, A., and Eisenberg, M. B. (2010). Truth be told: how college students evaluate and use information in the digital age. Proj. Inf. Lit. Prog. Rep. 1–72. doi: 10.2139/ssrn.2281485
Hendriks, F., and Kienhues, D. (2019). “Science understanding between scientific literacy and trust: contributions from psychological and educational research,” in Science Communication, eds A. Leßmöllmann, M. Dascal, and T. Gloning (Berlin, Boston, MA: De Gruyter), 29–50. doi: 10.1515/9783110255522-002
Hendriks, F., Kienhues, D., and Bromme, R. (2015). Measuring laypeople's trust in experts in a digital age: the muenster epistemic trustworthiness inventory (METI). PLoS ONE 10:e0139309. doi: 10.1371/journal.pone.0139309
Hendriks, F., Kienhues, D., and Bromme, R. (2016). Evoking vigilance: would you (dis)trust a scientist who discusses ethical implications of research in a science blog? Public Underst. Sci. 25, 992–1008. doi: 10.1177/0963662516646048
Hendriks, F., Kienhues, D., and Bromme, R. (2020). Replication crisis = trust crisis? The effect of successful vs failed replications on laypeople's trust in researchers and research. Public Underst. Sci. 29, 270–288. doi: 10.1177/0963662520902383
Hilligoss, B., and Rieh, S. Y. (2008). Developing a unifying framework of credibility assessment: construct, heuristics, and interaction in context. Inf. Process. Manag. 44, 1467–1484. doi: 10.1016/j.ipm.2007.10.001
Hofer, B. K. (2004). Epistemological understanding as a metacognitive process: thinking aloud during online searching. Educ. Psychol. 39, 43–55. doi: 10.1207/s15326985ep3901_5
Hopkins, J. (2016). “The Concept of Affordances in Digital Media,” in Handbuch Soziale Praktiken und Digitale Alltagswelten, eds H. Friese, G. Rebane, M. Nolden, and M. Schreiter (Wiesbaden: Springer Fachmedien Wiesbaden), 1–8. doi: 10.1007/978-3-658-08460-8_67-1
Hornikx, J. (2008). Comparing the actual and expected persuasiveness of evidence types: how good are lay people at selecting persuasive evidence? Argumentation 22, 555–569. doi: 10.1007/s10503-007-9067-6
Howard, J. (2020). Should You Wear a Mask? US Health Officials Re-examine Guidance Amid Coronavirus Crisis. CNN. Available online at: https://edition.cnn.com/2020/03/31/health/coronavirus-masks-experts-debate/index.html.
Iordanou, K. (2016). Developing epistemological understanding in scientific and social domains through argumentation. Zeitschrift für Pädagogische Psychol. 30, 109–119. doi: 10.1024/1010-0652/a000172
Iordanou, K., and Constantinou, C. P. (2015). Supporting use of evidence in argumentation through practice in argumentation and reflection in the context of SOCRATES learning environment. Sci. Educ. 99, 282–311. doi: 10.1002/sce.21152
Iordanou, K., and Kuhn, D. (2020). Contemplating the opposition: does a personal touch matter? Discourse Process. 57, 343–359. doi: 10.1080/0163853X.2019.1701918
Iordanou, K., Kuhn, D., Matos, F., Shi, Y., and Hemberger, L. (2019a). Learning by arguing. Learn. Instr. 63:101207. doi: 10.1016/j.learninstruc.2019.05.004
Iordanou, K., Muis, K. R., and Kendeou, P. (2019b). Epistemic perspective and online epistemic processing of evidence: developmental and domain differences. J. Exp. Educ. 87, 531–551. doi: 10.1080/00220973.2018.1482857
Jucks, R., and Paus, E. (2013). Different words for the same concept: learning collaboratively from multiple documents. Cogn. Instr. 31, 227–254. doi: 10.1080/07370008.2013.769993
Jucks, R., and Thon, F. M. (2017). Better to have many opinions than one from an expert? Social validation by one trustworthy source versus the masses in online health forums. Comput. Hum. Behav. 70, 375–381. doi: 10.1016/j.chb.2017.01.019
Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection. Judgm. Decis. Mak. 8, 407–424. doi: 10.2139/ssrn.2182588
Kammerer, Y., Brand-Gruwel, S., and Jarodzka, H. (2018). The future of learning by searching the web: mobile, social, and multimodal. Front. Learn. Res. 6, 81–91. doi: 10.14786/flr.v6i2.343
Kammerer, Y., and Gerjets, P. (2014). The role of search result position and source trustworthiness in the selection of web search results when using a list or a grid interface. Int. J. Hum.-Comput. Interact. 30, 177–191. doi: 10.1080/10447318.2013.846790
Kammerer, Y., Gottschling, S., and Bråten, I. (2020). The role of internet-specific justification beliefs in source evaluation and corroboration during web search on an unsettled socio-scientific issue. J. Educ. Comput. Res. doi: 10.1177/0735633120952731. [Epub ahead of print].
Keil, F. C. (2008). Getting to the truth: grounding incomplete knowledge. Brooklyn Law Rev. 73, 1035–1052. Available online at: https://brooklynworks.brooklaw.edu/blr/vol73/iss3/8
Keil, F. C., and Kominsky, J. F. (2013). Missing links in middle school: developing use of disciplinary relatedness in evaluating internet search results. PLoS ONE 8:e67777. doi: 10.1371/journal.pone.0067777
Kienhues, D., Jucks, R., and Bromme, R. (2020). Sealing the gateways for post-truthism: reestablishing the epistemic authority of science. Educ. Psychol. 55, 144–154. doi: 10.1080/00461520.2020.1784012
Kienhues, D., Thomm, E., and Bromme, R. (2018). “Specificity reloaded: how multiple layers of specificity influence reasoning in science argument evaluation,” in Scientific Reasoning and Argumentation: The Roles of Domain-Specific and Domain-General Knowledge, eds F. Fischer, C. A. Chinn, K. Engelmann, and J. Osborne (London: Taylor and Francis), 251–270. doi: 10.4324/9780203731826-14
Kim, K.-S., Sin, S.-C. J., and Yoo-Lee, E. Y. (2014). Undergraduates' use of social media as information sources. Coll. Res. Libr. 75, 442–457. doi: 10.5860/crl.75.4.442
Kimmerle, J., Flemming, D., Feinkohl, I., and Cress, U. (2015). How laypeople understand the tentativeness of medical research news in the media: an experimental study on the perception of information about deep brain stimulation. Sci. Commun. 37, 173–189. doi: 10.1177/1075547014556541
Kind, P., and Osborne, J. (2017). Styles of scientific reasoning: a cultural rationale for science education? Sci. Educ. 101, 8–31. doi: 10.1002/sce.21251
Knobloch-Westerwick, S., Mothes, C., and Polavin, N. (2020). Confirmation bias, ingroup bias, and negativity bias in selective exposure to political information. Commun. Res. 47, 104–124. doi: 10.1177/0093650217719596
König, L., and Jucks, R. (2019). Hot topics in science communication: aggressive language decreases trustworthiness and credibility in scientific debates. Public Underst. Sci. 28, 401–416. doi: 10.1177/0963662519833903
Koriat, A. (2012). “The subjective confidence in one's knowledge and judgements: some metatheoretical considerations,” in Foundations of Metacognition, eds M. J. Beran, J. Brandl, J. Perner, and J. Proust (Oxford: Oxford University Press), 213–233. doi: 10.1093/acprof:oso/9780199646739.003.0014
Koslowski, B., Marasia, J., Chelenza, M., and Dublin, R. (2008). Information becomes evidence when an explanation can incorporate it into a causal framework. Cogn. Dev. 23, 472–487. doi: 10.1016/j.cogdev.2008.09.007
Krancher, O., Dibbern, J., and Meyer, P. (2018). How social media-enabled communication awareness enhances project team performance. J. Assoc. Inf. Syst. 19, 813–856. doi: 10.17705/1jais.00510
Kuhn, D. (1999). A developmental model of critical thinking. Educ. Res. 28, 16–46. doi: 10.3102/0013189X028002016
Kuhn, D. (2020). Why is reconciling divergent views a challenge? Curr. Dir. Psychol. Sci. 29, 27–32. doi: 10.1177/0963721419885996
Kuhn, D., and Iordanou, K. (2020). “Epistemology as a core dimension of cognitive development,” in Reason, Bias, and Inquiry: New Perspectives from the Crossroads of Epistemology and Psychology, eds D. Dunning and N. Ballantyne (Oxford: Oxford University Press).
Kuhn, D., and Lao, J. (1996). Effects of evidence on attitudes: is polarization the norm? Psychol. Sci. 7, 115–120. doi: 10.1111/j.1467-9280.1996.tb00340.x
Kuhn, D., and Udell, W. (2003). The development of argument skills. Child Dev. 74, 1245–1260. doi: 10.1111/1467-8624.00605
Kuhn, D., and Udell, W. (2007). Coordinating own and other perspectives in argument. Think. Reason. 13, 90–104. doi: 10.1080/13546780600625447
Lai, L., and Farbrot, A. (2014). What makes you click? The effect of question headlines on readership in computer-mediated communication. Soc. Influ. 9, 289–299. doi: 10.1080/15534510.2013.847859
Le Bigot, L., and Rouet, J. F. (2007). The impact of presentation format, task assignment, and prior knowledge on students' comprehension of multiple online documents. J. Lit. Res. 39, 445–470. doi: 10.1080/10862960701675317
Leonardi, P. (2015). Ambient awareness and knowledge acquisition: using social media to learn “Who knows what” and “Who knows whom.” MIS Quart. 39, 747–762. doi: 10.25300/MISQ/2015/39.4.1
Leshner, A. I. (2003). Public engagement with science. Science 299:977. doi: 10.1126/science.299.5609.977
Leu, D. J., Kinzer, C. K., Coiro, J., Castek, J., and Henry, L. A. (2013). “New literacies: a dual-level theory of the changing nature of literacy, instruction, and assessment,” in Theoretical Models and Processes of Reading, eds D. E. Alvermann, N. J. Unrau, and R. B. Ruddell (Newark, DE: International Reading Association), 1150–1181. doi: 10.1598/0710.42
Lewandowsky, S., Cook, J., Fay, N., and Gignac, G. E. (2019). Science by social media: attitudes towards climate change are mediated by perceived social consensus. Mem. Cognit. 47, 1445–1456. doi: 10.3758/s13421-019-00948-y
Lewicki, R. J., Saunders, D. M., and Minton, J. W. (2001). Essentials of Negotiation. New York, NY: McGraw-Hill/Irwin.
List, A., and Alexander, P. A. (2017a). Analyzing and integrating models of multiple text comprehension. Educ. Psychol. 52, 143–147. doi: 10.1080/00461520.2017.1328309
List, A., and Alexander, P. A. (2017b). Cognitive affective engagement model of multiple source use. Educ. Psychol. 52, 182–199. doi: 10.1080/00461520.2017.1329014
List, A., and Alexander, P. A. (2018). Corroborating students' self-reports of source evaluation. Behav. Inf. Technol. 37, 198–216. doi: 10.1080/0144929X.2018.1430849
List, A., Alexander, P. A., and Stephens, L. A. (2017). Trust but verify: examining the association between students' sourcing behaviors and ratings of text trustworthiness. Discourse Process. 54, 83–104. doi: 10.1080/0163853X.2016.1174654
List, A., Grossnickle, E. M., and Alexander, P. A. (2016a). Profiling students' multiple source use by question type. Read. Psychol. 37, 753–797. doi: 10.1080/02702711.2015.1111962
List, A., Grossnickle, E. M., and Alexander, P. A. (2016b). Undergraduate students' justifications for source selection in a digital academic context. J. Educ. Comput. Res. 54, 22–61. doi: 10.1177/0735633115606659
Lombardi, D., Nussbaum, E. M., and Sinatra, G. M. (2016). Plausibility judgments in conceptual change and epistemic cognition. Educ. Psychol. 51, 35–56. doi: 10.1080/00461520.2015.1113134
Macagno, F., and Walton, D. (2012). Presumptions in legal argumentation. Ratio. Juris. 25, 271–300. doi: 10.1111/j.1467-9337.2012.00514.x
Macedo-Rouet, M., Potocki, A., Scharrer, L., Ros, C., Stadtler, M., Salmerón, L., et al. (2019a). How good is this page? Benefits and limits of prompting on adolescents' evaluation of web information quality. Read. Res. Q. 54, 299–321. doi: 10.1002/rrq.241
Macedo-Rouet, M., Salmerón, L., Ros, C., Pérez, A., Stadtler, M., and Rouet, J. F. (2019b). Are frequent users of social network sites good information evaluators? An investigation of adolescents' sourcing abilities. J. Study Educ. Dev. 43, 101–138. doi: 10.1080/02103702.2019.1690849
Maier, J., Richter, T., Nauroth, P., and Gollwitzer, M. (2018). For me or for them: How in-group identification and beliefs influence the comprehension of controversial texts. J. Res. Read. 41, S48–S65. doi: 10.1111/1467-9817.12132
MaKinster, J. G., Beghetto, R. A., and Plucker, J. A. (2002). Why can't I find newton's third law? Case studies of students' use of the Web as a science resource. J. Sci. Educ. Technol. 11, 155–172. doi: 10.1023/A:1014617530297
Mason, L., Ariasi, N., and Boldrin, A. (2011). Epistemic beliefs in action: spontaneous reflections about knowledge and knowing during online information searching and their influence on learning. Learn. Instr. 21, 137–151. doi: 10.1016/j.learninstruc.2010.01.001
Mason, L., Boldrin, A., and Ariasi, N. (2010a). Epistemic metacognition in context: evaluating and learning online information. Metacogn. Learn. 5, 67–90. doi: 10.1007/s11409-009-9048-2
Mason, L., Boldrin, A., and Ariasi, N. (2010b). Searching the Web to learn about a controversial topic: are students epistemically active? Instr. Sci. 38, 607–633. doi: 10.1007/s11251-008-9089-y
Mayweg-Paus, E., and Jucks, R. (2018). Conflicting evidence or conflicting opinions? Two-sided expert discussions contribute to experts' trustworthiness. J. Lang. Soc. Psychol. 37, 203–223. doi: 10.1177/0261927X17716102
Mayweg-Paus, E., and Macagno, F. (2016). How dialogic settings influence evidence use in adolescent students. Zeitschrift für Pädagogische Psychol. 30, 121–132. doi: 10.1024/1010-0652/a000171
Mayweg-Paus, E., Macagno, F., and Kuhn, D. (2016a). Developing argumentation strategies in electronic dialogs: is modeling effective? Discourse Process. 53, 280–297. doi: 10.1080/0163853X.2015.1040323
Mayweg-Paus, E., Thiebach, M., and Jucks, R. (2016b). Let me critically question this!—Insights from a training study on the role of questioning on argumentative discourse. Int. J. Educ. Res. 79, 195–210. doi: 10.1016/j.ijer.2016.05.017
Mayweg-Paus, E., Zimmermann, M., Le, N. T., and Pinkwart, N. (2020). A review of technologies for collaborative online information seeking: on the contribution of collaborative argumentation. Educ. Inf. Technol. doi: 10.1007/s10639-020-10345-7. [Epub ahead of print].
McGrew, S., Breakstone, J., Ortega, T., Smith, M., and Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory Res. Soc. Educ. 46, 165–193. doi: 10.1080/00933104.2017.1416320
Mercer, D. (2018). Why Popper can't resolve the debate over global warming: problems with the uses of philosophy of science in the media and public framing of the science of global warming. Public Underst. Sci. 27, 139–152. doi: 10.1177/0963662516645040
Metzger, M. J., and Flanagin, A. J. (2013). Credibility and trust of information in online environments: the use of cognitive heuristics. J. Pragmat. 59, 210–220. doi: 10.1016/j.pragma.2013.07.012
Molina, M. D., Sundar, S. S., Le, T., and Lee, D. (2019). “Fake News” is not simply false information: a concept explication and taxonomy of online content. Am. Behav. Sci. doi: 10.1177/0002764219878224. [Epub ahead of print].
Muis, K. R. (2007). The role of epistemic beliefs in self-regulated learning. Educ. Psychol. 42, 173–190. doi: 10.1080/00461520701416306
Muis, K. R., Pekrun, R., Sinatra, G. M., Azevedo, R., Trevors, G., Meier, E., et al. (2015). The curious case of climate change: testing a theoretical model of epistemic beliefs, epistemic emotions, and complex learning. Learn. Instr. 39, 168–183. doi: 10.1016/j.learninstruc.2015.06.003
National Research Council (2012). A Framework for K-12 Science Education. Practices, Crosscutting Concepts, and Core Ideas. Washington, DC: National Research Council.
National Science Board (2018). Science and Engineering Indicators 2018. NSB-2018-1. Alexandria, VA: National Science Foundation.
Nauroth, P., Gollwitzer, M., Bender, J., and Rothmund, T. (2015). Social identity threat motivates science-discrediting online comments. PLoS ONE 10:e0117476. doi: 10.1371/journal.pone.0117476
Nauroth, P., Gollwitzer, M., Kozuchowski, H., Bender, J., and Rothmund, T. (2017). The effects of social identity threat and social identity affirmation on laypersons' perception of scientists. Public Underst. Sci. 26, 754–770. doi: 10.1177/0963662516631289
Noroozi, O., Weinberger, A., Biemans, H. J., Mulder, M., and Chizari, M. (2012). Argumentation-based computer supported collaborative learning (ABCSCL). A synthesis of 15 years of research. Educ. Res. Rev. 7, 79–106. doi: 10.1016/j.edurev.2011.11.006
Nussbaum, E. M., and Edwards, O. V. (2011). Critical questions and argument stratagems: a framework for enhancing and analyzing students' reasoning practices. J. Learn. Sci. 20, 443–488. doi: 10.1080/10508406.2011.564567
Nussbaum, E. M., and Kardash, C. M. (2005). The effects of goal instructions and text on the generation of counterarguments during writing. J. Educ. Psychol. 97:157. doi: 10.1037/0022-0663.97.2.157
Oeldorf-Hirsch, A., and DeVoss, C. L. (2020). Who posted that story? Processing layered sources in facebook news posts. J. Mass Commun. Q. 97, 141–160. doi: 10.1177/1077699019857673
Oreskes, N. (2007). “The scientific consensus on climate change: How do we know we're not wrong,” in Climate Change: What It Means for Us, Our Children, and Our Grandchildren, eds J. F. C. DiMento and P. Doughman (Cambridge, MA: MIT Press), 65–99.
Oreskes, N., and Conway, E. M. (2011). Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. New York, NY: Bloomsbury Press.
Osborne, J. (2010). Arguing to learn in science: the role of collaborative, critical discourse. Science 328, 463–466. doi: 10.1126/science.1183944
Osborne, J., Erduran, S., and Simon, S. (2004). Enhancing the quality of argumentation in school science. J. Res. Sci. Teach. 41, 994–1020. doi: 10.1002/tea.20035
Pan, B., Hembrooke, H., Joachims, T., Lorigo, L., Gay, G., and Granka, L. (2007). In google we trust: users' decisions on rank, position, and relevance. J. Comput. Commun. 12, 801–823. doi: 10.1111/j.1083-6101.2007.00351.x
Paul, J., Stadtler, M., and Bromme, R. (2019). Effects of a sourcing prompt and conflicts in reading materials on elementary students' use of source information. Discourse Process. 56, 155–169. doi: 10.1080/0163853X.2017.1402165
Pavelle, S., and Wilkinson, C. (2020). Into the digital wild: utilizing twitter, instagram, youtube, and facebook for effective science and environmental communication. Front. Commun. 5, 1–8. doi: 10.3389/fcomm.2020.575122
Pintrich, P. R. (2000). “The role of goal orientation in self-regulated learning,” in Handbook of Self-Regulation, eds M. Boekaerts, P. R. Pintrich, and M. Zeidner (San Diego, CA: Academic Press), 451–502.
Rabinovich, A., and Morton, T. A. (2012). Unquestioned answers or unanswered questions: beliefs about science guide responses to uncertainty in climate change risk communication. Risk Anal. 32, 992–1002. doi: 10.1111/j.1539-6924.2012.01771.x
Rabinovich, A., Morton, T. A., and Birney, M. E. (2012). Communicating climate science: the role of perceived communicator's motives. J. Environ. Psychol. 32, 11–18. doi: 10.1016/j.jenvp.2011.09.002
Rafaeli, S. (1988). “Interactivity: From new media to communication,” in Advancing Communication Science: Merging Mass and Interpersonal Process, eds R. P. Hawkins, J. M. Wiemann, and S. Pingree (Newbury Park, CA: Sage), 110–134.
Rafaeli, S., and Ariel, Y. (2007). “Assessing interactivity in computer-mediated research,” in The Oxford handbook of Internet Psychology, eds A. N. Joinson, K. Y. A. McKenna, T. Postmes, and U.-D. Reips (Oxford: Oxford University Press), 71–89.
Rapanta, C., and Christodoulou, A. (2019). Walton's types of argumentation dialogues as classroom discourse sequences. Learn. Cult. Soc. Interact. doi: 10.1016/j.lcsi.2019.100352. [Epub ahead of print].
Redfors, A., Hansson, L., Kyza, E. A., Nicolaidou, I., Asher, I., Tabak, I., et al. (2014). “CoReflect: web-based inquiry learning environments on socio-scientific Issues,” in Topics and Trends in Current Science Education, eds C. Bruguière, A. Tiberghien, and P. Clément (Dordrecht: Springer), 553–566. doi: 10.1007/978-94-007-7281-6_34
Reif, A., Kneisel, T., Schäfer, M., and Taddicken, M. (2020). Why are scientific experts perceived as trustworthy? Emotional assessment within tv and youtube videos. Media Commun. 8, 191–205. doi: 10.17645/mac.v8i1.2536
Richter, T. (2015). Validation and comprehension of text information: two sides of the same coin. Discourse Process. 52, 337–355. doi: 10.1080/0163853X.2015.1025665
Rieh, S. Y. (2002). Judgment of information quality and cognitive authority in the Web. J. Am. Soc. Inf. Sci. Technol. 53, 141–161. doi: 10.1002/asi.10017
Risko, E. F., Ferguson, A. M., and McLean, D. (2016). On retrieving information from external knowledge stores: feeling-of-findability, feeling-of-knowing and Internet search. Comput. Hum. Behav. 65, 534–543. doi: 10.1016/j.chb.2016.08.046
Rodriguez, F., Rhodes, R. E., Miller, K. F., and Shah, P. (2016). Examining the influence of anecdotal stories and the interplay of individual differences on reasoning. Think. Reason. 22, 274–296. doi: 10.1080/13546783.2016.1139506
Rothmund, T., Bender, J., Nauroth, P., and Gollwitzer, M. (2015). Public concerns about violent video games are moral concerns: how moral threat can make pacifists susceptible to scientific and political claims against violent video games. Eur. J. Soc. Psychol. 45, 769–783. doi: 10.1002/ejsp.2125
Rouet, J.-F., and Britt, M. A. (2011). “Relevance processes in multiple document comprehension,” in Text Relevance and Learning from Text, eds M. T. McCrudden, J. P. Magliano, and G. Schraw (Greenwich, CT: Information Age), 19–52.
Rouet, J.-F., Britt, M. A., and Durik, A. M. (2017). RESOLV: readers' representation of reading contexts and tasks. Educ. Psychol. 52, 200–215. doi: 10.1080/00461520.2017.1329015
Rouet, J.-F., Ros, C., Goumi, A., Macedo-Rouet, M., and Dinet, J. (2011). The influence of surface and deep cues on primary and secondary school students' assessment of relevance in Web menus. Learn. Instr. 21, 205–219. doi: 10.1016/j.learninstruc.2010.02.007
Salmerón, L., Cañas, J. J., and Fajardo, I. (2005). Are expert users always better searchers? Interaction of expertise and semantic grouping in hypertext search tasks. Behav. Inf. Technol. 24, 471–475. doi: 10.1080/0144329042000320018
Salmerón, L., Gil, L., and Bråten, I. (2017). Effects of reading real versus printout versions of multiple documents on students' sourcing and integrated understanding. Contemp. Educ. Psychol. 52, 25–35. doi: 10.1016/j.cedpsych.2017.12.002
Salmerón, L., Gil, L., Bråten, I., and Strømsø, H. (2010). Comprehension effects of signalling relationships between documents in search engines. Comput. Hum. Behav. 26, 419–426. doi: 10.1016/j.chb.2009.11.013
Salmerón, L., Kammerer, Y., and García-Carrión, P. (2013). Searching the web for conflicting topics: page and user factors. Comput. Hum. Behav. 29, 2161–2171. doi: 10.1016/j.chb.2013.04.034
Sandoval, W. A., Greene, J. A., and Bråten, I. (2016). Understanding and promoting thinking about knowledge: origins, issues, and future directions of research on epistemic cognition. Rev. Res. Educ. 40, 457–496. doi: 10.3102/0091732X16669319
Scharrer, L., Bromme, R., Britt, M. A., and Stadtler, M. (2012). The seduction of easiness: how science depictions influence laypeople's reliance on their own evaluation of scientific information. Learn. Instr. 22, 231–243. doi: 10.1016/j.learninstruc.2011.11.004
Scharrer, L., Rupieper, Y., Stadtler, M., and Bromme, R. (2017). When science becomes too easy: science popularization inclines laypeople to underrate their dependence on experts. Public Underst. Sci. 26, 1003–1018. doi: 10.1177/0963662516680311
Scharrer, L., Stadtler, M., and Bromme, R. (2019). Biased recipients encounter biased sources: effect of ethical (dis-)agreement between recipient and author on evaluating scientific claims. Appl. Cogn. Psychol. 33, 1165–1177. doi: 10.1002/acp.3563
Scheufele, D. A., and Krause, N. M. (2019). Science audiences, misinformation, and fake news. Proc. Natl. Acad. Sci. U.S.A. 116, 7662–7669. doi: 10.1073/pnas.1805871115
Schiefele, U. (2009). “Situational and individual interest,” in Handbook of Motivation at School, eds K. Wentzel and A. Wigfield (New York, NY: Routledge), 197–222.
Schulz-Hardt, S., Brodbeck, F. C., Mojzisch, A., Kerschreiter, R., and Frey, D. (2006). Group decision making in hidden profile situations: Dissent as a facilitator for decision quality. J. Pers. Soc. Psychol. 91, 1080–1093. doi: 10.1037/0022-3514.91.6.1080
Schulz-Hardt, S., Jochims, M., and Frey, D. (2002). Productive conflict in group decision making: genuine and contrived dissent as strategies to counteract biased information seeking. Organ. Behav. Hum. Decis. Process. 88, 563–586. doi: 10.1016/S0749-5978(02)00001-8
Shah, P., Michal, A., Ibrahim, A., Rhodes, R., and Rodriguez, F. (2017). What makes everyday scientific reasoning so challenging? Psychol. Learn. Motiv. 66, 251–299. doi: 10.1016/bs.plm.2016.11.006
Shapiro, A., and Niederhauser, D. (2004). “Learning from hypertext: research issues and findings,” in Handbook of Research on Educational Communications and Technology, ed. D. H. Jonassen (Mahwah, NJ: Erlbaum), 605–620.
Sharon, A. J., and Baram-Tsabari, A. (2020). Can science literacy help individuals identify misinformation in everyday life? Sci. Educ. 104, 873–894. doi: 10.1002/sce.21581
Sinatra, G. M., and Lombardi, D. (2020). Evaluating sources of scientific evidence and claims in the post-truth era may require reappraising plausibility judgments. Educ. Psychol. 55, 120–131. doi: 10.1080/00461520.2020.1730181
Sinatra, G. M., Southerland, S. A., McConaughy, F., and Demastes, J. W. (2003). Intentions and beliefs in students' understanding and acceptance of biological evolution. J. Res. Sci. Teach. 40, 510–528. doi: 10.1002/tea.10087
Song, Y., and Ferretti, R. P. (2013). Teaching critical questions about argumentation through the revising process: effects of strategy instruction on college students' argumentative essays. Read. Writ. 26, 67–90. doi: 10.1007/s11145-012-9381-8
Southerland, S. A., and Sinatra, G. M. (2006). “The shifting roles of acceptance and dispositions in understanding biological evolution,” in Beyond Cartesian Dualism, eds W. W. Cobern, K. Tobin, H. Brown-Acquay, M. Espinet, G. Irzik, O. Jegede, et al. (Berlin; Heidelberg: Springer), 69–78. doi: 10.1007/1-4020-3808-9_6
Sparrow, B., Liu, J., and Wegner, D. M. (2011). Google effects on memory: cognitive consequences of having information at our fingertips. Science 333, 776–778. doi: 10.1126/science.1207745
Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., et al. (2010). Epistemic vigilance. Mind Lang. 25, 359–393. doi: 10.1111/j.1468-0017.2010.01394.x
Stadtler, M. (2017). The art of reading in a knowledge society: commentary on the special issue on models of multiple text comprehension. Educ. Psychol. 52, 225–231. doi: 10.1080/00461520.2017.1322969
Stadtler, M., and Bromme, R. (2014). “The content–source integration model: a taxonomic description of how readers comprehend conflicting scientific information,” in Processing Inaccurate Information: Theoretical and Applied Perspectives from Cognitive Science and Educational Sciences, eds D. N. Rapp and J. L. G. Braasch (Cambridge, MA: MIT Press), 379–402
Stadtler, M., Scharrer, L., Brummernhenrich, B., and Bromme, R. (2013). Dealing with uncertainty: readers' memory for and use of conflicting information from science texts as function of presentation format and source expertise. Cogn. Instr. 31, 130–150. doi: 10.1080/07370008.2013.769996
Stang Lund, E., Bråten, I., Brandmo, C., Brante, E. W., and Strømsø, H. I. (2019). Direct and indirect effects of textual and individual factors on source-content integration when reading about a socio-scientific issue. Read. Writ. 32, 335–356. doi: 10.1007/s11145-018-9868-z
Strømsø, H. I., Bråten, I., and Britt, M. A. (2011). Do students' beliefs about knowledge and knowing predict their judgement of texts' trustworthiness? Educ. Psychol. 31, 177–206. doi: 10.1080/01443410.2010.538039
Strømsø, H. I., Bråten, I., Britt, M. A., and Ferguson, L. E. (2013). Spontaneous sourcing among students reading multiple documents. Cogn. Instr. 31, 176–203. doi: 10.1080/07370008.2013.769994
Strømsø, H. I., Bråten, I., and Stenseth, T. (2017). The role of students' prior topic beliefs in recall and evaluation of information from texts on socio-scientific issues. Nord. Psychol. 69, 127–142. doi: 10.1080/19012276.2016.1198270
Sundar, S. S. (2008). “The MAIN model: a heuristic approach to understanding technology effects on credibility,” in Digital Media, Youth, and Credibility, eds M. J. Metzger and A. J. Flanagin (Cambridge, MA: MIT Press), 73–100
Tabak, I. (2015). “Functional scientific literacy: seeing the science within the words and across the web,” in Handbook of Educational Psychology, eds L. Corno and E. M. Anderman (New York, NY: Routledge), 269–280
Taraborelli, D. (2008). “How the Web is changing the way we trust,” in Current Issues in Computing and Philosophy, eds K. Waelbers, A. Briggle, and P. Brey (Amsterdam: IOS Press), 194–204
Taylor, R. M. (2016). Open-mindedness: an intellectual virtue in the pursuit of knowledge and understanding. Educ. Theory 66, 599–618. doi: 10.1111/edth.12201
Teasley, S. D. (1997). “Talking about reasoning: how important is the peer in peer collaboration?” in Discourse, Tools and Reasoning: Essays on Situated Cognition, eds L. B. Resnick, R. Säljö, C. Pontecorvo, and B. Burge (Berlin: Springer), 361–384. doi: 10.1007/978-3-662-03362-3_16
Thiebach, M., Mayweg-Paus, E., and Jucks, R. (2016). Better to agree or disagree? The role of critical questioning and elaboration in argumentative discourse. Zeitschrift für Pädagogische Psychologie 30, 133–149. doi: 10.1024/1010-0652/a000174
Thomm, E., Barzilai, S., and Bromme, R. (2017). Why do experts disagree? The role of conflict topics and epistemic perspectives in conflict explanations. Learn. Instr. 52, 15–26. doi: 10.1016/j.learninstruc.2017.03.008
Thomm, E., and Bromme, R. (2012). “It should at least seem scientific!” textual features of “scientificness” and their impact on lay assessments of online information. Sci. Educ. 96, 187–211. doi: 10.1002/sce.20480
Thompson, W. B., Garry, A., Taylor, J., and Radell, M. L. (2020). Is one study as good as three? College graduates seem to think so, even if they took statistics classes. Psychol. Learn. Teach. 19, 143–160. doi: 10.1177/1475725719877590
Thon, F. M., and Jucks, R. (2017). Believing in expertise: how authors' credentials and language use influence the credibility of online health information. Health Commun. 32, 828–836. doi: 10.1080/10410236.2016.1172296
Tuler, S. (2000). Forms of talk in policy dialogue: distinguishing between adversarial and collaborative discourse. J. Risk Res. 3, 1–17. doi: 10.1080/136698700376671
van der Bles, A. M., van der Linden, S., Freeman, A. L. J., Mitchell, J., Galvao, A. B., Zaval, L., et al. (2019). Communicating uncertainty about facts, numbers and science. R. Soc. Open Sci. 6:181870. doi: 10.1098/rsos.181870
van der Linden, S. L. D., Leiserowitz, A. A., Feinberg, G. D., and Maibach, E. W. (2015). The scientific consensus on climate change as a gateway belief: experimental evidence. PLoS ONE 10:e0118489. doi: 10.1371/journal.pone.0118489
van Strien, J. L. H., Kammerer, Y., Brand-Gruwel, S., and Boshuizen, H. P. A. (2016). How attitude strength biases information processing and evaluation on the web. Comput. Human Behav. 60, 245–252. doi: 10.1016/j.chb.2016.02.057
Villarroel, C., Felton, M., and Garcia-Mila, M. (2016). Arguing against confirmation bias: the effect of argumentative discourse goals on the use of disconfirming evidence in written argument. Int. J. Educ. Res. 79, 167–179. doi: 10.1016/j.ijer.2016.06.009
Vogl, E., Pekrun, R., Murayama, K., and Loderer, K. (2020). Surprised–curious–confused: epistemic emotions and knowledge exploration. Emotion 20, 625–641. doi: 10.1037/emo0000578
Walton, D. (2010). “Types of dialogue and burdens of proof,” in Computational Models of Argument: Proceedings of COMMA, eds P. Baroni, F. Cerutti, M. Giacomin, and G. R. Simari (Amsterdam: IOS Press), 13–24
Walton, D., and Macagno, F. (2007). The fallaciousness of threats: character and ad baculum. Argumentation 21, 63–81. doi: 10.1007/s10503-006-9018-7
West, R. F., Toplak, M. E., and Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: associations with cognitive ability and thinking dispositions. J. Educ. Psychol. 100, 930–941. doi: 10.1037/a0012842
Wineburg, S., and McGrew, S. C. (2019). Lateral reading and the nature of expertise: reading less and learning more when evaluating digital information. Teach. Coll. Rec. 121. Available online at: https://www.tcrecord.org/content.asp?contentid=22806
Winter, S., and Krämer, N. C. (2012). Selecting science information in Web 2.0: how source cues, message sidedness, and need for cognition influence users' exposure to blog posts. J. Comput.-Mediat. Commun. 18, 80–96. doi: 10.1111/j.1083-6101.2012.01596.x
Winter, S., and Krämer, N. C. (2014). A question of credibility: effects of source cues and recommendations on information selection on news sites and blogs. Communications 39, 435–456. doi: 10.1515/commun-2014-0020
Winter, S., Metzger, M. J., and Flanagin, A. J. (2016). Selective use of news cues: a multiple-motive perspective on information selection in social media environments. J. Commun. 66, 669–693. doi: 10.1111/jcom.12241
Wissinger, D. R., and De La Paz, S. (2016). Effects of critical discussions on middle school students' written historical arguments. J. Educ. Psychol. 108:43. doi: 10.1037/edu0000043
Yang, Z. J., and Kahlor, L. A. (2013). What, me worry? The role of affect in information seeking and avoidance. Sci. Commun. 35, 189–212. doi: 10.1177/1075547012441873
Zeidler, D. L. (2014). “Socioscientific issues as curriculum emphasis: theory, research and practice,” in Handbook of Research on Science Education, ed. N. G. Lederman (New York, NY: Routledge), 697–726
Keywords: epistemic cognition, argumentation, scientific literacy, digital literacy, multiple documents literacy, online engagement with scientific information
Citation: Hendriks F, Mayweg-Paus E, Felton M, Iordanou K, Jucks R and Zimmermann M (2020) Constraints and Affordances of Online Engagement With Scientific Information—A Literature Review. Front. Psychol. 11:572744. doi: 10.3389/fpsyg.2020.572744
Received: 15 June 2020; Accepted: 16 November 2020;
Published: 08 December 2020.
Edited by:
Patricia A. Alexander, University of Maryland, United States
Reviewed by:
Stefan Fries, Bielefeld University, Germany
Byeong-Young Cho, Hanyang University, South Korea
Copyright © 2020 Hendriks, Mayweg-Paus, Felton, Iordanou, Jucks and Zimmermann. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Friederike Hendriks, f.hendriks@uni-muenster.de