General Commentary

Front. Psychol., 25 July 2017
Sec. Cognition

Commentary: Seeing the conflict: an attentional account of reasoning errors

  • 1 Université Paris Descartes, Paris, France
  • 2 University of Caen Normandy, Caen, France
  • 3 UMR 8240, Laboratoire de Psychologie du Développement et de l'Éducation de l'Enfant, Paris, France

A commentary on
Seeing the conflict: an attentional account of reasoning errors

by Mata, A., Ferreira, M. B., Voss, A., and Kollei, T. (2017). Psychon. Bull. Rev. 24, 1–7. doi: 10.3758/s13423-017-1234-7

In a recent study, Mata et al. (2017) test a two-stage account of reasoning that distinguishes between the initial interpretation and selection of information within a problem and the subsequent operations on that information. Crucially, on this account, reasoning errors can result from mistakes at either stage. In previous work, the authors found indirect experimental evidence for this account (e.g., Mata et al., 2014), and their most recent experiments test it more directly by analyzing reasoners' on-line allocation of attention in situations of conflict. To do so, the authors record eye-tracking measurements as participants evaluate both classical reasoning tasks that induce conflict (variants of the bat-and-ball item from the Cognitive Reflection Test; Frederick, 2005) and comparably structured no-conflict items. Their two-stage account makes at least two strong predictions: all else being equal, correct responders should (1) attend to conflict problems more than to no-conflict items, and (2) attend to conflict items more than incorrect responders do.

Mata et al. present two studies, the second of which replicates the first while avoiding a potential confound: in Study 1, eye-tracking recordings ended when participants indicated that they were ready to respond, raising the concern that crucial aspects of the reasoning process might have been missed (Ball et al., 2006). By and large, the results of both studies support the claim that correct responders attend more to conflict than to no-conflict items, and that they allocate more attentional resources to the critical components of the tricky conflict items. Using a technique that is fairly novel in this context (e.g., see Ball et al., 2006, for eye-tracking studies with a different reasoning task; see Ball, 2013, for a review), the authors generate interesting insights into how reasoners may be led to err, suggesting a largely overlooked attentional path to such mistakes.

However, the authors further interpret their results as indicating that incorrect responders are generally insensitive to conflict. It is one thing to assert that correct responders do a better job of attending to conflict than incorrect responders; it is quite another to claim that incorrect responders are entirely insensitive to it. In line with Mata et al.'s findings, previous experimental work does suggest that correct responders are more sensitive to conflict than incorrect responders (e.g., Pennycook et al., 2015), and there are prominent individual differences among incorrect responders: across various tasks, subgroups of biased participants, typically 15–40% of incorrect responders, do indeed fail to detect reasoning conflicts (Mevel et al., 2015; Pennycook et al., 2015; Frey et al., 2017). Yet the majority of even biased individuals detect conflict across a number of tasks and measures, so there is no good empirical reason to equate incorrect responding with insensitivity to conflict across the board. A great deal of evidence indicates that incorrect responders are often at least minimally sensitive to conflict on a wide range of tasks: surveying this literature, De Neys (2012, 2014) cites evidence of conflict sensitivity in base rate neglect, the conjunction fallacy, ratio bias, and syllogistic reasoning. More importantly, using the very same bat-and-ball task as Mata et al. (2017), different teams have found evidence that incorrect responders detect conflict on such items (De Neys et al., 2013; Gangemi et al., 2015; Johnson et al., 2016). Because the latter findings rely on different measures (reaction times and confidence/error ratings) than those employed by Mata et al. (2017), one might argue that the contradictory results are a function of the distinct techniques: perhaps the subtler, less invasive eye-tracking measures are particularly well positioned to uncover misrepresentations of the conflict premises attributable to early attentional divergences. There are also dissenting accounts using yet other techniques; for instance, Travers et al. (2016) find no evidence of conflict detection among incorrect responders using mouse-movement tracking.

Because this is somewhat controversial territory, it is worth considering the actual data on which Mata et al. (2017) base their conclusion. The claim that incorrect responders are insensitive to reasoning conflict rests on null results from frequentist statistics in small samples (n = ~30 in Study 1 and n = 27 in Study 2). The authors briefly note that their sample of incorrect responders might have been underpowered and therefore incapable of revealing differences in this group. To test this possibility, we ran a Bayesian analysis on their data using Bayes factors, an approach that quantifies the degree of support the data provide for the null hypothesis relative to the alternative (e.g., Wagenmakers, 2007; Masson, 2011; Morey et al., 2014).
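For readers who wish to run a comparable check outside JASP, the sketch below is a minimal example that assumes the open-source Python package pingouin and hypothetical placeholder data (not the shared data from Study 2); it computes a default JZS Bayes factor for a paired-samples comparison and inverts it to express the support for the null.

```python
# Minimal sketch (not the original analysis): default JZS Bayes factor for a
# paired-samples t-test, analogous to a Bayesian paired t-test in JASP.
# Requires: pip install pingouin. The values below are hypothetical placeholders.
import numpy as np
import pingouin as pg

# Hypothetical per-participant mean fixation times (ms) on the critical premise,
# conflict vs. no-conflict versions, for one group of responders.
conflict = np.array([1850, 2100, 1720, 1990, 2240, 1810, 2050, 1930])
no_conflict = np.array([1800, 2150, 1700, 1950, 2300, 1780, 2000, 1900])

# Paired t-test with a JZS Bayes factor (default Cauchy prior width r = 0.707).
res = pg.ttest(conflict, no_conflict, paired=True, r=0.707)

bf10 = float(res["BF10"].iloc[0])  # evidence for the alternative over the null
bf01 = 1.0 / bf10                  # evidence for the null over the alternative

print(f"BF10 = {bf10:.2f}, BF01 = {bf01:.2f}")
```

A BF01 close to 1 indicates that the data scarcely discriminate between the two hypotheses, which is precisely the situation we describe next.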

Given the potential confound resulting from the non-continuous eye-tracking in Study 1, we focused on Study 2, and the authors were kind enough to share their subject-averaged results with us. Using the JASP package (JASP Team, 2016), we ran Bayesian paired-samples t-tests with default priors (Cauchy prior width = 0.707), entering incorrect responders' average fixation time and number of revisits on the critical problem premise (sentence 3) for the conflict and no-conflict items. The Bayes factor in favor of the null hypothesis (no effect of conflict) over the alternative hypothesis (an effect of conflict) was 1.34 for fixation time (non-directional; with the directional alternative conflict > no-conflict, the Bayes factor was 0.71) and 2.84 for the number of revisits (non-directional; directional alternative conflict > no-conflict, Bayes factor = 1.67). These values indicate only weak support for the null hypothesis: by the classification of Lee and Wagenmakers (2013), the evidence that incorrect responders do not distinguish between the conflict and no-conflict versions of the problem is merely anecdotal. Indeed, the only clear conclusion we can draw from these data is that more data are needed before a convincing conclusion can be reached in either direction.
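To make these evidential labels concrete, the hypothetical helper below (not part of JASP or of the original analysis) applies the Lee and Wagenmakers (2013) categories to the non-directional BF01 values reported above; both fall in the 1–3 range, conventionally labeled anecdotal.

```python
# Hypothetical helper: map a Bayes factor (expressed in favor of a given
# hypothesis) onto the evidence categories of Lee and Wagenmakers (2013).
def classify_bf(bf: float) -> str:
    if bf < 1:
        return "favors the competing hypothesis (invert the BF and reclassify)"
    if bf < 3:
        return "anecdotal"
    if bf < 10:
        return "moderate"
    if bf < 30:
        return "strong"
    if bf < 100:
        return "very strong"
    return "extreme"

# Non-directional BF01 values for incorrect responders in Study 2 (see text).
for measure, bf01 in [("fixation time", 1.34), ("number of revisits", 2.84)]:
    print(f"{measure}: BF01 = {bf01:.2f} -> {classify_bf(bf01)} evidence for the null")
```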

In sum, although there is much to like about the Mata et al. (2017) paper, our key point is that the study does not present substantial evidence for the claim that incorrect responders cannot discriminate between conflict and no-conflict problems. Given the prior contradictory findings on the exact same task, we propose that caution is needed when drawing strong conclusions about incorrect responders' insensitivity to conflict on the basis of this study.

Author Contributions

DF wrote the article. BB and WD analyzed the data. WD revised the article.

Funding

This research was supported by a research grant (DIAGNOR, ANR-16-CE28-0010-01) from the Agence Nationale de la Recherche. Additionally, DF is supported by the Sorbonne Paris Cité International Grant (INSPIRE), and BB is supported by the École de Neurosciences Paris (ENP).

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We are grateful to the authors of the original article for sharing their data with us.

References

Ball, L. (2013). “Eye-tracking and reasoning: what your eyes tell about your inferences,” in New Approaches in Reasoning Research, eds W. De Neys and M. Osman (Hove: Psychology Press), 51–69.

Ball, L. J., Phillips, P., Wade, C. N., and Quayle, J. D. (2006). Effects of belief and logic on syllogistic reasoning: eye-movement evidence for selective processing models. Exp. Psychol. 53, 77–86. doi: 10.1027/1618-3169.53.1.77

De Neys, W. (2012). Bias and conflict: a case for logical intuitions. Perspect. Psychol. Sci. 7, 28–38. doi: 10.1177/1745691611429354

De Neys, W. (2014). Conflict detection, dual processes, and logical intuitions: some clarifications. Think. Reason. 20, 169–187. doi: 10.1080/13546783.2013.854725

De Neys, W., Rossi, S., and Houdé, O. (2013). Bats, balls, and substitution sensitivity: cognitive misers are no happy fools. Psychon. Bull. Rev. 20, 269–273. doi: 10.3758/s13423-013-0384-5

Frederick, S. (2005). Cognitive reflection and decision making. J. Econ. Perspect. 19, 25–42. doi: 10.1257/089533005775196732

Frey, D., Johnson, E. D., and De Neys, W. (2017). Individual differences in conflict detection during reasoning. Q. J. Exp. Psychol. 1–52.

Gangemi, A., Bourgeois-Gironde, S., and Mancini, F. (2015). Feelings of error in reasoning—in search of a phenomenon. Think. Reason. 21, 383–396. doi: 10.1080/13546783.2014.980755

JASP Team (2016). JASP (Version 0.8) [Computer software].

Johnson, E. D., Tubau, E., and De Neys, W. (2016). The Doubting System 1: evidence for automatic substitution sensitivity. Acta Psychol. 164, 56–64. doi: 10.1016/j.actpsy.2015.12.008

Lee, M. D., and Wagenmakers, E.-J. (2013). Bayesian Cognitive Modeling: A Practical Course. New York, NY: Cambridge University Press.

Masson, M. E. (2011). A tutorial on a practical Bayesian alternative to null-hypothesis significance testing. Behav. Res. Methods 43, 679–690. doi: 10.3758/s13428-010-0049-5

Mata, A., Ferreira, M. B., Voss, A., and Kollei, T. (2017). Seeing the conflict: an attentional account of reasoning errors. Psychon. Bull. Rev. 24, 1–7. doi: 10.3758/s13423-017-1234-7

Mata, A., Schubert, A.-L., and Ferreira, M. B. (2014). The role of language comprehension in reasoning: How “good-enough” representations induce biases. Cognition 133, 457–463. doi: 10.1016/j.cognition.2014.07.011

Mevel, K., Poirel, N., Rossi, S., Cassotti, M., Simon, G., Houdé, O., et al. (2015). Bias detection: response confidence evidence for conflict sensitivity in the ratio bias task. J. Cogn. Psychol. 27, 227–237. doi: 10.1080/20445911.2014.986487

Morey, R. D., Rouder, J. N., Verhagen, J., and Wagenmakers, E.-J. (2014). Why hypothesis tests are essential for psychological science: a comment on Cumming (2014). Psychol. Sci. 25, 1289–1290. doi: 10.1177/0956797614525969

Pennycook, G., Fugelsang, J. A., and Koehler, D. J. (2015). What makes us think? A three-stage dual-process model of analytic engagement. Cogn. Psychol. 80, 34–72. doi: 10.1016/j.cogpsych.2015.05.001

Travers, E., Rolison, J. J., and Feeney, A. (2016). The time course of conflict on the Cognitive Reflection Test. Cognition 150, 109–118. doi: 10.1016/j.cognition.2016.01.015

Wagenmakers, E.-J. (2007). A practical solution to the pervasive problems of p values. Psychon. Bull. Rev. 14, 779–804. doi: 10.3758/BF03194105

Keywords: intuition, reasoning, decision making, conflict detection, attention

Citation: Frey DP, Bago B and De Neys W (2017) Commentary: Seeing the conflict: an attentional account of reasoning errors. Front. Psychol. 8:1284. doi: 10.3389/fpsyg.2017.01284

Received: 17 May 2017; Accepted: 13 July 2017;
Published: 25 July 2017.

Edited by:

Ulrich Hoffrage, University of Lausanne, Switzerland

Reviewed by:

Edward J. N. Stupple, University of Derby, United Kingdom
Gordon Pennycook, Yale University, United States

Copyright © 2017 Frey, Bago and De Neys. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Darren P. Frey, darren.frey@gmail.com
