
General Commentary

Front. Psychol., 09 January 2018 | https://doi.org/10.3389/fpsyg.2017.02329

Commentary: Mental Toughness and Individual Differences in Learning, Educational and Work Performance, Psychological Well-being, and Personality: A Systematic Review

Daniel F. Gucciardi*

  • School of Physiotherapy and Exercise Science, Curtin University, Perth, WA, Australia

The concept of mental toughness (MT) has garnered substantial interest over the past two decades. Scholars have published several narrative reviews of this literature (e.g., Connaughton et al., 2008; Crust, 2008; Gucciardi, 2017), yet in ~30 years of research there has been no attempt to review this body of work systematically to understand how MT is associated with hypothesized correlates. The systematic review by Lin et al. (2017) was timely for the field of MT (see also, Cowden, 2017). However, in this commentary, I explain two reasons why the conclusions drawn from this systematic review may be misleading or premature.

Methodological Quality of Primary Studies Matters

Well-executed systematic reviews offer many advantages for summarizing, synthesizing and integrating findings across studies when compared with non-systematic evaluations (e.g., clear and accountable methods; see Gough et al., 2017). However, the potential value of even the most well-executed systematic review could be undermined by the methodological quality of the primary studies (Moher et al., 2015; Shamseer et al., 2015; Oliveras et al., 2017). An assessment of methodological quality is necessary both for determining which primary studies might be included in a systematic review and for making inferences regarding the reliability of those studies retained for analysis and integration. For example, differences in the methodological quality of primary studies might explain why research on the same topic results in different or conflicting degrees of evidence. The exclusion of a formal assessment of methodological quality is a major limitation of the systematic review conducted by Lin et al. (2017) because any bias in primary studies transfers to the synthesized evidence unless those biases and sources of variation are handled as part of the analysis and interpretation of the cumulative findings. Statistical power is an important consideration in this regard, yet sample size justification is often overlooked in primary research on MT, including my own past work (e.g., Gucciardi and Jones, 2012). For example, should a study with 16 participants (90% power to detect the smallest possible effect of r = 0.67 at p < 0.05; Cowden et al., 2014) be accorded the same degree of quality as one with 351 participants (90% power to detect the smallest possible effect of r = 0.171 at p < 0.05; Cowden et al., 2016)?
Although the answer depends on the smallest effect size of interest (Lakens and Evers, 2014), it is important to bear in mind that underpowered studies inflate false positives (Button et al., 2013) and effect sizes tend to be unstable when samples are small (Schönbrodt and Perugini, 2013). The process of assessing methodological quality across a heterogeneous set of primary studies is challenging, particularly for observational research (Vandenbroucke et al., 2014; von Elm et al., 2014), because consensus is lacking regarding definitions, assessments, and integration with the synthesis of evidence (Oliveras et al., 2017).
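The power contrast drawn above can be sketched in a few lines of code. The snippet below is an illustrative approximation only, using the Fisher z transformation for a two-tailed test of a zero population correlation; exact methods (e.g., those implemented in dedicated power software) and one-tailed assumptions will yield somewhat different figures from the 90% values cited for the two studies.

```python
from math import atanh, erf, sqrt

def power_correlation(r, n):
    """Approximate power of a two-tailed test of H0: rho = 0 at
    alpha = .05, via the Fisher z transformation of a Pearson r."""
    z_effect = atanh(r)        # Fisher z of the assumed population correlation
    se = 1.0 / sqrt(n - 3)     # standard error of Fisher z
    z_crit = 1.959964          # two-tailed critical value at alpha = .05
    # Probability that the observed test statistic exceeds the critical
    # value, ignoring the negligible opposite tail.
    shift = z_effect / se - z_crit
    return 0.5 * (1.0 + erf(shift / sqrt(2.0)))

# The two designs contrasted above (Cowden et al., 2014, 2016):
small_n = power_correlation(0.67, 16)    # n = 16, large effect
large_n = power_correlation(0.171, 351)  # n = 351, small effect
print(f"n = 16,  r = .67:  power ~ {small_n:.2f}")
print(f"n = 351, r = .171: power ~ {large_n:.2f}")
```

Under this approximation the two designs achieve broadly comparable power, yet they do so for very different smallest detectable effects, which is precisely why both sample size and the smallest effect size of interest need to be reported when judging study quality.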

Construct Validity Evidence of Psychometric Tools Matters

Given the predominance of self-reported MT among the primary studies of Lin et al.'s (2017) systematic review, a key consideration for the assessment of methodological quality is the degree of construct validity evidence for each tool. Construct validation refers to the testing of bidirectional associations between theory development and measurement in terms of assessments of the theoretical domain and its operationalization (substantive phase), empirical fidelity of the measurement approach (structural phase), and the meaning of test scores with key correlates or group differentiation (external phase) (Loevinger, 1957). Assuming sufficient evidence exists for the substantive foundations of a measure (e.g., precise definition, content validity evidence), ongoing tests of the internal structure of a scale are a necessary prerequisite for examinations of external relations because the number of latent factors or loading patterns may differ across samples, populations, and settings (Flora and Flake, 2017).

Lin et al.'s (2017) findings showed that the MTQ48 and its shortened version (MTQ18) (Clough et al., 2002) are the most widely used measures for the assessment of MT and its associations with external variables including cognition and educational, work and military performance (11 of 16 studies), psychological well-being (17 of 23 studies), personality and other psychological traits (9 of 16 studies) and genetics (4 of 5 studies). Yet psychometric analyses of the MTQ48 by the original author and his colleagues (Clough et al., 2012; Gerber et al., 2013; Perry et al., 2013, 2015) as well as independent researchers (Gucciardi et al., 2012, 2013; Birch et al., 2017; Vaughan et al., 2017) cast doubts on the operationalization of the 4Cs model of MT via the MTQ48 both in terms of global (i.e., model-data congruence) and local (i.e., pattern of factor loadings) misfit. As such, any conclusions regarding the associations between MT and key correlates are tenuous because of the uncertainty regarding the meaning of the underlying latent factor.

Conclusion

As the first systematic review of the quantitative literature on the associations between MT and key correlates, I commend Lin et al. (2017) for their efforts in bringing together disparate literatures. However, I urge caution among readers who might interpret their findings uncritically, for two key reasons. The exclusion of an assessment of the methodological quality of primary studies and the literature's reliance on a measure of MT with questionable conceptual underpinnings and limited construct validity evidence reduce our confidence in the veracity of the available findings and, therefore, the conclusions and implications of the systematic review for theory and practice. An assessment of the methodological quality of primary studies included within the Lin et al. review (e.g., Mixed Methods Appraisal Tool; Pluye and Hong, 2014), together with re-analysis and re-interpretation of the findings, represents an important step for the science of MT. Indeed, a critical analysis of the methodological quality of primary work alone can represent a major contribution to a field of research because it might highlight deficiencies and/or strengths in evidence (Moher et al., 2015; Shamseer et al., 2015).

Author Contributions

The author confirms being the sole contributor of this work and approved it for publication.

Funding

DG is supported by a Curtin Research Fellowship.

Conflict of Interest Statement

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Birch, P. D. J., Crampton, S., Greenlees, I., Lowry, R., and Coffee, P. (2017). The mental toughness questionnaire-48: a re-examination of factorial validity. Int. J. Sport. Psychol. 48, 331–355. doi: 10.7352/IJSP.2017.48.331

Button, K. S., Ioannidis, J. P., Mokrysz, C., Nosek, B. A., Flint, J., Robinson, E. S., et al. (2013). Power failure: why small sample size undermines the reliability of neuroscience. Nat. Rev. Neurosci. 14, 365–376. doi: 10.1038/nrn3475

Clough, P., Earle, K., and Sewell, D. (2002). “Mental toughness: the concept and its measurement,” in Solutions in Sport Psychology, ed I. Cockerill (London: Thomson), 32–45.

Clough, P., Earle, K., Perry, J. L., and Crust, L. (2012). Comment on “progressing measurement in mental toughness: a case example of the mental toughness questionnaire 48” by Gucciardi, Hanton, and Mallett (2012). Sport Exerc. Perform. 1, 283–287. doi: 10.1037/a0029771

Connaughton, D., Hanton, S., Jones, G., and Wadey, R. (2008). Mental toughness research: key issues in this area. Int. J. Sport. Psychol. 39, 192–204.

Cowden, R. G., Anshel, M. H., and Fuller, D. K. (2014). Comparing athletes' and their coaches' perceptions of athletes' mental toughness among elite tennis players. J. Sport. Behav. 37, 221–235.

Cowden, R. G., Meyer-Weitz, A., and Oppong Asante, K. (2016). Mental toughness in competitive tennis: relationships with resilience and stress. Front. Psychol. 7:320. doi: 10.3389/fpsyg.2016.00320

Cowden, R. G. (2017). Mental toughness and success in sport: a review and prospect. Open. Sports Sci. J. 10, 1–14. doi: 10.2174/1875399X01710010001

Crust, L. (2008). A review and conceptual re-examination of mental toughness: implications for future researchers. Pers. Indiv. Differ. 45, 576–583. doi: 10.1016/j.paid.2008.07.005

Flora, D. B., and Flake, J. K. (2017). The purpose and practice of exploratory and confirmatory factor analysis in psychological research: directions for scale development and validation. Can. J. Beh. Sci. 49, 78–88. doi: 10.1037/cbs0000069

Gerber, M., Kalak, N., Lemola, S., Clough, P. J., Perry, J. L., Pühse, U., et al. (2013). Are adolescents with high mental toughness levels more resilient against stress? Stress Health 29, 164–171. doi: 10.1002/smi.2447

Gough, D., Oliver, S., and Thomas, J. (2017). An Introduction to Systematic Reviews, 2nd Edn. Los Angeles, CA: Sage.

Gucciardi, D. F., and Jones, M. I. (2012). Beyond optimal performance: mental toughness profiles and developmental success in adolescent cricketers. J. Sport Exerc. Psychol. 34, 16–36. doi: 10.1123/jsep.34.1.16

Gucciardi, D. F., Hanton, S., and Mallett, C. J. (2012). Progressing measurement in mental toughness: a case example of the Mental Toughness Questionnaire 48. Sport Exerc. Perform. 1, 194–214. doi: 10.1037/a0027190

Gucciardi, D. F., Hanton, S., and Mallett, C. J. (2013). Progressing measurement in mental toughness: a response to Clough, Earle, Perry, and Crust. Sport Exerc. Perform. 2, 157–172. doi: 10.1037/spy0000002

Gucciardi, D. F. (2017). Mental toughness: progress and prospects. Curr. Opin. Psychol. 16, 17–23. doi: 10.1016/j.copsyc.2017.03.010

Lakens, D., and Evers, E. (2014). Sailing from the seas of chaos into the corridor of stability: practical recommendations to increase the informational value of studies. Perspect. Psychol. Sci. 9, 278–292. doi: 10.1177/1745691614528520

Lin, Y., Mutz, J., Clough, P. J., and Papageorgiou, K. A. (2017). Mental toughness and individual differences in learning, educational and work performance, psychological well-being, and personality: a systematic review. Front. Psychol. 8:1345. doi: 10.3389/fpsyg.2017.01345

Loevinger, J. (1957). Objective tests as instruments of psychological theory. Psychol. Rep. 3, 635–694. doi: 10.2466/pr0.1957.3.3.635

Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., et al. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst. Rev. 4:1. doi: 10.1186/2046-4053-4-1

Oliveras, I., Losilla, J.-M., and Vives, J. (2017). Methodological quality is underrated in systematic reviews and meta-analyses in health psychology. J. Clin. Epidemiol. 86, 59–70. doi: 10.1016/j.jclinepi.2017.05.002

Perry, J. L., Clough, P. J., Crust, L., Earle, K., and Nicholls, A. R. (2013). Factorial validity of the Mental Toughness Questionnaire-48. Pers. Indiv. Differ. 54, 587–592. doi: 10.1016/j.paid.2012.11.020

Perry, J. L., Nicholls, A. R., Clough, P. J., and Crust, L. (2015). Assessing model fit: caveats and recommendations for confirmatory factor analysis and exploratory structural equation modelling. Meas. Phys. Educ. Exerc. Sci. 19, 12–21. doi: 10.1080/1091367X.2014.952370

Pluye, P., and Hong, Q. N. (2014). Combining the power of stories and the power of numbers: mixed methods research and mixed studies reviews. Annu. Rev. Public Health 35, 29–45. doi: 10.1146/annurev-publhealth-032013-182440

Schönbrodt, F. D., and Perugini, M. (2013). At what sample size do correlations stabilise? J. Res. Pers. 47, 609–612. doi: 10.1016/j.jrp.2013.05.009

Shamseer, L., Moher, D., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., et al. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ 350:g7647. doi: 10.1136/bmj.g7647

Vandenbroucke, J. P., von Elm, E., Altman, D. G., Gøtzsche, P. C., Mulrow, C. D., Pocock, S. J., et al. (2014). The strengthening the reporting of observational studies in epidemiology (STROBE) statement: explanation and elaboration. Int. J. Surg. 12, 1500–1524. doi: 10.1016/j.ijsu.2014.07.014

Vaughan, R., Hanna, D., and Breslin, G. (2017). Psychometric properties of the Mental Toughness Questionnaire 48 (MTQ48) in elite, amateur and non-athletes. Sport Exerc. Perform. doi: 10.1037/spy0000114

von Elm, E., Altman, D. G., Egger, M., Pocock, S. J., Gøtzsche, P. C., and Vandenbroucke, J. P. (2014). The strengthening the reporting of observational studies in epidemiology (STROBE) statement: guidelines for reporting observational studies. Int. J. Surg. 12, 1495–1499. doi: 10.1016/j.ijsu.2014.07.013

Keywords: mentally tough, psychometrics, factorial validity, methodological quality, construct validity

Citation: Gucciardi DF (2018) Commentary: Mental Toughness and Individual Differences in Learning, Educational and Work Performance, Psychological Well-being, and Personality: A Systematic Review. Front. Psychol. 8:2329. doi: 10.3389/fpsyg.2017.02329

Received: 20 November 2017; Accepted: 21 December 2017;
Published: 09 January 2018.

Edited by:

Maurizio Bertollo, Università degli Studi “G. d'Annunzio” Chieti - Pescara, Italy

Reviewed by:

Mustafa Sarkar, Nottingham Trent University, United Kingdom
Xavier Sanchez, Halmstad University, Sweden

Copyright © 2018 Gucciardi. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Daniel F. Gucciardi, daniel.f.gucciardi@gmail.com