
SYSTEMATIC REVIEW article

Front. Educ., 13 November 2025

Sec. Assessment, Testing and Applied Measurement

Volume 10 - 2025 | https://doi.org/10.3389/feduc.2025.1673500

Measuring persistence and academic resilience of K−12 students: systematic review and operational definitions

  • 1Educational Testing Service, Princeton, NJ, United States
  • 2Brighter Research, Thousand Oaks, CA, United States
  • 3Department of Educational Psychology, University of Illinois Urbana-Champaign, Urbana, IL, United States
  • 4School of Teaching and Learning, University of Florida, Gainesville, FL, United States
  • 5Department of Computer & Information Science & Engineering, University of Florida, Gainesville, FL, United States

Persistence and academic resilience are two non-cognitive skills that are important for students' long-term academic success. To date, there has been a lack of consensus on how to define and measure these critical constructs at the K−12 level. Clear operational definitions and valid measures are essential to assess students' competencies with respect to these two skills and to evaluate how these skills may develop through educational interventions. To address this gap, we conducted a systematic review of 74 studies to synthesize definitions of persistence and academic resilience and evaluate available measures based on evidence of reliability, validity, and fairness. After reviewing the various definitions of persistence and academic resilience, we proposed synthesized definitions. We defined persistence as involving sustained effort toward completion of a goal-directed task despite challenges or difficulties and further broke this down into four components: presence of a goal-directed task, presence of challenges or difficulties, sustained effort, and task completion. Academic resilience was defined as the process of bouncing back or recovering in the face of challenges, adversities, or stressors to achieve successful outcomes (e.g., academic achievement) by using adaptive behaviors or coping strategies over time. Our results revealed wide variation in how existing measures align with these synthesized construct definitions. For persistence, self-report instruments such as the Attitude and Persistence toward STEM Scale and the Academic Self-Efficacy Scale demonstrated strong alignment with all four components. For academic resilience, the Adolescent Resilience Questionnaire and Design My Future Scale encompassed all facets of academic resilience. Behavioral measures were less commonly available, particularly for academic resilience. 
Additionally, our review revealed that both constructs are empirically linked with other social-emotional competencies (e.g., self-efficacy, self-regulation), suggesting an important avenue for future research and intervention development at the K−12 level. We conclude with recommendations for selecting and adapting measures in K−12 settings and offer suggested directions for future research.

Introduction

Educators, researchers, and policymakers have long been concerned with understanding how students recover from setbacks and persist through academic challenges. This concern is particularly salient today, as students face growing academic demands and many continue to experience systemic inequities. Non-cognitive characteristics, including social-emotional skills and personality traits, have emerged as critical factors influencing students' academic success. A growing body of work has demonstrated that these attributes can help students navigate challenges and recover from setbacks in school (Credé and Kuncel, 2008; Duckworth and Yeager, 2015; Farrington et al., 2012; García-Martínez et al., 2022; Molnár and Kocsis, 2024; Noftle and Robins, 2007; Poropat, 2009; West et al., 2016; Yang and Wang, 2022). Persistence and academic resilience are two particularly important non-cognitive constructs linked to students' long-term academic success (Andersson and Bergman, 2011; Caporale-Berkowitz et al., 2022; Farrington et al., 2012; Hattie, 2009; Hunsu et al., 2023; Kälin and Oeri, 2024; Richardson et al., 2012; Yaure et al., 2021).

However, a challenge for research on persistence and academic resilience is the lack of consensus on “gold standard” definitions or measures (Constantin et al., 2011; DiNapoli, 2023; Rudd et al., 2021; Windle, 2010; Windle et al., 2011). While some researchers use these terms interchangeably (e.g., Zhang et al., 2021), others define one construct as a component of the other. For example, persistence has been conceived both as a subcomponent of academic resilience (e.g., Cassidy, 2016) and as an outcome that is influenced by academic resilience (as well as other contextual factors; Joseph et al., 2017); on the other hand, resilience has also been considered a component of persistence (e.g., Skinner et al., 2022, Figure 1). To some degree, this variety in terminology and conceptualization reflects differences in theoretical orientations, from perspectives grounded in motivational theories (Skinner and Pitzer, 2012) to personality psychology (McCrae and Costa, 1987) and beyond. Differences in how these constructs are conceptualized are reflected in how they are measured, with different instruments including different framings and item content aligned to the underlying definitions (Caporale-Berkowitz et al., 2022). Understanding how to define these constructs and how to measure them reliably, fairly, and validly is vital for ensuring that appropriate assessments are used to evaluate students' levels of these non-cognitive skills, as well as for informing interventions supporting their development (Duckworth and Yeager, 2015; West et al., 2018).

Figure description: PRISMA flowchart of identification, screening, and inclusion. In the database/register arm, 1,296 records were identified and reduced to 747 after duplicate removal; after screening, 47 reports were assessed for eligibility. In the other-methods arm, 125 records were identified and 124 were assessed for eligibility. In total, 74 studies were included in the review, with exclusions noted at each step for reasons including language, educational context, sample criteria, and validity evidence.

Figure 1. PRISMA diagram of screening process for systematic review.

In this article, we argue that the theoretical heterogeneity apparent in the educational literature underscores the need to better understand how persistence and academic resilience are defined and operationalized and which measures are the most appropriate to use, especially for diverse samples of K−12 students. Clear definitions and valid measures are necessary to enable researchers to design high-quality studies and for practitioners and policymakers to make informed decisions to assess and improve students' persistence and resilience (Farrington et al., 2012). Thus, we conducted a systematic review to synthesize relevant construct definitions and identify the most appropriate measures based on available evidence of validity, reliability, and fairness.

Literature review

The importance of persistence and resilience to student development and academic achievement is widely acknowledged (Farrington et al., 2012; Lee and Shute, 2010; Morrison et al., 2006; Yeager and Dweck, 2012). However, there are a variety of views on how persistence and academic resilience manifest and how they can be measured. In this section, we highlight relevant literature discussing the theoretical foundations of persistence and academic resilience, the reasons why measuring these constructs is critically important, and how the current study advances our understanding of persistence and academic resilience in K−12 education.

Theoretical foundations of persistence and academic resilience

A wide variety of theoretical perspectives speak to the importance of persistence and academic resilience, generally emphasizing learners' capacity to sustain effort and recover from setbacks in pursuit of long-term goals. These frameworks include self-determination theory (SDT; Ryan and Deci, 2000), the Development-in-Sociocultural-Context Model (Wang et al., 2019), and the Complex Social Ecologies of Academic Motivation (Skinner et al., 2022), among many others. While these theories use similar terminology of “persistence” and “resilience” in academic contexts, we often see important, nuanced differences in authors' rationales as to why they are important, how they manifest, or how they should be measured. For example, SDT suggests that students are more likely to persist through challenges when three psychological needs are met: relatedness (e.g., feeling connected to others), competence (e.g., feeling capable of achieving success), and autonomy (e.g., feeling a sense of agency over one's actions; Ryan and Deci, 2000; Deci and Ryan, 2002). These needs serve as the foundation for intrinsic motivation (e.g., inherent enjoyment), which, in turn, supports engagement and the capacity to persist when one encounters challenges or setbacks.

Meanwhile, the Development-in-Sociocultural-Context Model (Wang et al., 2019) and the Complex Social Ecologies of Academic Motivation framework (Skinner et al., 2022; Skinner and Pitzer, 2012) further situate persistence and academic resilience within a broader developmental, social, and ecological system. The Development-in-Sociocultural-Context Model highlights how individual motivational processes are embedded within developmental trajectories shaped by peer, school, and cultural contexts. From this perspective, persistence and resilience are not merely individual characteristics but dynamic processes co-constructed through ongoing interaction with cultural and social factors in the environment. This model emphasizes that support from these factors could serve as critical proximal processes that foster persistence, while stressors in these contexts may undermine resilience (Wang et al., 2019). Importantly, Wang et al. (2019) emphasize the interplay between developmental competencies (e.g., cognitive strategies) and motivational beliefs (e.g., perceived competence) for supporting persistence. For example, a student may have strong problem-solving skills but decide not to persist if they attribute a setback to a lack of ability. Supportive contexts can help strengthen competence beliefs, thereby fostering persistence and resilience. Similarly, the Complex Social Ecologies framework (Skinner et al., 2022; Skinner and Pitzer, 2012) emphasizes that students' motivation unfolds within overlapping “motivational ecologies,” in which classrooms, peers, schools, and cultural environments work together to impact the balance of support and challenge experienced by students. Skinner et al. (2022) show that these dynamics depend on whether students perceive their classrooms as supportive spaces, where teachers provide warmth, clear structure, and meaningful opportunities for autonomy, or as risky spaces, where interactions are marked by coercion, disorder, or peer rejection. 
Persistence and resilience, in this view, emerge from the ways students draw on available supports and manage the demands in their environments, with adaptive capacities continually shaped by multiple social and cultural systems over time.

Taken together, these frameworks illustrate the importance of persistence and resilience in academic contexts and how they are impacted by developmental, social, and ecological contexts. However, the diverse operationalization among various theoretical frameworks creates challenges for generalization and interpretation. These frameworks thus suggest that measures of persistence and resilience should not only capture students' individual effort but also the ways in which their social and cultural contexts enable or constrain their ability to persist or “bounce back.” Given that students begin to encounter academic challenges and engage in complex problem-solving in elementary school, a clear understanding of how persistence and resilience are defined, measured, and fostered within the K−12 context is particularly crucial.

Measuring persistence and academic resilience in the K−12 context

A growing body of research has explored interventions aimed at promoting persistence and academic resilience in various learning settings. For example, strategies such as productive failure (Kapur, 2008), erroneous examples (Richey et al., 2019), and affect-aware feedback (Grawemeyer et al., 2017) have been shown to help students navigate confusion and frustration, leading to greater learning gains. These interventions recognize that persistence is not simply about students pushing through difficulties but involves strategically managing effort, receiving timely feedback, and developing metacognitive awareness of their learning processes. Importantly, studies across K−12 and higher education show that such interventions can be effective for diverse learners, although the success of these strategies often depends on contextual factors like task difficulty, teacher support, and student characteristics (Lodge et al., 2018).

While persistence and academic resilience are important at all stages of learning, fostering their development is particularly critical in the K−12 context. During early school experiences throughout childhood and adolescence, students are still forming their academic identities (Fallon, 2010), as well as their beliefs about learning, intelligence, and their ability to overcome challenges (Farrington et al., 2012). Confusion and frustration can often occur when students become stuck and are unsure how to proceed or experience moments of failure (Baker et al., 2025), both of which introduce opportunities for students to leverage their persistence and resilience skills to achieve positive outcomes. Students who learn to work through frustration and remain engaged despite setbacks are more likely to succeed in postsecondary education and the workforce (Jones and Kahn, 2017). Therefore, waiting until postsecondary education to address students' persistence and resilience gaps is too late for many (McClelland et al., 2017). Large-scale reviews have emphasized the need to integrate social, emotional, and motivational skill-building early and consistently throughout K−12 schooling (Dusenbury et al., 2015).

Importantly, students encounter structured academic demands and social norms about performance when they enter elementary school. These environments provide opportunities to practice adaptive responses to failure, confusion, and frustration, which are important conditions for cultivating resilience (Tough, 2016). In fact, several interventions at the K−12 level have aimed to promote persistence and/or resilience (e.g., Irfan Arif and Mirza, 2017; Cook et al., 2019; Gamlem et al., 2019), and in general these interventions have successfully supported students' persistence and resilience. However, the ways in which persistence and resilience are conceptualized and measured tend to differ across studies, making it difficult to draw conclusions across interventions (DiCerbo, 2016; Moore and Shute, 2017; Rudd et al., 2021; Tudor and Spray, 2018; Windle et al., 2011). Thus, as researchers continue to develop interventions targeting persistence and academic resilience, it is imperative that we conceptualize and measure these constructs appropriately.

The present study

Despite their importance, persistence and academic resilience remain complex and variably defined constructs in the literature. Some researchers conceptualize persistence as time-on-task or repeated attempts to solve a problem, while others frame it as a motivational trait involving sustained engagement over time (DiCerbo, 2016; Moore and Shute, 2017). Persistence is sometimes treated as synonymous with constructs like perseverance or grit, while other researchers argue that they are, in fact, distinct in terms of definitions, relevant measures, timescales, and grain size of analyses (see DiNapoli, 2023). Similarly, academic resilience is sometimes defined as the ability to recover from failure in a single learning episode, while other studies treat it as a broader long-term capacity to overcome systemic barriers to success (Bashant, 2014; Tudor and Spray, 2018; Windle, 2010). These varying definitions pose challenges for measurement and intervention design and highlight the need for more precise operationalization and reliable, valid assessments. Addressing this complexity is critical for advancing both theoretical understanding and practical applications in learning environments.

To summarize, despite the largely agreed-upon importance of developing persistence and academic resilience in K−12 education, there is no clear consensus on how these constructs are defined and measured. Prior studies often use varying terminology, measure overlapping yet distinct constructs (e.g., grit, engagement, perseverance), and draw on a variety of theoretical definitions, making it difficult to synthesize findings or generalize results across age groups. This lack of consistency in construct definition and measurement has implications for both research design and educational practice. To address these gaps, we conducted a systematic review of existing research on persistence and academic resilience in K−12 settings. Specifically, we sought to answer the following research questions:

RQ1. How has persistence been defined and measured in prior research?

RQ2. How has academic resilience been defined and measured in prior research?

Methods

Literature search

Our systematic review generally followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist (Stevens et al., 2018). Figure 1 illustrates the identification and screening processes applied during the review.

On October 25, 2023, we used the following search string to search seven different databases and registers for abstracts (see Figure 1): ((persist* OR resil*) AND (learn*) AND (K-12 OR primary* OR secondary* OR grade OR “elementary” OR “middle” OR “high”) AND (measur* OR instru* OR scale* OR surv* OR assess*) AND (valid*)) NOT (teacher). This process revealed 1,320 potential records for inclusion.

Inclusion and exclusion criteria

In order to be analyzed in this systematic review, studies had to meet the following five inclusion criteria:

1. Be available in English.

2. Be relevant to educational contexts (i.e., the theoretical framing, instruments administered, or study context must connect to educational issues, such as classroom learning, digital learning environments, academic achievement, etc.).

3. Include data sampled from K−12 learners.

4. Include at least one measure of persistence or resilience, including self-report measures, behavioral measures, other-report measures, or observational protocols.

5. Report at least one type of original quantitative evidence of reliability or validity for the measure(s) of persistence or resilience for the given sample.

In terms of reliability and validity evidence, we followed conceptual definitions and examples put forth in the Standards for Educational and Psychological Testing (AERA, NCME, and APA, 2014). Thus, we looked for reported metrics of internal consistency (i.e., Cronbach's α), inter-rater reliability, test-retest reliability, split-half reliability, and other relevant evidence of scale reliability, as well as evidence of content validity, cognitive validity, dimensionality (i.e., internal factor structure), and convergent and divergent validity (i.e., relationships to conceptually related and conceptually distinct constructs). We also examined evidence of fairness (e.g., appropriateness for use with diverse populations based on results from subgroup analyses of relevant samples) and sensitivity to change (e.g., responsiveness to intervention, change over time) where available.
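As an illustration of the internal-consistency evidence we looked for, Cronbach's α can be computed directly from an item-response matrix. The sketch below is a generic illustration rather than code from any reviewed study; the function name and data layout are our own.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # per-item sample variance
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```

Values approaching 1.0 indicate that items covary strongly, i.e., that the scale is internally consistent for the given sample.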

We excluded studies if they did not include a measure of persistence or resilience specifically. Studies measuring persistence were excluded if they only administered the grit scale1 (Duckworth et al., 2007; Duckworth and Quinn, 2009) or general measures of conscientiousness, engagement, behavioral and emotional strengths, self-concept, or interest that did not include a subscale specific to persistence. Studies were also excluded if they only concerned “pipeline persistence,” such as intent to complete a course of study or to pursue higher education, or variables indicating that students completed a course of study or enrolled in college or a particular major. Studies measuring resilience were excluded if they did not administer any self-report or behavioral instruments but rather categorized or labeled students as “resilient” (or not) based on demographic or socioeconomic risk factors or risk factors in conjunction with demonstrated achievement (e.g., students who demonstrate high achievement despite low socioeconomic status or belonging to a racial/ethnic minority group; Waxman et al., 2003).

Study screening

Abstract screening

After removing duplicates, 747 records sourced from databases and registers were screened using ASReview (ASReview LAB Developers, 2023), which supports more efficient abstract screening by using machine learning to rank abstracts by predicted relevance based on a training set (here, 2 relevant and 4 irrelevant abstracts) and subsequent human judgments of each abstract as relevant or irrelevant. Campos et al. (2024) examined what percentage of abstracts needed to be screened to locate 95% of the studies ultimately accepted in educational psychology systematic reviews, finding that screening 60% of records was sufficient when using logistic regression with Sentence BERT embeddings (see Reimers and Gurevych, 2019). Accordingly, we used Sentence BERT for feature extraction and logistic regression as the classification method, along with "maximum" as the query approach and dynamic resampling. We screened 580 (77.6%) of the 747 records, terminating screening after more than 250 consecutive records had been judged irrelevant. This process identified 62 records as relevant, for which full texts were requested.
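The stopping rule we applied can be sketched as a simple loop over the model-ranked records. This is a schematic of the heuristic only, not ASReview's actual API; `judge` stands in for the human screener's relevance decision.

```python
def screen_until_exhausted(ranked_records, judge, stop_after=250):
    """Screen records in model-ranked order; stop after `stop_after`
    consecutive records are judged irrelevant."""
    relevant, consecutive_irrelevant = [], 0
    for record in ranked_records:
        if judge(record):                # human judgment: True = relevant
            relevant.append(record)
            consecutive_irrelevant = 0
        else:
            consecutive_irrelevant += 1
            if consecutive_irrelevant >= stop_after:
                break                    # assume remaining records are irrelevant
    return relevant
```

Because the ranking model pushes likely-relevant abstracts to the front of the queue, a long run of irrelevant judgments signals that the remaining records are unlikely to qualify.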

Full-text screening

In addition to the 62 studies identified through abstract screening, we also conducted additional searches to locate more relevant studies. As shown in Figure 1, informal literature searches, studies shared by colleagues, and citation searching from other articles (including existing reviews) produced an additional 125 studies. We requested full texts of these 125 studies (one could not be retrieved) and the 62 studies identified from abstract screening (15 could not be retrieved). Altogether, 171 studies were successfully retrieved and subjected to full-text screening based on the inclusion/exclusion criteria described previously. The reasons studies were excluded are detailed in Figure 1.

Ultimately, 74 studies met the criteria for inclusion in the systematic review following this screening process, with 16 studies sourced from abstract screening and 58 studies identified through other means.

Inter-rater reliability

Two coders participated in the full-text screening process. The coders jointly screened two papers, then independently screened six papers, resolving discrepancies through discussion and refining the inclusion/exclusion criteria as edge cases emerged. Additional rounds of independently coding six papers and resolving disagreements continued until 32 papers (approximately 20–25%) had been double-coded for inclusion/exclusion. The two coders then independently screened 10 additional full texts (approximately 10%), achieving perfect inter-rater reliability for inclusion/exclusion (100% agreement; κ = 1.00), with five papers included and five excluded. Following reliability coding, the remaining full texts were divided between the two researchers for independent screening, and the coders met as needed to review select cases and reach agreement on inclusion/exclusion decisions.
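For reference, the agreement statistics reported above can be computed as follows. This is a generic sketch of Cohen's κ for two raters; the function and example data are illustrative, not our actual screening records.

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa for two raters' paired categorical judgments."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n   # observed agreement
    c1, c2 = Counter(rater1), Counter(rater2)
    p_exp = sum((c1[cat] / n) * (c2[cat] / n)                 # chance agreement
                for cat in set(rater1) | set(rater2))
    if p_exp == 1.0:
        return 1.0  # degenerate case: both raters used a single category
    return (p_obs - p_exp) / (1 - p_exp)
```

With 100% agreement across both categories (include and exclude), κ equals 1.00, as in our reliability sample.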

Data extraction

Data were extracted from studies in two phases. First, operational definitions were extracted for each construct by taking direct quotations from the studies and noting the relevant page number(s) for these definitions. If no explicit definition was stated, this was noted.

Next, details on available measures (i.e., name, purpose and use case, type of measure, structure of measure) were extracted, first for persistence (RQ1) and then for resilience (RQ2). We considered purpose and use case, as well as available evidence of reliability, validity, and fairness (AERA et al., 2014), to evaluate the technical quality of the measures. Thus, we also extracted information about the samples tested, subgroup comparisons evaluated, and available reliability and validity evidence reported in each study to aid in our analysis of the instruments. See Table 1 for details on the categories of information extracted for each measure.


Table 1. Coding categories for data extraction.

Two coders participated in data extraction, each extracting data from approximately half the studies. The coders first met and co-coded a few studies to ensure consistency in interpreting the extraction categories (see Table 1). After a trial run of data extraction, the entire team met to review the data extracted thus far and discuss common challenges encountered when extracting the data. After discussion, the two coders continued data extraction and would meet periodically to discuss common challenges and ensure consistency across studies. Due to the iterative nature of this process, we did not formally calculate inter-rater reliability for data extraction.

Data availability

All data used in the analyses reported in this paper are provided in the Results section or in the Supplementary material.

Results

We located 74 papers that contained measures of persistence or resilience. Table 2 reports the distribution of the papers by the construct(s) and the type of measure(s) (self-report, behavioral, other, or multiple types) reported. Most of the papers reported on self-report (i.e., survey-based) measures of resilience (n = 40), followed by self-report persistence measures (n = 14). Relatively few behavioral measures (e.g., in-task actions, timing variables, or other behavioral observations) were reported, and most of these measured persistence (n = 6) or both constructs (n = 1). Five papers included measures that did not fall into self-report or behavioral categories, including teacher-report, parent-report, or observation protocols (i.e., coding schemes applied by researchers). Specifically, two papers reported on teacher-report instruments of persistence, one reported on a teacher-report instrument of resilience, and two papers reported on observation protocols for persistence. Altogether, six papers measured both constructs, but only one paper included both self-report and behavioral measures of persistence in the same study (Ventura and Shute, 2013), and two papers included both self-report and other-report instruments of resilience (Nese et al., 2012; Yang, 2014).


Table 2. Distribution of papers by measure type and construct.

RQ1. How has persistence been defined and measured in prior research?

Defining persistence

We reviewed construct definitions cited for persistence (and related constructs) as well as the associated theories or theoretical frameworks authors cited to inform their perspectives on this construct (see Table 3). Of 31 papers, six did not include an explicit definition for persistence, and six did not cite a relevant theoretical framework for persistence (two did not include either a definition or a theoretical framework). Of the remaining 25 papers that did cite particular theoretical frameworks, the theories varied, and some papers cited multiple frameworks. The most commonly cited theories were related to motivation and engagement, which characterize persistence as an adaptive state reflecting engaged, effortful behavior (e.g., Martin's Motivation and Engagement Wheel; seven papers), and personality factors, which characterize persistence as a facet of conscientiousness reflecting a disposition to persist (e.g., Big Five; seven papers). Other theories cited included social-emotional learning (three papers), self-regulated learning (three papers), and self-efficacy (two papers). Social cognitive theory, mastery behaviors, executive functioning, cognitive load, and several other frameworks (e.g., twenty-first century skills) were each cited once among the reviewed papers.


Table 3. Definitions of persistence in the literature review.

Despite diversity in their theoretical bases, examination of the persistence definitions did highlight some common themes, such as continuation of effort on a goal-directed task in the face of challenge or difficulty (Chichekian and Vallerand, 2022; DiCerbo, 2014; Martin, 2007, 2009; Porter et al., 2020), working until tasks are completed (OECD, 2021a; Tan et al., 2014; Ventura and Shute, 2013), and unlimited attempts or time allotted to work on difficult (or unsolvable) problems (see Feather, 1962, cited in Lufi and Cohen, 1987; Rahimi et al., 2021). Persistence is alternately characterized as equivalent to effort, engagement, resilience, perseverance, and grit; as a facet of effort, engagement, resilience, goal-directed behavior, conscientiousness, or grit; or as a multidimensional construct (flexible vs. rigid, productive vs. unproductive). We return to the issue of relationships among persistence and related motivational constructs in the Discussion.

Taken together, we propose a synthesized definition of (task) persistence as involving sustained effort toward completion of a goal-directed task despite challenges or difficulties. Persistence may especially be observed when students are faced with the opportunity to complete multiple tasks or to attempt the task or subtask multiple times (e.g., revise their work). The obstacles faced may come from the difficulty (or solvability) of the problems themselves, from external forces (“opposed by other people”; Fang et al., 2017), or internal states (e.g., “inclination toward the task”; Kai et al., 2018). This definition captures the predominant, recurring themes across the available definitions of persistence presented in Table 3.

Review of persistence measures

We reviewed measures of persistence described in 31 papers; one paper (Ventura and Shute, 2013) included three measures of persistence (two behavioral, one self-report), and some papers used common measures. In total, we reviewed evidence for 28 distinct measures.

We summarized details and reliability evidence for each measure by type, including behavioral measures (n = 9; Supplementary Table 1a), self-report measures (n = 16; Supplementary Table 2a), and other measures (i.e., one teacher-report instrument and two observational measures; Supplementary Table 3a). We also summarized validity evidence reported for each measure in Supplementary Tables 1b–3b. Within each table, measures are sorted roughly by age/grade level, with consideration of availability for use as well as available reliability and validity evidence.

Behavioral measures of persistence

Our review included eight papers describing a total of nine behavioral measures of persistence derived from students' direct interactions with digital tasks (e.g., time-based measures and other actions or behaviors captured in digital log files or clickstream data). In terms of availability (Supplementary Table 1a), many of these measures are readily implemented or adapted to other digital task contexts. The measures spanned a wide range of grade levels, from first to twelfth grade: some were evaluated with middle school students, some with high school students, and several with a combination of elementary, middle, and high school students. Additionally, six of the nine measures were designed for domain-specific contexts (i.e., mathematics, science, or physics problem solving), while three were domain-general. Reliability evidence, when reported, indicated adequate reliability; however, most measures (six of nine) did not report such evidence. Thus, to the extent possible, future research should compute and report reliability metrics (e.g., internal consistency, inter-rater reliability, test-retest reliability, split-half reliability) to ensure the adequate technical quality of behavioral indicators. Despite limited reliability evidence, all measures in this category reported some form of validity evidence in the studies reviewed (Supplementary Table 1b).

The most straightforward behavioral measure to apply is DiCerbo's (2014) behavioral indicators of task persistence (i.e., time on task and number of task events). This measure is readily available for use in digital learning tasks where time and the number of events (actions) can be tracked through log files. The measures showed adequate reliability (α's > 0.80) and evidence of sensitivity to students' development (i.e., scores increased by grade level), and validity evidence showed good fit for models based on this measure. The domain-general nature of this measure and its use across a wide range of ages (grades 1 through 9) imply that it can readily be adapted for use in various interactive task contexts, so long as those tasks include multiple events nested within multiple levels, units, etc. Measures created by Tan et al. (2014) in the context of the ASSISTments mathematics learning platform were also highly reliable (α's > 0.90) and could be promising for many research teams but would likely require some adaptation to the specifics of the target learning task if applied outside of ASSISTments.
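To illustrate how such log-derived indicators can be computed, the sketch below assumes a hypothetical log schema of (student, task, timestamp, event) rows; neither the schema nor the data come from DiCerbo (2014) or any other reviewed study:

```python
from datetime import datetime

# Hypothetical log rows: (student_id, task_id, ISO timestamp, event_type)
log = [
    ("s1", "t1", "2024-03-01T10:00:00", "start"),
    ("s1", "t1", "2024-03-01T10:00:40", "move"),
    ("s1", "t1", "2024-03-01T10:02:10", "submit"),
]

def task_indicators(rows):
    """Per (student, task) pair: time on task (seconds) and event count."""
    spans = {}
    for sid, tid, ts, _event in rows:
        t = datetime.fromisoformat(ts)
        key = (sid, tid)
        first, last, n = spans.get(key, (t, t, 0))
        spans[key] = (min(first, t), max(last, t), n + 1)
    return {key: {"time_on_task": (last - first).total_seconds(), "events": n}
            for key, (first, last, n) in spans.items()}

print(task_indicators(log))
# {('s1', 't1'): {'time_on_task': 130.0, 'events': 3}}
```

Real clickstream data would of course require decisions about idle time, session boundaries, and event filtering before indicators like these can be interpreted.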

Other behavioral measures developed in domain-specific contexts [i.e., Boe et al., 2002; Fang et al., 2017; Kai et al., 2018; Rahimi et al., 2021; Game-based Assessment of Persistence (GAP), Ventura and Shute, 2013] did not report reliability evidence and therefore should be used with caution; if these measures are selected, reliability evidence should be collected and analyzed to aid in the interpretation of results. Porter et al. (2020) likewise did not report reliability for the domain-general Persistence, Effort, Resilience, and Challenge-Seeking (PERC) task or its subscores, likely due to the small number of observations per subscore, although some validity evidence for the overall task and subscores is reported. In the PERC task, persistence was operationalized as total time spent on four difficult puzzles; this time-based measure was rescaled to range from 0 to 1 so that subscores could be summed into an overall PERC score. Implementing these tasks would require researchers to program their own versions of the tasks and to apply the specified sample-dependent scoring approaches; available evidence is not sufficient to support implementing the persistence measure alone. Researchers choosing this measure will therefore need additional effort to prepare the tasks for administration, scoring, and validation, and may prefer measures not tied to this specific set of puzzle-solving tasks. The final measure in Supplementary Table 1a, the Performance Measure of Persistence (PMP; Ventura and Shute, 2013), was sufficiently reliable but is not readily available for use and therefore also not recommended.
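Porter et al.'s (2020) exact conversion is sample-dependent and not reproduced here; a min-max rescaling of raw times into the 0–1 range is one plausible sketch of such a transformation (the times below are hypothetical):

```python
def normalize_01(values):
    """Min-max rescale a sample of raw scores to [0, 1].

    Sample-dependent: the minimum and maximum come from the
    observed sample, so scores are only comparable within it.
    """
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical total seconds spent on the four difficult puzzles:
times = [120, 300, 240, 480]
persistence_subscores = normalize_01(times)
print([round(s, 2) for s in persistence_subscores])  # → [0.0, 0.5, 0.33, 1.0]
```

Because the rescaling is anchored to the sample's own minimum and maximum, scores from separately scored samples cannot be compared directly, which is one reason such scoring complicates reuse of the measure.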

Among the behavioral measures of persistence, we observed that only the GAP and PMP (Ventura and Shute, 2013) were evaluated against an external survey-based measure of persistence, analyses of which indicated that the behavioral measures and survey measures were uncorrelated (although use of the PMP is not recommended because the items are not readily available). Porter et al. (2020) evaluated the PERC task against a self-report measure of mastery behaviors, but there were not sufficient items within the self-report survey to isolate a persistence subscore. Future research utilizing behavioral measures of persistence should also include relevant and robust self-report survey-based measures of persistence to examine the extent to which behavioral measures assess a similar or different construct from well-established surveys. We evaluate such survey-based measures in the following section.

Self-report measures of persistence

Our review included 20 papers that described a total of 16 distinct self-report measures of persistence appropriate for K−12 students (Supplementary Table 2a); some measures were included in two or more studies and are therefore grouped together in the table. When a measure was adapted for a different subject context (i.e., a domain-general instrument that was contextualized for a particular domain; e.g., Green et al., 2007) or was shortened from its original form (e.g., Martin et al., 2021), we considered it a distinct instrument.

Notably, almost all of the self-report measures of persistence reflect a subscale within a larger scale measuring some aspect of motivation, engagement, attitude, social-emotional skills, etc. Depending on additional constructs covered, researchers may opt to use the persistence subscales alone, in conjunction with additional relevant subscales, or the original scale with all subscales. Measures spanned a wide range of ages and grade levels, from first grade through twelfth grade. Seven of the measures were designed for use in domain-specific contexts, such as mathematics, science, or English. All the measures reviewed reported some form of reliability evidence, and most reported adequate reliability; however, four of the persistence measures are not sufficiently reliable to recommend subsequent use (i.e., α < 0.70; Chichekian and Vallerand, 2022; Esen-Aygun and Sahin-Taskin, 2017; Luby et al., 1999; Lufi and Cohen, 1987). All self-report measures reported some form of validity evidence in the studies reviewed (Supplementary Table 2b).

Altogether, seven measures (reported in eight sources) are readily available for use in subsequent research because the full text of each item is available in the reviewed publication or another source cited within that publication. A further three measures are available in either the reviewed publication or in the International Personality Item Pool (IPIP; Goldberg et al., 2006); however, due to reported adaptations or dropping items for subsequent analysis (e.g., due to low item-total correlation), it is unclear precisely which items should be included in the final scale (Luby et al., 1999; Miller et al., 1996; Ventura and Shute, 2013). Therefore, we do not recommend using these scales at this time due to the degree of inference required about the relevant items. The final nine papers report on survey instruments that are not readily available, either because the instrument is not available in English (Paulino et al., 2016) or because the scale is proprietary and only available at a cost, such as the Motivation and Engagement Scale-High School (MES-HS; Martin, 2007, 2009; Martin et al., 2021; see also Green et al., 2007) and the Student Motivation Scale (Martin and Marsh, 2003, 2006).

Of the seven measures that are readily available, five report adequate reliability evidence for the persistence measure or subscale and are therefore recommended for use in future research. Among these five measures, the number of items ranges from 6 to 15, suggesting that these instruments will be quick to administer. Three of the five instruments were contextualized within a specific content domain (i.e., STEM, mathematics, English). In terms of validity evidence (Supplementary Table 2b), these measures generally showed the expected factor structure, and most showed expected correlations to other motivation and engagement variables (e.g., self-efficacy, academic performance). To aid readers, we group these instruments by the age ranges for which validity and reliability evidence has been reported.

Elementary school and up

The Survey on Social and Emotional Skills (SSES; OECD, 2021a,b) is a domain-general instrument that captures social and emotional skills aligned with the Big Five personality domains. The persistence subscale consists of a seven-item scale (after dropping one poorly performing item) that offers a domain-general self-report measure of the general tendency to persist overall. The persistence scale had adequate reliability (α > 0.70 across all cities) and the overall instrument underwent rigorous validation, including multigroup confirmatory factor analyses to examine measurement invariance by age, gender, and sample location, as well as correlations with external measures of other social-emotional variables and student outcomes. Results for persistence indicated scalar invariance by age and gender (suggesting that scale scores can be directly compared across groups) and metric invariance by cities (indicating that within-group associations among variables can be compared across groups, but scale scores cannot directly be compared across cultures). The SSES was specifically designed for international, cross-cultural use with large and diverse samples of 10- and 15-year-olds (approximately 5th and 10th grade) across the globe and, accounting for metric invariance, seems appropriate for use within diverse elementary to high school contexts.

The Attitude and Persistence toward STEM Scale (APT-STEM; Sunny, 2018) is a 24-item scale with subscales for positive STEM attitudes (16 items) and persistence to succeed in STEM (eight items). This measure was highly reliable with a sample of students from 5th to 12th grade (α > 0.80). While confirmatory factor analysis indicated the two-factor structure had acceptable fit, the APT-STEM was not evaluated with convergent or divergent constructs (e.g., concurrent or predictive validity), so if this measure is selected, it is recommended to couple it with external measures to provide further validity evidence.

Middle school and up

The Academic Self-Efficacy Scale (ASES; Dullas, 2018) is a domain-general instrument that captures reports of the general tendency to persist in academics, evaluated for use with students in grades 7–10 in the Philippines. It consists of 62 items tapping four dimensions of academic self-efficacy: perceived control (12 items), competence (15 items), persistence (15 items), and self-regulated learning (20 items). The persistence subscale was highly reliable (α = 0.90), and the overall scale showed correlations in expected directions with external measures of self-efficacy and academic performance. The Self-Motivated Learning Inventory (SMLI; Bae, 2014) is a 43-item instrument developed to measure adolescents' characteristics relevant to motivation and self-regulated learning within the domains of English or Mathematics, based on several existing scales like the MSLQ (Pintrich et al., 1991). There are six subscales: self-efficacy (nine items), mastery goal (four items), performance avoidance goal (four items), effort and persistence (seven items), cognitive and metacognitive strategy use (12 items), and resource management (seven items). SMLI was evaluated with a sample of middle and high school students from South Korea, and the effort and persistence subscale was highly reliable (α = 0.89) for both math and English domains. The effort and persistence subscale also showed moderate correlations with achievement in both English and Mathematics.

High school

Finally, the vigor subscale of the Engagement Scale was evaluated for use with two samples of Fijian students in years 11 and 12 from multi-ethnic, metropolitan schools (Phan, 2016). The 17-item Engagement Scale, originally developed for use with undergraduate and adult populations (Schaufeli et al., 2002a,b), included three facets of engagement: absorption (six items), dedication (five items), and vigor (six items). Vigor was described as reflecting students' persistence and resilience (e.g., “As far as my studies in mathematics are concerned I always persevere, even when things do not go well”; Phan, 2016). The vigor subscale was highly reliable (α > 0.85) and showed significant correlations with the other subscales in expected directions, and CFA analyses showed good model fit.

Other measures of persistence

Our review included four studies that described approaches to measuring persistence that did not fall into the behavioral or self-report categories. Specifically, we identified two studies that included teacher-report surveys of student behavior and two studies involving behavioral coding schemes to be applied by trained researchers to analyze behavioral data captured through coding of video data or other verbal protocols (Supplementary Table 3a). Each of the studies reported reliability evidence. Three of the four studies reported some validity evidence (Supplementary Table 3b). For this group of measures, all are readily available, and each instrument has reported adequate reliability.

If researchers are interested in collecting teacher reports, the Learning Behaviors Scale (LBS) is recommended for use as a domain-general measure (see McDermott, 1999, for the original instrument). This teacher-report instrument was designed for students up through grade 12, and based on validity evidence summarized here, is appropriate for use with lower elementary (K-2) and elementary-to-middle school students (grades 1–7). The persistence subscale is reliable (α > 0.80), and available validity evidence indicates that the persistence subscale is distinct from three other factors within an overall four-factor structure, although we note that the two studies in our review (Canivez et al., 2006; Rikoon et al., 2012) used somewhat different labels for the four factors (see Supplementary Table 3a).

If researchers are coding student behaviors that are indicative of persistence, either the 3 Phase Perseverance (3PP) framework (DiNapoli and Miller, 2022) or the Collaborative Computing Observation Instrument (C-COI; Israel et al., 2016) is recommended, depending on purpose and grade level. The 3PP framework may only be suitable for use with high school students studying mathematics, and additional validation is required with other ages, grade levels, and subjects. If focusing on collaboration among elementary students, especially in a computer science context, the C-COI may be more appropriate. However, note that Israel et al. (2016) only report percent agreement and do not report validity evidence for the C-COI (Supplementary Table 3b). Thus, researchers are advised to use this measure with caution and to collect, evaluate, and report more robust reliability and validity evidence in their studies in order to contribute to the available evidence for this measure.

RQ2. How has academic resilience been defined and measured in prior research?

Defining resilience

We reviewed construct definitions cited for resilience (and related constructs) as well as the associated theories or theoretical frameworks authors cited to inform their perspectives on this construct (see Table 4). Of the 49 papers, four did not include an explicit definition for resilience, and 10 did not cite a relevant theoretical framework for resilience (with no overlap between these papers). Of the remaining 39 papers that did cite particular theoretical frameworks related to resilience, the theories varied, and some papers cited multiple frameworks. Notably, some of these theories overlapped with those cited for persistence (e.g., theories related to motivation and engagement, social-emotional learning, self-regulated learning, self-efficacy, and social cognitive theory). The most commonly cited theories were related to motivation and engagement, which characterize resilience as an adaptive state and energetic resource allowing students to cope with and recover from challenging tasks (e.g., Martin's Motivation and Engagement Wheel and Skinner and Pitzer's model of motivational resilience; eight papers), social-emotional learning frameworks, which characterize resilience as adapting to hardship or resistance to stress (e.g., CASEL's 2020 Social and emotional learning framework, OECD's Big Five dimensions of social and emotional skills framework; six papers), and social-ecological theories, which consider how individuals use internal and external resources to cope with adversity through a process of interactions within their environment and context (e.g., Ungar's social-ecological interpretation of resilience, Bronfenbrenner's ecological model; six papers). Other theories cited included developmental approaches (e.g., positive youth development; six papers), self-determination theory (four papers), social cognitive theory (three papers), and theories of intelligence (i.e., growth mindset; three papers). 
Additional theories such as academic buoyancy, self-efficacy, self-regulated learning, cognitive appraisal of resiliency, strength-based assessment, and personality theory were each mentioned in two or fewer papers.


Table 4. Definitions of resilience in literature review.

As with persistence, we observed considerable theoretical diversity among approaches to resilience. However, several common themes emerged, most notably the idea of overcoming or experiencing successful outcomes despite adversity (e.g., Anderson et al., 2020; Cui et al., 2023; Liebenberg et al., 2012; Ricketts et al., 2017), adapting to or effectively dealing with stress or setbacks (e.g., Anghel, 2015; Burger et al., 2012; Donnon and Hammond, 2007; Jew et al., 1999; Rajan et al., 2017), coping with challenges (e.g., Green et al., 2021; Skinner et al., 2013; Victor-Aigboidion et al., 2018), and recovery after failure (e.g., Porter et al., 2020; Sari et al., 2022; Zulfikar et al., 2020). Views of academic resilience specifically frame the construct in terms of academic successes and accomplishments (e.g., Liu and Platow, 2020; Mallick and Kaur, 2016) and/or setbacks and challenges specific to academic contexts (e.g., “poor grades, competing deadlines, exam pressure, difficult schoolwork”; Martin and Marsh, 2003, 2006, 2008; Njoki, 2018). These adversities may range from “everyday” academic stressors to more acute or chronic issues that may threaten students' educational development (see discussions of academic buoyancy vs. academic resilience in Martin, 2013; Martin and Marsh, 2008).

Some views frame resilience as an individual capacity or ability (e.g., Porter et al., 2020; Sarwar et al., 2010; Susanto et al., 2023; Zulfikar et al., 2020), while social/ecological perspectives frame resilience as related to how individuals navigate internal and external resources in the context of environmental adversities (e.g., Gartland et al., 2011; Ungar and Liebenberg, 2011). In these perspectives, resilience can be considered an interaction among risks and protective factors (or assets) both internal and external to an individual (Furlong et al., 2009; Gartland et al., 2011; Sun and Stewart, 2007). Resilience is alternately characterized as equivalent to persistence (i.e., “the capacity of students to persevere”; Liu and Platow, 2020, p. 240), as a facilitator of persistence (i.e., “resilience… and coping… have emerged as important facilitators of adjustment and perseverance”; Burger et al., 2012, p. 371), or as a higher-level construct than persistence (e.g., Martin and Marsh, 2003, 2006).

Taken together, we propose a synthesized definition of (academic) resilience as involving the process of bouncing back or recovering in the face of challenges, adversities, or stressors to achieve successful outcomes (e.g., academic achievement) by using adaptive behaviors or coping strategies over time. Resilience can be considered a dynamic process (vs. an ability or inherent capacity) involving interactions among an individual's internal assets, their perceptions of task challenges, and the qualities of environmental protective factors and stressors or other internal or external adversities they may face. These adversities may be relatively minor (e.g., everyday academic setbacks, internal motivational issues) or more major (e.g., imminent failure or disaffection from schooling). As with persistence, this definition reflects the major recurring themes and ideas found in the resilience definitions summarized in Table 4.

Review of resilience measures

We reviewed measures of resilience described in the 49 papers retained for the final review; several papers included multiple resilience measures (i.e., Cui et al., 2023; Nese et al., 2012; Yang, 2014), and some papers used common measures, so in total we reviewed evidence for 46 distinct measures of resilience. We note that in two papers (Mallick and Kaur, 2016; Victor-Aigboidion et al., 2018), it was unclear based on the authors' descriptions exactly what measure was administered, and it was not possible to determine the origin of the measure or overlap with other instruments; thus, we treated these two papers as reporting distinct measures. As with persistence, we considered instruments that were adapted to a different domain, shortened, or combined with other instruments to be distinct measures.

We summarized details and reliability evidence for each measure by type, including behavioral measures (n = 1; Supplementary Table 4a), self-report measures (n = 41; Supplementary Table 5a), and other-report measures (three teacher-report surveys and one parent-report survey; Supplementary Table 6a). We also summarized validity evidence reported for each measure in Supplementary Tables 4b–6b. Within each table, measures are sorted roughly by age/grade level, with consideration of availability for use as well as available reliability and validity evidence.

Behavioral measures of resilience

Our review included only one paper describing a behavioral approach to measuring resilience, derived from students' direct interactions with digital tasks (Supplementary Table 4a). The PERC tasks (Porter et al., 2020) included a measure of resilience, which was operationalized as the percentage of correct responses on three easy puzzles presented after working on four difficult puzzles (i.e., whether students' performance reflects their ability to “bounce back” after experiencing difficult tasks). Porter et al. (2020) do not report reliability for the PERC tasks or their subscores, likely due to the small number of items included; although validity evidence is reported for the subscore (Supplementary Table 4b), there is limited evidence that the resilience items can be administered separately from the entire task. As mentioned when discussing the persistence subscore, administration of PERC would require researchers to program their own version of the tasks. If researchers wish to use these measures with appropriate justification, reliability and validity evidence should be collected and analyzed to aid in the interpretation of results, especially if any modifications are made to the instrument (e.g., administering difficult and then easy tasks to measure persistence and resilience without the other two subcomponents). Given the limitations of PERC, behavioral measures of resilience may be a promising area for future research and instrument development.
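As operationalized, the resilience subscore reduces to a proportion correct. A minimal sketch with hypothetical outcomes (1 = solved, 0 = not solved; the function name is ours):

```python
def resilience_subscore(easy_results):
    """Proportion of the post-difficulty easy puzzles answered correctly."""
    return sum(easy_results) / len(easy_results)

# Hypothetical outcomes on the three easy puzzles following the difficult set:
print(round(resilience_subscore([1, 1, 0]), 2))  # → 0.67
```

With only three binary observations per student, the granularity of this subscore is coarse (0, 1/3, 2/3, or 1), which underscores why reliability estimation was impractical for the published task.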

Self-report measures of resilience

We identified 41 distinct self-report measures of resilience appropriate for K−12 students (Supplementary Table 5a). Our review indicates that there is a wide variety of self-report resilience measures, with the majority of instruments readily available for use (28 instruments, approximately 68%). Three studies reported on scales for which items are available, but either it is unclear which subset of items was administered (Gartland et al., 2011; Sari et al., 2022) or the items must be requested from the study author (Jew et al., 1999); because the intact instruments are not readily available, we cannot recommend using these scales in subsequent research. Another 12 papers reported on survey instruments that were not readily available, either because the instrument is not available in English (e.g., Wei et al., 2023), because it is only available at a cost (e.g., the Social Emotional Assets and Resiliency Scales [SEARS]; Cohn et al., 2009; Nese et al., 2012), or because only a few example items are reported with no further details on accessing the complete instrument (e.g., Burger et al., 2012; Skinner et al., 2013; Thornton et al., 2006). Two papers did not report clear details on the measures (Mallick and Kaur, 2016; Victor-Aigboidion et al., 2018). We focus on instruments that are readily and freely available (i.e., at no cost) for research use.

The measures reflect a mix of standalone resilience scales and resilience subscales reflecting one facet of larger motivational or social-emotional constructs. Measures spanned a wide range of ages and grade levels, from third grade through twelfth grade and into early adulthood. Seven of the self-report measures were designed (or adapted) for use in domain-specific contexts (e.g., mathematics, programming, English as a foreign language). Reliability evidence was not reported for three traditional self-report measures (Sari et al., 2022; Susanto et al., 2023; Victor-Aigboidion et al., 2018) or for the situational judgment tests administered by Yang (2014). While most of the studies that did report reliability evidence had adequate reliability, four of the resilience measures demonstrated inadequate reliability with the students sampled (i.e., α < 0.70; Fang et al., 2020; elementary school sample from Hanson and Kim, 2007; Rajan et al., 2017; Zulfikar et al., 2020). Notably, the Resilience Youth Development Module (RYDM) did not show adequate reliability with one sample of elementary school students (Hanson and Kim, 2007) but did with another elementary sample (Sun and Stewart, 2007) and with secondary school samples (Furlong et al., 2009; Hanson and Kim, 2007), and Martin and Marsh's Academic Resilience Scale did not show adequate reliability in one study (Rajan et al., 2017) but did in all other studies using this instrument (Anghel, 2015; Cui et al., 2023; Kapikiran, 2012; Martin and Marsh, 2003, 2006; Njoki, 2018). Some form of validity evidence was reported for all self-report measures of resilience (Supplementary Table 5b).

Altogether, there are 25 resilience measures (including adaptations) that are readily available and report adequate reliability evidence. To aid readers, we group these instruments by the age ranges for which validity and reliability evidence has been reported.

Elementary school and up

The SSES (OECD, 2021a,b) includes a short six-item measure (after dropping two poorly performing items) that offers a domain-general measure of the emotional aspects of resilience for 10-year-olds (approximately 5th grade) and 15-year-olds (approximately 10th grade), which could be administered in conjunction with the corresponding persistence scale from the same instrument. While generally the resilience (stress resistance) subscale was reliable (α = 0.80 for the full sample), values in four cities (i.e., Bogota, Helsinki, Houston, and Manizales) were inadequate (α < 0.70; OECD, 2021a), indicating differential reliability across cultures. As with persistence, results of measurement invariance tests indicated scalar invariance by age and gender (allowing direct comparison of scale scores across groups) and metric invariance by cities (allowing comparison of within-group associations across groups, but not direct comparisons of scale means); thus, this measure seems appropriate for use with diverse samples, taking metric invariance into account.

The nine-item Academic Resilience for Programming (ARP; Uslu, 2023), an adaptation of the Academic Resilience in Mathematics (ARM; Ricketts, 2015; Ricketts et al., 2017) measure, was successfully used in the context of computer programming with both 5th and 6th grade students in Turkey and had good reliability (α > 0.80). The 21-item elementary school survey of the RYDM (Hanson and Kim, 2007) was designed as a more comprehensive instrument (measuring multiple facets of internal resilience assets and protective factors), but reliability and validity evidence shows some potential challenges with the subscales designed for administration with elementary-level students, perhaps due to the small number of items tapping each internal resilience asset (Hanson and Kim, 2007). Specifically, Hanson and Kim (2007) found that only a few items showed appropriate factor loadings, and these subscales were not sufficiently reliable at the elementary school level (α's < 0.70). Sun and Stewart's (2007) administration of 34 California Healthy Kids Survey (CHKS) items was highly reliable (α > 0.90) and had good fit based on a confirmatory factor analysis; the factor analysis also included items from an additional scale (13 items from the Peer Support Scale from the Perceptions of Peer Support measure; Ladd et al., 1996) as an additional factor. Researchers may not have the flexibility to administer all 47 of these items, and available data do not address whether administering separate subscales would be appropriate. Thus, the use of this more comprehensive resilience measure at the primary grades is advised with caution, with the caveat that researchers will need to collect their own reliability and validity evidence to investigate whether administrations of smaller subsets of items are advisable.

Middle school and up

Among the eight instruments for middle school and older students, a wide range of resilience constructs are represented and appear to be measurable with sufficient reliability and validity evidence. While the CHKS resilience measure showed some low reliabilities at the elementary level, the 51-item secondary school survey appeared to perform well with samples of 7th-, 9th-, and 11th-grade students (α > 0.70; Furlong et al., 2009; Hanson and Kim, 2007) and would thus be suitable for measuring students' internal resilience assets (i.e., self-efficacy, empathy, problem-solving, and self-awareness) for middle school and older students. The Adolescent Resilience Questionnaire (ARQ; Gartland et al., 2011) is a multidimensional, 88-item survey developed as a comprehensive measure of resilience that taps both individual and environmental factors. Anderson et al. (2020) reported on a reduced 49-item version of the ARQ validated for a diverse sample of middle and high school students in the U.S., which showed good model fit and strong correlations with the original instrument. The Child and Youth Resilience Measure (CYRM-28) was originally designed as a 58-item cross-cultural measure of resilience processes with individual, relational, community, and cultural resources; while all but the relational dimension were sufficiently reliable, factor analysis indicated that these dimensions could not be recovered empirically, and indeed, the factor structure varied across subgroups of minority world girls and boys, majority world girls, majority world boys with high social cohesion, and majority world boys with low social cohesion (Ungar and Liebenberg, 2011). 
The 28-item version (Liebenberg et al., 2012) reflects individual, relational, and contextual factors (retaining items that were theoretically important and/or showed the strongest factor loadings across the four subgroups to ensure cross-cultural relevance) and showed high reliability (α > 0.70; test-retest 0.65–0.91) and good fit with a three-factor EFA for diverse Canadian youth. It is unclear how the reduced version functions with non-Western populations.

The 36-item self-rating Resilience Scale used in the Mission Skills Assessment (Yang, 2014) was highly reliable, and three-factor CFA models (efficacy, emotional stability, and resilience) showed good fit across multiple waves of data from large samples of middle school students. This instrument also showed statistically significant relationships to external academic measures (e.g., GPA, standardized assessment scores) and life outcomes (i.e., life satisfaction). Adaptations of Cassidy's (2015) Academic Resilience Scale, which was designed for higher education populations, also showed adequate reliability with middle school-aged students (ages 13–14) up to high school across multiple countries (Thailand, China, and Iran; Buathong, 2019; Cui et al., 2023; Ramezanpour et al., 2019). Note that Buathong (2019) used a 16-item variation of the scale following exploratory factor analysis, retaining three factors of emotional response, perseverance, and reflecting and adaptive help-seeking. Cui et al. (2023) used 18 items translated from the original scale, plus some item adaptations, for a 20-item scale measuring four factors of perseverance, self-reflection, adaptive help-seeking, and negative affect and emotional response; all factors were sufficiently reliable, and a four-factor CFA showed adequate fit with the revised four-factor structure.

The Academic Buoyancy Scale (ABS; Martin and Marsh, 2008; Martin, 2013) is an efficient four-item measure capturing students' “everyday resilience” and ability to deal with setbacks, challenges, or pressures that arise during everyday academic experiences. This measure showed adequate internal consistency (α > 0.70) and test-retest reliability (r = 0.67) with large samples of Australian middle and high school students. Martin and Marsh (2008) demonstrated that the measure can be effectively contextualized within a particular subject (i.e., mathematics), suggesting that this instrument can be used not only with a wide range of students (ages 11–19) but also potentially in specific subject-matter domains in addition to general academics. Martin (2013) also introduced the Academic Risk and Resilience Scale (ARRS) to capture specific instances of major academic adversity students might have experienced within the past year; individuals who respond “yes” to any of these major adversities are asked to respond to four items that parallel the ABS items, asking about their ability to deal with that specific type of setback. The ARRS showed high reliability (α = 0.90), and factor analyses showed good fit with academic buoyancy and academic resilience factors, suggesting that this measure provides complementary information on middle school and older students' experience of resilience following major adversity, in addition to academic buoyancy (everyday resilience).

The nine-item ARM was sufficiently reliable (α > 0.80; reliability of person separation >0.70) with a diverse sample of 7th and 8th grade students (Ricketts, 2015; Ricketts et al., 2017), and CFA showed a good fit for a three-factor model (academic buoyancy, access to support, future goals; Ricketts, 2015). Thus, this instrument could be used in assessing a multifaceted conception of academic resilience in the mathematics context. In sum, depending on how multidimensional or focused a construct one wishes to capture, whether larger academic setbacks or everyday stressors are emphasized, and whether the focus is on general academics or domain-specific resilience, there are multiple options appropriate for middle school and older students.

High school and up

We identified nine instruments suitable for use with high school and older students; based on our analyses, we recommend seven of them. Lovelace (2022) reported on an adaptation of Kooken et al.'s (2016) 24-item Mathematical Resilience Scale, originally developed in the context of college-level mathematics, for use with low-income high school students in the U.S.; this study indicated that the dimensions of Value, Struggle, and Growth all had good reliability (α > 0.85) and fit statistics in a three-factor CFA. Liu and Platow (2020) adapted the mathematics-focused ARM (Ricketts et al., 2017) into a more general measure of academic resilience for Chinese high school students; this adaptation maintained good reliability (α > 0.85) and showed good fit of items to the scale based on CFA results, indicating that these nine items can also be administered in a domain-general context.

The Adolescent Resilience Scale is a general psychological resilience instrument created by Oshio et al. (2003), consisting of 21 items across three factors of novelty seeking, emotional regulation, and positive future orientation. Anghel (2015) evaluated this measure with a sample of urban Romanian high school students and found that the scale was sufficiently reliable (α = 0.77) and related to students' risk status, suggesting that this measure might be used when a more general psychological notion of resilience is desired, rather than a more focused measure of academic resilience. The Behavioral, Emotional, and Social Skills Inventory (BESSI; Soto et al., 2022) is another example of a larger social-emotional skills inventory designed for high school and older students that includes an emotional resilience facet among other non-cognitive characteristics. Sewell et al. (2024) explored the development of shorter versions of the original 192-item BESSI and offered evidence in support of 96-, 45-, and 20-item versions with different reporting structures and testing times. Each of these shorter versions retained an emotional resilience skill domain. With a high school sample, each of the reduced-form instruments showed strong reliability (α > 0.70 across all forms) and validity evidence (i.e., good model fit and factor loadings) for emotional resilience and is recommended when measuring a constellation of social, emotional, and behavioral characteristics is desirable.
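Short forms such as the 96-, 45-, and 20-item BESSI versions trade items for testing time, and a useful back-of-the-envelope expectation for the reliability cost is the Spearman-Brown prophecy formula. This is a general psychometric heuristic, not a procedure reported in the studies above; the numbers below are illustrative.

```python
def spearman_brown(reliability, length_ratio):
    """Predicted reliability after scaling test length by `length_ratio`
    (e.g., 96/192 = 0.5 when halving), per the Spearman-Brown prophecy formula."""
    return (length_ratio * reliability) / (1 + (length_ratio - 1) * reliability)

# If a hypothetical 192-item form had reliability 0.95, halving it to 96 items
# would be expected to retain reliability near 0.90.
print(round(spearman_brown(0.95, 96 / 192), 3))  # prints 0.905
```

Such predictions only indicate whether a strong short-form α is plausible; the empirical reliabilities reported for each reduced form remain the authoritative evidence.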

The Academic Resilience Scale (Martin and Marsh, 2003, 2006) is a six-item scale that yields a unidimensional overall academic resilience score; this instrument was shown to be highly reliable (α = 0.89) with a sample of over 400 Australian high school students, and correlations between this measure and other motivation and engagement constructs (e.g., self-beliefs, persistence, failure avoidance) were in the expected directions. This measure has been successfully adapted for several other samples, including urban Romanian (Anghel, 2015), Indian (Rajan et al., 2017), Chinese (Cui et al., 2023), and Turkish (Kapikiran, 2012) high school students; Njoki (2018) adapted this instrument for a Kenyan secondary school population with a focus on mathematics. All of these administrations except Rajan et al. (2017) showed adequate reliability (α > 0.70). When CFA analyses were conducted, the instrument showed adequate fit to a unidimensional model (Cui et al., 2023; Kapikiran, 2012; Martin and Marsh, 2006), suggesting that this measure may be successfully adapted to many contexts while, for the most part, maintaining its technical properties.

Resilience was captured along with persistence in the six-item vigor subscale of Schaufeli et al.'s (2002a,b) Engagement Scale (Phan, 2016), designed to capture motivation-related attributes of engagement. The vigor subscale was reliable across time points (α = 0.86–0.87) and showed good CFA model fit and significant correlations with the other subscales of absorption (six items) and dedication (five items), based on analyses of two samples of Fijian high school students. The eight-item resilience subscale from the Design My Future (DMF) measure (Di Maggio et al., 2016) had good reliability (α = 0.80) and good fit via EFA and CFA with two samples of Italian high school students. This measure also showed significant correlations with many other student-level variables and therefore may be useful when relating resilience to other individual differences.

An adaptation of the 33-item Resilience Scale that Wagnild and Collins (2009) developed for adult populations was used with secondary school students in Pakistan, and all dimensions and the overall scale were sufficiently reliable (α's > 0.70; Sarwar et al., 2010). However, because the rating scale was not reported, it is unclear what response scale ought to be used; we would therefore advocate using alternative instruments with more complete information available.

Other measures of resilience

Our review included four studies that described approaches to measuring resilience that did not fall into the behavioral or self-report categories. Specifically, we identified three studies that included teacher-report surveys of student behavior and one study that included a parent-report survey with items corresponding to teacher reports (Supplementary Table 6a). Each of the studies reported reliability and validity evidence (Supplementary Table 6b).

Two of the teacher-report measures are readily available, and evidence suggests that each instrument has high reliability. The Approaches to Learning subscale of the Social Skills Rating Survey (Gresham and Elliot, 1990; Hill, 2017), administered in the Early Childhood Longitudinal Study (ECLS-K), captures teacher ratings of eight social skills reflecting students' approaches to learning and is appropriate for diverse samples of early elementary students (grade 3). For middle school students, the four-item resilience scale from the Mission Skills Assessment (Yang, 2014) may be an appropriate option that is efficient for teachers to administer. Both measures have been evaluated with large samples of students, and validity evidence shows that academic resilience, as reflected in teacher ratings, has significant relationships with key outcomes such as reading achievement and GPA. The remaining measures are the teacher- and parent-report surveys of the SEARS (Nese et al., 2012), which are appropriate for a wide range of grade levels (K−12) but are not freely available and therefore not recommended.

Discussion

This systematic review resulted in compilations of operational definitions and theoretical frameworks used to conceptualize the constructs of persistence and resilience, as well as measures of these constructs that can be leveraged in future research on persistence, resilience, and related constructs with K−12 students. Notably, evidence suggests that measures designed for specific disciplinary contexts may be amenable to adaptation to other domains or to domain-general use (e.g., the ARM was successfully adapted from mathematics to the programming context, as well as to a domain-general version). Similarly, some measures may be adapted for use with sample compositions different from those reported in the specific research studies we reviewed. For example, while we focused on readily available English-language instruments, several measures reviewed were successfully developed, translated, or otherwise adapted for use in other language and cultural contexts, and similar adaptation processes could be applied in further studies.2 Such modifications will require future efforts to collect and evaluate evidence of reliability, validity, and fairness in diverse samples of K−12 students.

Alignment between construct definitions and available measures

A key consideration for selection of measures is the degree to which a given measure aligns with the researcher's conceptualization of the target construct. Accordingly, we evaluated each identified measure based on the degree to which it aligned with our synthesized construct definitions to facilitate measure selection for future research. This evaluation of alignment involved determining whether each component of the construct definition was addressed by the measure. We limited our alignment evaluation to those measures that we recommended based on our analysis of reliability, validity, and fairness evidence and did not separately consider adaptations of those measures for different contexts.

Our task persistence definition was decomposed into four components: presence of a goal-directed task, presence of challenges or difficulties, sustained effort, and task completion (see Table 5 for alignment between persistence definition components and related theoretical frameworks as previously discussed in the Literature Review). The in-game behavioral indicators (DiCerbo, 2014) directly addressed sustained effort and task completion, and the remaining two components (a goal-directed, challenging task) are implied in the task design. The recommended self-report measures varied in the degree to which they aligned with our persistence definition. APT-STEM (Sunny, 2018) and the persistence subscale of the ASES (Dullas, 2018) were the only two measures that addressed all four definition components. The presence of challenges and sustained effort were addressed in all self-report measures, whereas task completion was addressed in three measures: APT-STEM, ASES, and the persistence subscale from SSES (OECD, 2021a,b). The two other-report measures from the LBS (Discipline/Persistence, Rikoon et al., 2012; Attention/Persistence, Canivez et al., 2006) only addressed sustained effort, while researcher observations of perseverance from the 3PP framework (DiNapoli and Miller, 2022) addressed all four persistence components.


Table 5. Proposed relationships between synthesized definition components and relevant theories/frameworks.

Our academic resilience definition was also decomposed into four components: presence of challenge, adversity, or stressor; use of adaptive behaviors or coping strategies; evidence of bounce back or recovery; and achievement of successful outcomes (see Table 5 for the alignment between academic resilience definition components and related theoretical frameworks as previously discussed in the Literature Review). As with persistence, the 17 self-report measures varied in their alignment with our resilience definition. Five measures covered all four definition components: ARQ (Anderson et al., 2020), Adolescent Resilience Scale (Anghel, 2015), DMF (Di Maggio et al., 2016), ARM (Ricketts et al., 2017), and Mission Skills Assessment (Yang, 2014). Five additional measures covered three definition components (excluding successful outcomes): RYDM (Furlong et al., 2009); Academic Resilience Scale (Martin and Marsh, 2006); Resilience Scale (Sarwar et al., 2010); BESSI (Sewell et al., 2024); and Resilience Scale (Sun and Stewart, 2007). The presence of challenges was the definition component addressed most frequently across self-report surveys (15 measures), followed by evidence of recovery and use of adaptive behaviors (13 measures each), while successful outcomes was the least addressed component (six measures). The two teacher-report measures from the Social Skills Rating Scale (Gresham and Elliot, 1990; Hill, 2017) and Mission Skills Assessment (Yang, 2014) both addressed evidence of recovery, whereas the presence of challenges and use of adaptive behaviors were only addressed in the Mission Skills Assessment, and successful outcomes was only addressed by the Social Skills Rating Scale.

We recommend selecting a measure that fully aligns with our definitions to provide a measure of persistence or resilience that is robust in terms of construct conceptualization and evidence of reliability, validity, and fairness. Although we believe that this alignment evaluation is informative for the measurement selection process, we do acknowledge that a more in-depth analysis of the alignment is likely warranted when selecting measures for individual studies (e.g., number of items addressing each definition component). The preceding review should serve as a guide to researchers, who ultimately will make their own best judgments about what measures to administer, creating and evaluating novel measures to capture additional construct components as needed for the aims and theoretical orientations of the research.
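Component tallies like those reported above can be derived mechanically once each measure is coded against the four definition components. The sketch below illustrates that bookkeeping with hypothetical measure names and codings; it is not the actual coding data from this review.

```python
COMPONENTS = ("challenge", "adaptive_strategies", "recovery", "successful_outcomes")

# Hypothetical codings: True means the measure addresses that component.
alignment = {
    "Measure A": {"challenge": True,  "adaptive_strategies": True,
                  "recovery": True,   "successful_outcomes": True},
    "Measure B": {"challenge": True,  "adaptive_strategies": True,
                  "recovery": True,   "successful_outcomes": False},
    "Measure C": {"challenge": True,  "adaptive_strategies": False,
                  "recovery": False,  "successful_outcomes": False},
}

def component_coverage(codings):
    """Count how many measures address each definition component."""
    return {c: sum(m[c] for m in codings.values()) for c in COMPONENTS}

def fully_aligned(codings):
    """List the measures that address all four components."""
    return sorted(name for name, m in codings.items()
                  if all(m[c] for c in COMPONENTS))

print(component_coverage(alignment))  # component -> number of covering measures
print(fully_aligned(alignment))       # prints ['Measure A']
```

A richer analysis would also record how many items address each component, as noted above, rather than a binary coverage flag per measure.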

Individual differences in persistence and resilience

Subgroup analyses examining how measures functioned across student characteristics (e.g., age, gender, and cultural context) were scarce among the studies reviewed, making it difficult to draw strong conclusions about these relationships. Students' age (grade level) showed inconsistent relationships with persistence and resilience. While DiCerbo (2014) found increasing persistence from grade 1 to grade 9 with her in-game behavioral measure, self-report instruments indicate that younger students report higher persistence and resilience compared to high school students (OECD, 2021a), and middle-school students report higher resilience than elementary and high school students (Cohn et al., 2009). Other studies found minimal relationships between age and resilience (Furlong et al., 2009). Notably, some studies used the same instrument successfully across grade levels (e.g., SSES; OECD, 2021a), while others used different forms for different education levels, with varying results (e.g., RYDM; Hanson and Kim, 2007; Furlong et al., 2009). Further examination of how persistence and resilience vary with age—and how these constructs should be operationalized in scales designed for students of different age groups—is warranted.

Subgroup analyses by gender also indicated mixed effects for both constructs. For persistence, effects favored females over males (Dullas, 2018; OECD, 2021a) except among the SSES sample of 15-year-olds, where boys reported higher persistence (OECD, 2021a), but few studies examined gender effects. Gender comparisons were more prevalent for resilience, with some studies favoring females (Cohn et al., 2009, elementary and adolescent cohorts; Furlong et al., 2009; Hanson and Kim, 2007; Liebenberg et al., 2012; Rajan et al., 2017; Ricketts et al., 2017), some favoring males (Di Maggio et al., 2016; OECD, 2021a; Sarwar et al., 2010), and some showing no or minimal relationships (Kapikiran, 2012; Liu and Platow, 2020; Lovelace, 2022) or mixed effects (e.g., female students reported higher levels than males on only two subscales; Anderson et al., 2020). Additionally, Cui et al. (2023) reported factorial invariance of resilience measures by gender, while gender accounted for 2.3% of RYDM score variance (Furlong et al., 2009). Future research is necessary to clarify the extent to which persistence and academic resilience vary by gender.

Few studies specifically examined differences in persistence and resilience across cultural contexts. As described above, the SSES showed metric invariance across cities for both constructs, and variation in levels of persistence and resilience and their relationships to other constructs was reported (OECD, 2021a,b), while the CYRM showed different factor structures across majority and minority world samples (Ungar and Liebenberg, 2011). Although we limited our search to English-language publications, our review is global in scope, with 18 countries represented across the remaining studies. However, subgroup analyses by race/ethnicity within each study were also limited. While a few persistence studies included racially diverse samples (i.e., majority non-white), no subgroup comparisons were reported (Canivez et al., 2006; Sunny, 2018). Several studies of resilience included racially and ethnically diverse samples but most did not report subgroup comparisons (Anderson et al., 2020; Ricketts, 2015; Ricketts et al., 2017; Sewell et al., 2024; Soto et al., 2022). Among studies reporting effects by race/ethnicity, results were mixed, with some reporting higher resilience scores for minority youth (Liebenberg et al., 2012), some reporting higher resilience among white students (Furlong et al., 2009), some reporting significant differences in all subscales (Hanson and Kim, 2007), and others reporting subgroup differences on some subscales but not others (Lovelace, 2022). Hill (2017) also reported that white students had stronger relationships between academic resilience and other social-emotional behaviors compared to Black and Hispanic students, while the relationship between academic resilience and reading achievement was stronger for Hispanic than white students, based on teacher reports of resilience.

In sum, studies in this review did not consistently report evidence of how well measures and instruments functioned across different subgroups of students, and the kinds and strength of evidence varied. These findings underscore the importance of establishing reliable and valid measures of persistence and academic resilience so that researchers can build a body of evidence on how these instruments function across diverse samples. Such evidence is especially important to obtain and report as the K−12 population becomes more heterogeneous.

Relationships among persistence, resilience, and other social-emotional constructs

Considering the relationship between persistence and resilience, it is notable that Skinner et al. (2013) emphasize that “ongoing engagement” (i.e., persistent engagement) allows students to adaptively cope and to re-engage with challenging tasks in the face of setbacks, suggesting that over time, persistence may foster resilience (Skinner and Pitzer, 2012). Indeed, our review of validity evidence indicated that when measures of both constructs were included in the same study, persistence was both a correlate and a predictor of academic resilience (Martin and Marsh, 2003, 2006; OECD, 2021a; Porter et al., 2020) and related constructs (i.e., academic buoyancy; Martin, 2009). Persistence was also effectively measured as a subcomponent of academic resilience (Ramezanpour et al., 2019), and was found to predict later persistence/resilience (e.g., “vigor” as measured in Phan, 2016). Thus, it may be helpful to conceive of persistence as a shorter-term construct (i.e., persisting to task completion) and resilience as more of a longer-term process (or outcome) that emerges from persisting through and recovering from various challenges and setbacks experienced over time. Many educational learning tools and interventions may have greater impact on nearer-term “everyday” stressors than on more acute or chronic academic issues; however, this remains an open question. Future work should include distinct measures of both constructs in the same study to further elucidate empirical relationships between persistence and academic resilience.

Notably, this review highlights the importance of examining other social-emotional constructs in relation to persistence and resilience. As noted in the introduction to this paper, we observed great diversity in the theoretical perspectives on persistence and academic resilience which articulated relationships among these constructs within a broader ecology of social-emotional learning (SEL) competencies (e.g., Skinner et al., 2022). Attempts to measure or support development of persistence or academic resilience constructs in isolation may not be as effective as those that consider how such constructs interact within a larger ecosystem of students' personal characteristics, beliefs, and behaviors (Farrington et al., 2012).

The validity evidence reviewed for each measure indicated empirical interrelations among these constructs and other crucial SEL constructs. For example, persistence was predicted by self-efficacy (Bae, 2014; Phan, 2016), perceived ability (Miller et al., 1996), mastery goals (Bae, 2014) or learning goals (Miller et al., 1996), and was correlated with constructs like academic self-concept (Yeung et al., 2009), self-regulation (Green et al., 2007; Martin, 2007, 2009; Miller et al., 1996), challenge-seeking (Porter et al., 2020), academic aspirations (Green et al., 2007; Martin, 2007; OECD, 2021a), valuing school or subject matter (Green et al., 2007; Martin, 2007, 2009), among others.

Similarly, academic resilience was predicted by perceived academic competence (Liu and Platow, 2020), learning from failure (Wei et al., 2023), and psychological security (Wei et al., 2023), and was correlated with coping strategies (Skinner et al., 2013), creativity and curiosity (OECD, 2021a; Di Maggio et al., 2016), intrinsic motivation (Kapikiran, 2012; Porter et al., 2020), challenge seeking (Porter et al., 2020), sense of belonging in school (Furlong et al., 2009; Hanson and Kim, 2007; OECD, 2021a; Uslu, 2023), and valuing school or subject matter (Burger et al., 2012; Martin and Marsh, 2003), among others. Academic resilience showed evidence of bidirectional predictive relationships with self-efficacy (Burger et al., 2012; Cui et al., 2023; Di Maggio et al., 2016; Martin and Marsh, 2003, 2006; Rajan et al., 2017; Phan, 2016), self-regulation (Martin and Marsh, 2003, 2006; Ramezanpour et al., 2019; Ricketts, 2015; Sari et al., 2022), engagement and participation (Cui et al., 2023; Lovelace, 2022; Martin, 2013; Martin and Marsh, 2006, 2008; Soto et al., 2022; Uslu, 2023), and anxiety (Anderson et al., 2020; Martin, 2013; Martin and Marsh, 2003, 2006, 2008; OECD, 2021a). Academic resilience predicted life satisfaction and well-being (Soto et al., 2022; Yang, 2014), academic enjoyment (Martin and Marsh, 2006), career interests (Soto et al., 2022), and volunteerism (Soto et al., 2022).

While these are not exhaustive lists, the review of validity evidence suggests that persistence and resilience are closely interconnected and may interact with many SEL constructs. Future research would do well to consider multiple relevant constructs and use complementary measures to further elucidate critical relationships among academic and motivational constructs that may influence attempts to promote persistence and academic resilience of K−12 students.

Implications for future research and practice

The results of this review can have utility for the wider fields of educational psychology and social-emotional learning. Researchers may use these results to justify their selection of measures for their own purposes, or to justify the development of new surveys or behavioral measures that align with our proposed construct definitions and are appropriate for particular use cases within K−12 educational research and practice. In addition to leveraging domain-general measures, researchers may opt to include one or more persistence or academic resilience measures framed within domain-specific contexts to obtain further evidence of how persistence and academic resilience relate to disciplinary learning and to further evaluate the construct validity of the selected instruments using novel samples or adaptations to novel domain contexts.

Our review indicated that there are only a few examples of behavioral measures of persistence (and none for resilience) that are readily available, valid, and reliable. The limited availability of behavioral measures may be due in part to methodological challenges (e.g., reliance on traditional survey methodologies and single time-point administrations, versus interactive performance tasks with repeated or longitudinal observations, which may be necessary to observe longer-term resilience processes as we have defined them here). We would therefore especially advocate for further exploration and innovation in behavioral measures of resilience that can be validated against one or more of the many established self-report instruments available for use in educational research with K−12 populations. Such research can inform the selection of both behavioral and self-report instruments and, in turn, the dynamic modeling of resilience and related constructs as they develop over time.

In terms of self-report measures, we recommend reliance on instruments that have been validated with large, diverse samples of students with the same target age range that researchers intend to study. Researchers may opt to use focused and specific instruments, or broad, multifaceted psychological instruments to measure persistence and academic resilience in conjunction with other social, emotional, and behavioral constructs of interest, sometimes within the same larger survey instrument. Depending on researchers' interests, they may opt to use subscales from larger instruments independently or in conjunction with other relevant subscales. We caution researchers who administer subscales separate from their larger context to collect and report appropriate validity evidence for the subscale, due to differences in the psychometric properties that may be observed when extracting subscales from their larger contexts.

Based on the results of this review, we also recommend administering multiple measures in differing formats (e.g., multiple self-report instruments coupled with novel or established behavioral indicators) to aid in understanding how the measures perform and to further contribute to the available evidence regarding the construct validity of the instruments used. Specifically, because few studies combined multiple measures of the same construct collected via different methods, we advocate for including behavioral measures alongside self-report measures in the same study. Collecting both types of data will enable researchers to generate additional validity evidence to further support the use of the behavioral and self-report measures selected for administration and, importantly, help to validate existing and novel behavioral indicators.

Finally, we would also advocate for researchers to thoroughly document and report multiple types of reliability and validity evidence as appropriate for their research context(s). This includes dimensionality analyses (i.e., exploratory or confirmatory factor analyses, depending on whether there is theory guiding the instrument design), examinations of convergent and divergent validity evidence (i.e., relationships with related and unrelated SEL constructs), and subgroup or multigroup analyses that aid researchers in determining whether the instruments demonstrate comparable reliability and validity when used with diverse samples. Robust investigations of reliability and validity will help to ensure that selected measures meet standards for technical quality, thus enabling researchers, practitioners, and policymakers to make more valid and appropriate inferences about students' levels of persistence and academic resilience. Because relatively few measures were developed and administered in large-scale contexts (i.e., samples over 1,000 students; Anderson et al., 2020; Di Maggio et al., 2016; Hanson and Kim, 2007; Hill, 2017; Fang et al., 2020; Furlong et al., 2009; Lovelace, 2022; Nese et al., 2012; OECD, 2021a,b; Tan et al., 2014; Ungar and Liebenberg, 2011; Yang, 2014), we recommend that the instruments reported here be used primarily for low-stakes, formative use cases that inform research and instruction unless appropriate validity evidence for high-stakes summative uses is obtained. Educators and policymakers wishing to implement these measures in their contexts are advised to partner with research teams and assessment specialists to ensure valid and responsible use and interpretation of resulting assessment data.

Data availability statement

The original contributions presented in the study are included in the article/Supplementary material; further inquiries can be directed to the corresponding author.

Author contributions

JS: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Supervision, Validation, Writing – original draft, Writing – review & editing. BL: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Validation, Writing – original draft, Writing – review & editing. JG: Conceptualization, Investigation, Methodology, Writing – original draft, Writing – review & editing. SZ: Conceptualization, Investigation, Writing – original draft, Writing – review & editing. NS: Conceptualization, Data curation, Investigation, Methodology, Writing – original draft, Writing – review & editing. MI: Conceptualization, Investigation, Methodology, Writing – review & editing.

Funding

The author(s) declare that financial support was received for the research and/or publication of this article. This material is based upon work supported by the National Science Foundation and the Institute of Education Sciences under Grant DRL-2229612. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or the U.S. Department of Education.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declare that no Gen AI was used in the creation of this manuscript.

Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2025.1673500/full#supplementary-material

Footnotes

1. We categorically excluded studies where grit was the only or primary measure of persistence due to conceptual and methodological challenges associated with this construct noted in prior literature (see Credé et al., 2017; Muenks et al., 2017). Note that two studies focusing on resilience (Buathong, 2019; Cui et al., 2023) also administered a grit scale to examine relationships among resilience and related constructs (i.e., convergent validity). These studies were retained under the resilience review but excluded from the persistence review.

2. We acknowledge the focus on publications and instruments available in English as a limitation of the current review, and invite scholars working in other language contexts to extend this review to publications and instruments available in other languages.

*Denotes papers included in the systematic review.

References

Abrahams, L., Pancorbo, G., Primi, R., Santos, D., Kyllonen, P., John, O. P., et al. (2019). Social-emotional skill assessment in children and adolescents: advances and challenges in personality, clinical, and educational contexts. Psychol. Assess. 31, 460–473. doi: 10.1037/pas0000591

AERA, NCME, and APA (2014). Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association.

Akkermans, J., Brenninkmeijer, V., van den Bossche, S. N., Blonk, R. W., and Schaufeli, W. B. (2013). Young and going strong?: a longitudinal study on occupational health among young employees of different educational levels. Career Dev. Int. 18, 416–435. doi: 10.1108/CDI-02-2013-0024

*Anderson, J. R., Killian, M., Hughes, J. L., Rush, A. J., and Trivedi, M. H. (2020). The adolescent resilience questionnaire: validation of a shortened version in U.S. Youths. Front. Psychol. 11:606373. doi: 10.3389/fpsyg.2020.606373

Anderson, P. (2002). Assessment and development of executive function (EF) during childhood. Child Neuropsychol. 8, 71–82. doi: 10.1076/chin.8.2.71.8724

Andersson, H., and Bergman, L. R. (2011). The role of task persistence in young adolescence for successful educational and occupational attainment in middle adulthood. Dev. Psychol. 47, 950–960. doi: 10.1037/a0023786

*Anghel, R. A. (2015). Psychological and educational resilience in high vs. low-risk Romanian adolescents. Procedia Soc. Behav. Sci. 203, 153–157. doi: 10.1016/j.sbspro.2015.08.274

Antonovsky, A. (1987). Unraveling the Mystery of Health: How People Manage Stress and Stay Well. San Francisco, CA: Jossey-Bass.

Antonovsky, A. (1996). The salutogenic model as a theory to guide health promotion. Health Promot. Int. 11, 11–18. doi: 10.1093/heapro/11.1.11

ASReview LAB Developers (2023). ASReview LAB Software Documentation (v1.3). Geneva: Zenodo.

Atkinson, J. W. (1957). Motivational determinants of risk-taking behavior. Psychol. Rev. 64(6, Pt.1), 359–372. doi: 10.1037/h0043445

*Bae, Y. (2014). The relationships among motivation, self-regulated learning, and academic achievement (153862) (Doctoral dissertation). Texas A&M University, College Station, TX, USA. Available online at: https://hdl.handle.net/1969.1/153862 (Accessed December 13, 2023).

Baker, R. S., Cloude, E., Andres, J. M. A. L., and Wei, Z. (2025). The confrustion constellation: a new way of looking at confusion and frustration. Cogn. Sci. 49:e70035. doi: 10.1111/cogs.70035

Bakker, A. B., and Demerouti, E. (2008). Towards a model of work engagement. Career Dev. Int. 13, 209–223. doi: 10.1108/13620430810870476

Bandura, A. (1978). The self system in reciprocal determinism. Am. Psychol. 33, 344–358. doi: 10.1037/0003-066X.33.4.344

Bandura, A. (1986). Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ: Prentice-Hall.

Bandura, A. (1994). “Self-efficacy,” in Encyclopedia of Human Behavior, Vol. 4, ed. V. S. Ramachandran (New York, NY: Academic Press), 71–81.

Bandura, A. (1997). Self-Efficacy: The Exercise of Control. New York, NY: Freeman.

Bandura, A. (2006). Toward a psychology of human agency. Perspect. Psychol. Sci. 1, 164–180. doi: 10.1111/j.1745-6916.2006.00011.x

Bashant, J. (2014). Developing grit in our students: why grit is such a desirable trait, and practical strategies for teachers and schools. J. Leadersh. Instr. 13, 14–17. Available online at: http://files.eric.ed.gov/fulltext/EJ1081394.pdf

Beaver, B. R. (2008). A positive approach to children's internalizing problems. Profess. Psychol. Res. Pract. 39, 129–136. doi: 10.1037/0735-7028.39.2.129

Benard, B. (2004). Resiliency: What We Have Learned. San Francisco, CA: WestEd.

Benard, B., and Slade, S. (2009). “Listening to students: moving from resilience research to youth development practice and school connectedness,” in Handbook of Positive Psychology in the Schools, eds. R. Gilman, E. S. Huebner, and M. J. Furlong (New York, NY: Routledge), 353–370.

Bender, W. N., and Wall, M. E. (1994). Social-emotional development of students with learning disabilities. Learn. Disabil. Q. 17, 323–341. doi: 10.2307/1511128

*Boe, E. E., May, H., and Boruch, R. (2002). Student Task Persistence in the Third International Mathematics and Science Study: A Major Source of Achievement Differences at the National, Classroom, and Student Levels. CRESP Research Report. Available online at: http://repository.upenn.edu/gse_pubs/412 (Accessed February 3, 2024).

Bonanno, G. A. (2004). Loss, trauma, and human resilience: have we underestimated the human capacity to thrive after extremely aversive events? Am. Psychol. 59, 20–28. doi: 10.1037/0003-066x.59.1.20

Bronfenbrenner, U. (1979). The Ecology of Human Development: Experiments by Nature and Design. Cambridge: Harvard University Press.

Bronfenbrenner, U., and Morris, P. A. (2006). “The bioecological model of human development,” in Handbook of Child Psychology, eds. W. Damon, R. M. Lerner, and R. M. Lerner (New York, NY: Wiley). doi: 10.1002/9780470147658.chpsy0114

*Buathong, P. (2019). The effect of a growth mindset intervention on underprivileged students' English intelligence mindset and academic resilience with perceived teacher support as a moderator (Unpublished master's thesis). Bangkok: Chulalongkorn University.

*Burger, J. M., Nadirova, A., and Keefer, K. V. (2012). Moving beyond achievement data: development of the Student Orientation to School Questionnaire as a non-cognitive assessment tool. J. Psychoeduc. Assess. 30, 367–386. doi: 10.1177/0734282912449444

Campos, D. G., Fütterer, T., Gfrörer, T., Lavelle-Hill, R., Murayama, K., König, L., et al. (2024). Screening smarter, not harder: a comparative analysis of machine learning screening algorithms and heuristic stopping criteria for systematic reviews in educational research. Educ. Psychol. Rev. 36:19. doi: 10.1007/s10648-024-09862-5

*Canivez, G. L., Willenborg, E., and Kearney, A. (2006). Replication of the Learning Behaviors Scale factor structure with an independent sample. J. Psychoeduc. Assess. 24, 97–111. doi: 10.1177/0734282905285239

Caporale-Berkowitz, N. A., Boyer, B. P., Muenks, K., and Brownson, C. B. (2022). Resilience, not grit, predicts college student retention following academic probation. J. Educ. Psychol. 114, 1654–1669. doi: 10.1037/edu0000721

CASEL (2020). Core SEL Competencies. Available online at: https://casel.org/core-competencies/ (Accessed October 20, 2025).

Cassidy, S. (2015). Resilience building in students: the role of academic self-efficacy. Front. Psychol. 6:1781. doi: 10.3389/fpsyg.2015.01781

Cassidy, S. (2016). The Academic Resilience Scale (ARS-30): a new multidimensional construct measure. Front. Psychol. 7:1787. doi: 10.3389/fpsyg.2016.01787

*Chichekian, T., and Vallerand, R. J. (2022). Passion for science and the pursuit of scientific studies: the mediating role of rigid and flexible persistence and activity involvement. Learn. Individ. Differ. 93:102104. doi: 10.1016/j.lindif.2021.102104

Cicchetti, D. (2003). “Forward,” in Resilience and Vulnerability: Adaption in the Context of Childhood Adversities, ed. S. S. Luthar (Cambridge: Cambridge University Press), xix–xxvii.

Cicchetti, D., and Lynch, M. (1993). Toward an ecological/transactional model of community violence and child maltreatment: consequences for children's development. Psychiatry 56, 96–118. doi: 10.1080/00332747.1993.11024624

Cloninger, C. R., Przybeck, T., Svrakic, D. M., and Wetzel, R. (1994). Manual of the Temperament and Character Inventory (TCI): A Guide to Its Development and Use. St. Louis, MO: Center for Psychobiology of Personality; Washington University.

Cloninger, C. R., Svrakic, D. M., and Przybeck, T. R. (1993). A psychobiological model of temperament and character. Archiv. Gen. Psychiatry 50, 975–990. doi: 10.1001/archpsyc.1993.01820240059008

*Cohn, B., Merrell, K. W., Felver-Gant, J., Tom, K., and Endrulat, N. R. (2009). Strength-based assessment of social and emotional functioning: SEARS-C and SEARS-A. Paper presented at the meeting of the National Association of School Psychologists, Boston, MA. Available online at: https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=13f123580109e4f189b4d240cc12e5d4ac70939a (Accessed July 12, 2024).

Collaborative for Academic, Social, and Emotional Learning (CASEL) (2012). 2013 CASEL Guide: Effective Social and Emotional Learning Programs: Preschool and Elementary School Edition. Available online at: http://files.eric.ed.gov/fulltext/ED581699.pdf

Connell, J. P., and Wellborn, J. G. (1991). “Competence, autonomy, and relatedness: a motivational analysis of self-system processes,” in Minnesota Symposium on Child Psychology: Vol. 23. Self Processes in Development, eds. M. R. Gunnar and L. A. Sroufe (Chicago, IL: University of Chicago Press), 43-77.

Constantin, T., Holman, A., and Hojbotǎ, A. M. (2011). Development and validation of a motivational persistence scale. Psihologija 45, 99–120. doi: 10.2298/PSI1202099C

Cook, C. R., Lyon, A. R., Locke, J., Waltz, T., and Powell, B. J. (2019). Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prev. Sci. 20, 914–935. doi: 10.1007/s11121-019-01017-1

Credé, M., and Kuncel, N. R. (2008). Study habits, skills, and attitudes: the third pillar supporting collegiate academic performance. Perspect. Psychol. Sci. 3, 425–453. doi: 10.1111/j.1745-6924.2008.00089.x

Credé, M., Tynan, M. C., and Harms, P. D. (2017). Much ado about grit: a meta-analytic synthesis of the grit literature. J. Pers. Soc. Psychol. 113, 492–511. doi: 10.1037/pspp0000102

*Cui, T., Wang, C., and Xu, J. (2023). Validation of academic resilience scales adapted in a collective culture. Front. Psychol. 14:1114285. doi: 10.3389/fpsyg.2023.1114285

Danner, F. W., and Lonky, E. (1981). A cognitive-developmental approach to the effects of rewards on intrinsic motivation. Child Dev. 52, 1043–1052. doi: 10.2307/1129110

Deci, E. L., and Ryan, R. M. (1985). Intrinsic Motivation and Self-Determination in Human Behavior. New York, NY: Plenum.

Deci, E. L., and Ryan, R. M. (2000). The “what” and “why” of goal pursuits: human needs and the self-determination of behavior. Psychol. Inq. 11, 227–268. doi: 10.1207/S15327965PLI1104_01

Deci, E. L., and Ryan, R. M. (2017). Self-Determination Theory: Basic Psychological Needs in Motivation, Development, and Wellness. New York, NY: Guilford Press.

Deci, E. L., and Ryan, R. M. (2002). The Handbook of Self-Determination Research. Rochester, NY: University of Rochester Press.

*Di Maggio, I., Ginevra, M., Nota, L., and Soresi, S. (2016). Development and validation of an instrument to assess future orientation and resilience in adolescence. J. Adolesc. 51, 114–122. doi: 10.1016/j.adolescence.2016.06.005

*DiCerbo, K. E. (2014). Game-based assessment of persistence. J. Educ. Technol. Soc. 17, 17–28. Available online at: https://psycnet.apa.org/record/2014-06607-003

DiCerbo, K. E. (2016). “Assessment of task persistence,” in Handbook of Research on Technology Tools for Real-World Skill Development, eds. Y. Rosen, S. Ferrara, and M. Mosharraf (Hershey, PA: IGI Global), 778–804. doi: 10.4018/978-1-4666-9441-5.ch030

Diener, C. I., and Dweck, C. S. (1978). An analysis of learned helplessness: continuous changes in performance, strategy, and achievement cognitions following failure. J. Pers. Soc. Psychol. 36, 451–462. doi: 10.1037/0022-3514.36.5.451

Diener, C. I., and Dweck, C. S. (1980). An analysis of learned helplessness: II. The processing of success. J. Pers. Soc. Psychol. 39, 940–952. doi: 10.1037/0022-3514.39.5.940

Digman, J. M. (1990). Personality structure: emergence of the five-factor model. Annu. Rev. Psychol. 41, 417–440. doi: 10.1146/annurev.ps.41.020190.002221

DiNapoli, J. (2023). Distinguishing between grit, persistence, and perseverance for learning mathematics with understanding. Educ. Sci. 13:402. doi: 10.3390/educsci13040402

*DiNapoli, J., and Miller, E. K. (2022). Recognizing, supporting, and improving student perseverance in mathematical problem-solving: the role of conceptual thinking scaffolds. J. Math. Behav. 66:100965. doi: 10.1016/j.jmathb.2022.100965

*Donnon, T., and Hammond, W. (2007). A psychometric assessment of the self reported youth resiliency assessing developmental strengths questionnaire. Psychol. Rep. 100, 963–978. doi: 10.2466/pr0.100.3.963-978

Duckworth, A. L., Peterson, C., Matthews, M. D., and Kelly, D. R. (2007). Grit: perseverance and passion for long-term goals. J. Pers. Soc. Psychol. 92, 1087–1101. doi: 10.1037/0022-3514.92.6.1087

Duckworth, A. L., and Quinn, P. D. (2009). Development and validation of the Short Grit Scale (Grit-S). J. Pers. Assess. 91, 166–174. doi: 10.1080/00223890802634290

Duckworth, A. L., and Yeager, D. S. (2015). Measurement matters: assessing personal qualities other than cognitive ability for educational purposes. Educ. Res. 44, 237–251. doi: 10.3102/0013189X15584327

Dudley, N. M., Orvis, K. A., Lebiecki, J. E., and Cortina, J. M. (2006). A meta-analytic investigation of conscientiousness in the prediction of job performance: examining the intercorrelations and the incremental validity of narrow traits. J. Appl. Psychol. 91:40. doi: 10.1037/0021-9010.91.1.40

Dugan, T., and Coles, R. (1989). The Child in Our Times: Studies in the Development of Resiliency. New York, NY: Brunner/Mazel.

*Dullas, A. R. (2018). The development of academic self-efficacy scale for filipino junior high school students. Front. Educ. 3:19. doi: 10.3389/feduc.2018.00019

Dumdumaya, C. E. (2018). “Modeling student persistence in a learning-by-teaching environment,” in Proceedings of the 26th Conference on User Modeling, Adaptation and Personalization (UMAP '18) (New York, NY: Association for Computing Machinery). doi: 10.1145/3209219.3213596

Dusenbury, L., Calin, S., Domitrovich, C., and Weissberg, R. P. (2015). What Does Evidence-Based Instruction in Social and Emotional Learning Actually Look Like in Practice? A Brief on Findings from CASEL's Program Reviews. Chicago, IL: Collaborative for Academic, Social, and Emotional Learning.

Dweck, C. S. (1999). Self-Theories: Their Role in Motivation, Personality, and Development. Philadelphia, PA: Taylor and Francis/Psychology Press.

Dweck, C. S. (2006). Mindset: The New Psychology of Success. New York, NY: Ballantine Books.

Dweck, C. S., and Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychol. Rev. 95, 256–273. doi: 10.1037/0033-295X.95.2.256

Elias, M. J., Zins, J. E., Weissberg, R. P., Frey, K. S., Greenberg, M. T., Haynes, N. M., et al. (1997). Promoting Social and Emotional Learning: Guidelines for Educators. Alexandria, VA: Association for Supervision and Curriculum Development.

Epstein, M. H., and Sharma, H. M. (1998). Behavioral and Emotional Rating Scale: A Strength Based Approach to Assessment. Austin, TX: PRO-ED.

*Esen-Aygun, H., and Sahin-Taskin, C. (2017). Identifying psychometric properties of the social-emotional learning skills scale. Educ. Policy Anal. Strategic Res. 12, 43–61. Available online at: https://epasr.inased.org/makale/308

Eysenck, H. J. (1953). The Structure of Human Personality. London: Methuen.

Fallon, C. M. (2010). School Factors that Promote Academic Resilience in Urban Latino High School Students. Chicago, IL: Loyola University Chicago.

*Fang, G., Chan, P. W. K., and Kalogeropoulos, P. (2020). Social support and academic achievement of Chinese low-income children: a mediation effect of academic resilience. Int. J. Psychol. Res. 13, 19–28. doi: 10.21500/20112084.4480

*Fang, Y., Xu, Y. J., Nye, B., Graesser, A., Pavlik, P., and Hu, X. (2017). “Online learning persistence and academic achievement,” in Proceedings of the 10th International Conference on Educational Data Mining. Available online at: https://eric.ed.gov/?id=ED596608 (Accessed February 23, 2024).

Farrington, C. A., Roderick, M., Allensworth, E., Nagaoka, J., Keyes, T. S., Johnson, D. W., et al. (2012). Teaching Adolescents to Become Learners. The Role of Noncognitive Factors in Shaping School Performance: A Critical Literature Review. Chicago, IL: University of Chicago Consortium on Chicago School Research.

Feather, N. T. (1962). The study of persistence. Psychol. Bull. 59, 84–115. doi: 10.1037/h0042645

Flach, F. F. (1989). Resilience: Discovering a New Strength at Times of Stress. New York, NY: Ballantine Books.

Fletcher, D., and Sarkar, M. (2013). Psychological resilience: a review and critique of definitions, concepts, and theory. Eur. Psychol. 18, 12–23. doi: 10.1027/1016-9040/a000124

Fredrickson, B. L. (2001). The role of positive emotions in positive psychology: the broaden-and-build theory of positive emotions. Am. Psychol. 56, 218–226. doi: 10.1037/0003-066X.56.3.218

*Furlong, M. J., Ritchey, K. M., and O'Brennan, L. M. (2009). Developing norms for the California Resilience Youth Development module: internal assets and school resources subscales. Calif. Sch. Psychol. 14, 35–46. doi: 10.1007/BF03340949

Gamlem, S. M., Kvinge, L. M., Smith, K., Engelsen, K. S., and Hansen, V. L. (2019). Developing teachers' responsive pedagogy in mathematics: does it lead to short-term effects on student learning? Cogent Educ. 6:1676568. doi: 10.1080/2331186X.2019.1676568

García-Martínez, I., Augusto-Landa, J.M., Quijano-López, R., and León, S.P. (2022). Self-concept as a mediator of the relation between university students' resilience and academic achievement. Front. Psychol. 12:747168. doi: 10.3389/fpsyg.2021.747168

Garmezy, N. (1985). “Stress-resistant children: the search for protective factors,” in Recent Research in Developmental Psychopathology (Journal of Child Psychology and Psychiatry Book Supplement No. 4), ed. J. E. Stevenson (Oxford: Pergamon Press), 213–233.

Garmezy, N., Masten, A. S., and Tellegen, A. (1984). The study of stress and competence in children: a building block for developmental psychopathology. Child Dev. 55, 97–111. doi: 10.2307/1129837

*Gartland, D., Bond, L., Olsson, C.A., Buzwell, S., and Sawyer, S. M. (2011). Development of a multi-dimensional measure of resilience in adolescents: the Adolescent Resilience Questionnaire. BMC Med. Res. Methodol. 11:134. doi: 10.1186/1471-2288-11-134

Goldberg, L. R., Johnson, J. A., Eber, H. W., Hogan, R., Ashton, M. C., Cloninger, C. R., et al. (2006). The International Personality Item Pool and the future of public-domain personality measures. J. Res. Pers. 40, 84–96. doi: 10.1016/j.jrp.2005.08.007

Grawemeyer, B., Mavrikis, M., Holmes, W., Gutiérrez-Santos, S., Wiedmann, M., and Rummel, N. (2017). Affective learning: improving engagement and enhancing learning with affect-aware feedback. User Model. User-adapt. Interact. 27, 119–158. doi: 10.1007/s11257-017-9188-z

*Green, A. L., Ferrante, S., Boaz, T. L., Kutash, K., and Wheeldon-Reece, B. (2021). Evaluation of the SPARK child mentoring program: a social and emotional learning curriculum for elementary students. J. Prim. Prev. 42, 531–547. doi: 10.1007/s10935-021-00642-3

*Green, J., Martin, A. J., and Marsh, H. W. (2007). Motivation and engagement in English, mathematics and science high school subjects: towards an understanding of multidimensional domain specificity. Learn. Individ. Differ. 17, 269–279. doi: 10.1016/j.lindif.2006.12.003

Gresham, F. M., and Elliott, S. N. (1990). Social Skills Rating System (SSRS) [Database record]. APA PsycTests. Available online at: https://psycnet.apa.org (Accessed February 23, 2024). doi: 10.1037/t10269-000

*Hanson, T. L., and Kim, J. O. (2007). Measuring Resilience and Youth Development: The Psychometric Properties of the Healthy Kids Survey (Issues and Answers Report, REL 2007–No. 034). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory West. Available online at: http://ies.ed.gov/ncee/edlabs (Accessed February 23, 2024).

Hattie, J. (2009). Visible Learning: A Synthesis of Meta-Analyses Relating to Achievement. New York, NY: Routledge.

Hebb, D. O. (1949). The Organization of Behavior. New York, NY: Wiley.

*Hill, C. (2017). The influence of home environmental factors, socio-emotional factors and academic resilience on reading achievement (Unpublished doctoral dissertation). Bozeman, MT: Montana State University.

Hu, Y.-Q., and Gan, Y.-Q. (2008). Development and psychometric validity of the Resilience Scale for Chinese Adolescents. Acta Psychol. Sin. 40, 902–912. doi: 10.3724/SP.J.1041.2008.00902

Hunsu, N. J., Oje, A. V., Tanner-Smith, E. E., and Adesope, O. (2023). Relationships between risk factors, protective factors and achievement outcomes in academic resilience research: a meta-analytic review. Educ. Res. Rev. 41:100548. doi: 10.1016/j.edurev.2023.100548

*Irfan Arif, M., and Mirza, M. S. (2017). Effectiveness of an intervention program in fostering academic resilience of students at risk of failure at secondary school level. Bull. Educ. Res. 39, 251–264. Available online at: https://files.eric.ed.gov/fulltext/EJ1210192.pdf

*Israel, M., Wherfel, Q. M., Shehab, S., Ramos, E. A., Metzger, A., and Reese, G. C. (2016). Assessing collaborative computing: development of the Collaborative-Computing Observation Instrument (C-COI). Comput. Sci. Educ. 26, 208–233. doi: 10.1080/08993408.2016.1231784

*Jew, C., Green, K. E., and Kroger, J. (1999). Development and validation of a measure of resiliency. Measure. Eval. Counsel. Dev. 32, 75–89. doi: 10.1080/07481756.1999.12068973

Jimerson, S. R., Sharkey, J. D., Nyborg, V., and Furlong, M. J. (2004). Strength-based assessment and school psychology: a summary and synthesis. Calif. Sch. Psychol. 9, 9–19. doi: 10.1007/BF03340903

John, O., and De Fruyt, F. (2015). Framework for the Longitudinal Study of Social and Emotional Skills in Cities. Paris: OECD Publishing.

John, O. P., Naumann, L. P., and Soto, C. J. (2008). “Paradigm shift to the integrative Big Five trait taxonomy: history, measurement, and conceptual issues,” in Handbook of Personality: Theory and Research, 3rd Edn, eds. O. P. John, R. W. Robins, and L. A. Pervin (New York, NY: Guilford), 114–158.

Jones, S. M., and Kahn, J. (2017). The Evidence Base for How We Learn: Supporting Students' Social, Emotional, and Academic Development. Consensus Statements of Evidence from the Council of Distinguished Scientists. Washington, DC: Aspen Institute.

Joseph, J. (1994). The Resilient Child: Preparing Today's Youth for Tomorrow's World. New York, NY: Plenum Press.

Joseph, N. M., Hailu, M., and Boston, D. (2017). Black women's and girls' persistence in the P-20 mathematics pipeline: two decades of children, youth, and adult education research. Rev. Res. Educ. 41, 203–227. doi: 10.3102/0091732X16689045

*Kai, S., Almeda, M. V., Baker, R. S., Heffernan, C., and Heffernan, N. (2018). Decision tree modeling of wheel-spinning and productive persistence in skill builders. J. Educ. Data Mining 10, 36–71. doi: 10.5281/zenodo.3344810

Kälin, S., and Oeri, N. (2024). Linking persistence and executive functions with later academic achievement. Int. J. Behav. Dev. 48, 442–449. doi: 10.1177/01650254241265596

*Kapikiran, S. (2012). Validity and reliability of the academic resilience scale in Turkish high school. Education 132, 474–483. Available online at: https://eric.ed.gov/?id=EJ991100

Kapur, M. (2008). Productive failure. Cogn. Instr. 26, 379–424. doi: 10.1080/07370000802212669

Kennedy, A. C., and Bennett, L. (2006). Urban adolescent mothers exposed to community, family, and partner violence: is cumulative violence exposure a barrier to school performance and participation? J. Interpersonal Violence 21, 750–773. doi: 10.1177/0886260506287314

Kooken, J., Welsh, M. E., McCoach, D. B., Johnston-Wilder, S., and Lee, C. (2016). Development and validation of the Mathematical Resilience Scale. Meas. Eval. Counsel. Dev. 49, 217–242. doi: 10.1177/0748175615596782

Kumpfer, K. L. (1999). “Factors and processes contributing to resilience: the resilience framework,” in Resilience and Development: Positive Life Adaptations, eds. M. D. Glantz and J. L. Johnson (New York, NY: Kluwer), 179–224.

Ladd, G. W., Kochenderfer, B. J., and Coleman, C. C. (1996). Friendship quality as a predictor of young children's early school adjustment. Child Dev. 67, 1103–1118. doi: 10.2307/1131882

Lazarus, R. S., and Folkman, S. (1984). Stress, Appraisal, and Coping. New York, NY: Springer.

Lee, J., and Shute, V. J. (2010). Personal and social-contextual factors in K-12 academic performance: an integrative perspective on student learning. Educ. Psychol. 45, 1–19. doi: 10.1080/00461520.2010.493471

Lerner, M. J., and Miller, D. T. (1978). Just world research and the attribution process: looking back and ahead. Psychol. Bull. 85, 1030–1051. doi: 10.1037/0033-2909.85.5.1030

Lewis, T. J., and Sugai, G. (1999). Effective behavior support: a systems approach to proactive schoolwide management. Focus Except. Child. 31, 1–24. doi: 10.17161/fec.v31i6.6767

*Liebenberg, L., Ungar, M., and Van de Vijver, F. R. R. (2012). Validation of the Child and Youth Resilience Measure-28 (CYRM-28) among Canadian youth with complex needs. Res. Soc. Work Pract. 22, 219–226. doi: 10.1177/1049731511428619

Linnenbrink, E. A. (2005). The dilemma of performance-approach goals: the use of multiple goal contexts to promote students' motivation and learning. J. Educ. Psychol. 97, 197–213. doi: 10.1037/0022-0663.97.2.197

Lipnevich, A., Preckel, F., and Roberts, R. (2017). Psychosocial Skills and School Systems in the 21st Century: Theory, Research, and Practice. Cham: Springer International Publishing.

*Liu, B., and Platow, M. J. (2020). Chinese adolescents' belief in a just world and academic resilience: the mediating role of perceived academic competence. Sch. Psychol. Int. 41, 239–256. doi: 10.1177/0143034320908001

Lodge, J. M., Kennedy, G., Lockyer, L., Arguel, A., and Pachman, M. (2018). Understanding difficulties and resulting confusion in learning: an integrative review. Front. Educ. 3:49. doi: 10.3389/feduc.2018.00049

Lopes da Silva, A., Simão, A. M. V., and Sá, I. (2004). A auto-regulação da aprendizagem: estudos teóricos e empíricos. Intermeio Rev. Mestrado Educação 10, 58–74. Available online at: https://trilhasdahistoria.ufms.br/index.php/intm/article/download/2592/1850

*Lovelace, D. (2022). Does mathematical resilience address mathematics anxiety? A measurement validation study for high school students (Doctoral dissertation). Sacramento State. Available online at: https://hdl.handle.net/20.500.12741/rep:2763 (Accessed December 13, 2023).

Luby, J. L., Svrakic, D. M., McCallum, K., Przybeck, T. R., and Cloninger, C. R. (1999). The junior temperament and character inventory: preliminary validation of a child self-report measure. Psychol. Rep. 84(3_suppl), 1127–1138. doi: 10.2466/pr0.1999.84.3c.1127

*Lufi, D., and Cohen, A. (1987). A scale for measuring persistence in children. J. Pers. Assess. 51, 178–185. doi: 10.1207/s15327752jpa5102_2

Luthar, S. S. (2006). “Resilience in development: a synthesis of research across five decades,” in Developmental Psychopathology: Vol. 3: Risk, Disorder, and Adaptation, 2nd Edn, eds. D. Cicchetti and D. J. Cohen (New York, NY: Wiley), 739–795.

Luthar, S. S., Cicchetti, D., and Becker, B. (2000). The construct of resilience: a critical evaluation and guidelines for future work. Child Dev. 71, 543–562. doi: 10.1111/1467-8624.00164

Lynch, M., and Cicchetti, D. (1998). An ecological-transactional analysis of children and contexts: the longitudinal interplay among child maltreatment, community violence, and children's adaptation. Dev. Psychopathol. 10, 235–257. doi: 10.1017/s095457949800159x

*Mallick, M. K., and Kaur, S. (2016). Academic resilience among senior secondary students: influence of learning environment. Rupkatha J. Interdiscipl. Stud. Human. 8, 20–27. doi: 10.21659/rupkatha.v8n2.03

Mangels, J. A., Butterfield, B., Lamb, J., Good, C., and Dweck, C. S. (2006). Why do beliefs about intelligence influence learning success? A social cognitive neuroscience model. Soc. Cogn. Affect. Neurosci. 1, 75–86. doi: 10.1093/scan/nsl013

Martin, A. J. (2002). Motivation and academic resilience: developing a model for student enhancement. Austral. J. Educ. 46, 34–49. doi: 10.1177/000494410204600104

Martin, A. J. (2005). How to Help Your Child Fly Through Life: The 20 Big Issues. Bantam.

Martin, A. J. (2001). The student motivation scale: a tool for measuring and enhancing motivation. Austral. J. Guid. Counsel. 11, 1–20. doi: 10.1017/S1037291100004301

Martin, A. J. (2003a). How to Motivate Your Child for School and Beyond. Bantam.

Martin, A. J. (2003b). The Student Motivation Scale: further testing of an instrument that measures school students' motivation. Austral. J. Educ. 47, 88–106. doi: 10.1177/000494410304700107

Martin, A. J. (2006). The Motivation and Engagement Scale. Sydney, NSW: Lifelong Achievement Group.

*Martin, A. J. (2007). Examining a multidimensional model of student motivation and engagement using a construct validation approach. Br. J. Educ. Psychol. 77, 413–440. doi: 10.1348/000709906X118036

*Martin, A. J. (2009). Motivation and engagement across the academic lifespan: a developmental construct validity study of elementary school, high school, and university/college students. Educ. Psychol. Measure. 69, 794–824. doi: 10.1177/0013164409332214

*Martin, A. J. (2013). Academic buoyancy and academic resilience: exploring ‘everyday' and ‘classic' resilience in the face of academic adversity. Sch. Psychol. Int. 34, 488–500. doi: 10.1177/0143034312472759

Martin, A. J., Anderson, J., Bobis, J., Way, J., and Vellar, R. (2012). Switching on and switching off in mathematics: an ecological study of future intent and disengagement amongst middle school students. J. Educ. Psychol. 104, 1–18. doi: 10.1037/a0025988

*Martin, A. J., Ginns, P., Burns, E. C., Kennett, R., Munro-Smith, V., Collie, R. J., et al. (2021). Assessing instructional cognitive load in the context of students' psychological challenge and threat orientations: a multi-level latent profile analysis of students and classrooms. Front. Psychol. 12:656994. doi: 10.3389/fpsyg.2021.656994

*Martin, A. J., and Marsh, H. W. (2003). Academic resilience and the four Cs: confidence, control, composure, and commitment [Paper presentation]. NZARE AARE, Auckland.

*Martin, A. J., and Marsh, H. W. (2006). Academic resilience and its psychological and educational correlates: a construct validity approach. Psychol. Sch. 43, 267–282. doi: 10.1002/pits.20149

*Martin, A. J., and Marsh, H. W. (2008a). Academic buoyancy: towards an understanding of students' everyday academic resilience. J. Sch. Psychol. 46, 53–83. doi: 10.1016/j.jsp.2007.01.002

Martin, A. J., and Marsh, H. W. (2008b). Workplace and academic buoyancy: psychometric assessment and construct validity amongst school personnel and students. J. Psychoeduc. Assess. 26, 168–184. doi: 10.1177/0734282907313767

Martin, A. J., and Marsh, H. W. (2009). Academic resilience and academic buoyancy: multidimensional and hierarchical conceptual framing of causes, correlates, and cognate constructs. Oxford Rev. Educ. 35, 353–370. doi: 10.1080/03054980902934639

Masten, A. S. (2001). Ordinary magic: resilience processes in development. Am. Psychol. 56, 227–238. doi: 10.1037//0003-066x.56.3.227

Masten, A. S. (2014). Ordinary Magic: Resilience in Development. Guilford Press.

McCaugham, L. R., and McKinlay, S. (1981). Effects of success/failure and extrinsic rewards on intrinsic motivation using a competitive motor task. Res. Q. Exerc. Sport 52, 208–215.

McClelland, D. C. (1961). The Achieving Society. Van Nostrand.

McClelland, M. M., Tominey, S. L., Schmitt, S. A., and Duncan, R. (2017). SEL interventions in early childhood. Future Child. 27, 33–47. doi: 10.1353/foc.2017.0002

McCrae, R. R., and Costa, P. T. (1987). Validation of the five-factor model of personality across instruments and observers. J. Pers. Soc. Psychol. 52, 81–90. doi: 10.1037/0022-3514.52.1.81

McDermott, P. A. (1999). National scales of differential learning behaviors among American children and adolescents. Sch. Psych. Rev. 28, 280–291. doi: 10.1080/02796015.1999.12085965

McDougall, W. (1908). An Introduction to Social Psychology. London: Methuen.

McTigue, E. M., Washburn, E. K., and Liew, J. (2009). Academic resilience and reading: building successful readers. Read. Teach. 62, 422–432. doi: 10.1598/RT.62.5.5

Merrell, K. W. (2008). Social and Emotional Assets and Resiliency Scales (SEARS). Eugene, OR: School psychology program, University of Oregon. Available online at: http://strongkids.uoregon.edu/SEARS.html

Merrell, K. W. (2011). Social and Emotional Assets and Resilience Scales (SEARS). Lutz, FL: Psychological Assessment Resources.

Merrell, K. W., Cohn, B. P., and Tom, K. M. (2011a). Development and validation of a teacher report measure for assessing social-emotional strengths of children and adolescents. Sch. Psychol. Rev. 40, 226–241. doi: 10.1080/02796015.2011.12087714

Merrell, K. W., Felver-Gant, J. C., and Tom, K. M. (2011b). Development and validation of a parent report measure for assessing social-emotional competencies in children and adolescents. J. Child Fam. Stud. 20, 529–540. doi: 10.1007/s10826-010-9425-0

*Miller, R. B., Greene, B. A., Montalvo, G. P., Ravindran, B., and Nichols, J. D. (1996). Engagement in academic work: the role of learning goals, future consequences, pleasing others, and perceived ability. Contemp. Educ. Psychol. 21, 388–422. doi: 10.1006/ceps.1996.0028

Molnár, G., and Kocsis, Á. (2024). Cognitive and non-cognitive predictors of academic success in higher education: a large-scale longitudinal study. Stud. Higher Educ. 49, 1610–1624. doi: 10.1080/03075079.2023.2271513

Montalvo, F., and González Torres, M. C. (2004). Self-regulated learning: current and future directions. Electron. J. Res. Educ. Psychol. 2, 1–34. Available online at: https://eric.ed.gov/?id=EJ802298

Moore, G. R., and Shute, V. J. (2017). “Improving learning through stealth assessment of conscientiousness,” in Handbook on Digital Learning for K-12 Schools, eds. A. Marcus-Quinn and T. Hourigan (Springer), 355–368. doi: 10.1007/978-3-319-33808-8_21

Morales, E. E. (2014). Learning from success: how original research on academic resilience informs what college faculty can do to increase the retention of low socioeconomic status students. Int. J. High. Educ. 3, 92–102. doi: 10.5430/ijhe.v3n3p92

Morales, E. E., and Trotman, F. K. (2004). Promoting Academic Resilience in Multicultural America: Factors Affecting Student Success. New York, NY: Peter Lang.

Morrison, G. M., Brown, M., D'Incau, B., O'Farrell, S. L., and Furlong, M. J. (2006). Understanding resilience in educational trajectories: implications for protective possibilities. Psychol. Sch. 43, 19–31. doi: 10.1002/pits.20126

Mrazek, P. J., and Mrazek, D. A. (1987). Resilience in child maltreatment victims: a conceptual exploration. Child Abuse Neglect 11, 357–365. doi: 10.1016/0145-2134(87)90009-3

Muenks, K., Wigfield, A., Yang, J. S., and O'Neal, C. R. (2017). How true is grit? Assessing its relations to high school and college students' personality characteristics, self-regulation, engagement, and achievement. J. Educ. Psychol. 109, 599–620. doi: 10.1037/edu0000153

*Nese, R. N. T., Doerner, E., Romer, N., Kaye, N. C., Merrell, K. W., and Tom, K. M. (2012). Social emotional assets and resilience scales: development of a strength-based short-form behavior rating scale system. J. Educ. Res. Online 4, 124–139. doi: 10.25656/01:7054

*Njoki, G. P. (2018). Academic self-concept, motivation and resilience as predictors of mathematics achievement among secondary school students in Nairobi County, Kenya (Unpublished doctoral dissertation). Nairobi: Kenyatta University.

Noble, T., and McGrath, H. (2015). PROSPER: a new framework for positive education. Psychol. WellBeing 5:2. doi: 10.1186/s13612-015-0030-2

Noftle, E. E., and Robins, R. W. (2007). Personality predictors of academic outcomes: big five correlates of GPA and SAT scores. J. Pers. Soc. Psychol. 93, 116–130. doi: 10.1037/0022-3514.93.1.116

*OECD (2021a). Beyond Academic Learning: First Results from the Survey of Social and Emotional Skills. Paris: OECD Publishing.

*OECD (2021b). Survey of Social and Emotional Skills Technical Report. Paris: OECD Publishing. Available online at: https://www.oecd.org/education/ceri/social-emotional-skills-study/sses-technical-report.pdf (Accessed February 12, 2024).

Oshio, A., Kaneko, H., Nagamine, S., and Nakaya, M. (2003). Construct validity of the adolescent resilience scale. Psychol. Rep. 93, 1217–1222. doi: 10.2466/pr0.2003.93.3f.1217

Parker, J. D. A., Saklofske, D. H., Wood, L. M., and Collin, V. T. (2009). “The role of emotional intelligence in education,” in Assessing Emotional Intelligence: Theory, Research, and Applications, eds. C. Stough, D. H. Saklofske, and J. D. A. Parker (New York, NY: Springer), 239–255.

*Paulino, P., Sá, I., and da Silva, A. L. (2016). Students' motivation to learn in middle school—a self-regulated learning approach. Electron. J. Res. Educ. Psychol. 14, 193–225. doi: 10.25115/ejrep.39.15169

Payton, J., Weissberg, R. P., Durlak, J. A., Dymnicki, A. B., Taylor, R. D., Schellinger, K. B., et al. (2008). The Positive Impact of Social and Emotional Learning for Kindergarten to Eighth-Grade Students: Findings From Three Scientific Reviews. Collaborative for Academic, Social, and Emotional Learning. Available online at: https://eric.ed.gov/?id=ED505370

Peak, H. (1955). “Attitude and motivation,” in Nebraska Symposium on Motivation, ed. M. R. Jones (University of Nebraska Press), 149–189.

Pekrun, R. (2006). The control-value theory of achievement emotions: assumptions, corollaries, and implications for educational research and practice. Educ. Psychol. Rev. 18, 315–341. doi: 10.1007/s10648-006-9029-9

Pellegrino, J. W., and Hilton, M. L. (2012). Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century. Washington, DC: The National Academies Press.

Perry, S. J., Hunter, E. M., Witt, L. A., and Harris, K. J. (2010). P=f (conscientiousness × ability): examining the facets of conscientiousness. Hum. Perform. 23, 343–360. doi: 10.1080/08959285.2010.501045

*Phan, H. P. (2016). Interrelations that foster learning: an investigation of two correlation studies. Int. J. Psychol. 51, 185–195. doi: 10.1002/ijop.12127

Pintrich, P. R. (2000). “The role of goal orientation in self-regulated learning,” in Handbook of Self-Regulation, eds. M. Boekaerts, P. R. Pintrich, and M. Zeidner (Academic Press), 451–502. doi: 10.1016/B978-012109890-2/50043-3

Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in learning and teaching contexts. J. Educ. Psychol. 95, 667–686. doi: 10.1037/0022-0663.95.4.667

Pintrich, P. R., and De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. J. Educ. Psychol. 82, 33–40. doi: 10.1037/0022-0663.82.1.33

Pintrich, P. R., Smith, D. A. F., Garcia, T., and McKeachie, W. J. (1991). A Manual for the Use of the Motivated Strategies for Learning Questionnaire (MSLQ). Technical Report No. 91-B-004. The Regents of the University of Michigan. Available online at: https://files.eric.ed.gov/fulltext/ED338122.pdf (Accessed February 23, 2024).

Poropat, A. E. (2009). A meta-analysis of the five-factor model of personality and academic performance. Psychol. Bull. 135, 322–338. doi: 10.1037/a0014996

*Porter, T., Catalán Molina, D., Blackwell, L., Roberts, S., Quirk, A., Duckworth, A. L., et al. (2020). Measuring mastery behaviors at scale: the persistence, effort, resilience and challenge-seeking task (PERC). J. Learn. Anal. 7, 5–18. doi: 10.18608/jla.2020.71.2

*Rahimi, S., Shute, V., and Qian, Z. (2021). The effects of game and student characteristics on persistence in educational games: a hierarchical linear modeling approach. Int. J. Technol. Educ. Sci. 5, 141–165. doi: 10.46328/ijtes.118

*Rajan, S. K., Harifa, P. R., and Pienyu, R. (2017). Academic resilience, locus of control, academic engagement and self-efficacy among the school children. Indian J. Positive Psychol. 8, 507–511. Available online at: https://iahrw.org/product/academic-resilience-locus-of-control-academic-engagement-and-self-efficacy-among-the-school-children/

*Ramezanpour, A., Kouroshnia, M., Mehryar, A., and Javidi, H. (2019). Psychometric evaluation of the academic resilience scale (ARS-30) in Iran. Iran. Evol. Educ. Psychol. J. 1, 144–150. doi: 10.29252/ieepj.1.3.144

Reimers, N., and Gurevych, I. (2019). Sentence-BERT: sentence embeddings using siamese BERT-networks. arXiv. Available online at: http://arxiv.org/abs/1908.10084 (Accessed July 15, 2025).

Reis, S. M., Colbert, R. D., and Hébert, T. P. (2004). Understanding resilience in diverse, talented students in an urban high school. Roeper Rev. 27, 110–120. doi: 10.1080/02783190509554299

Reivich, K., and Shatté, A. (2003). The Resilience Factor: 7 Keys to Finding Your Inner Strength and Overcoming Life's Hurdles. New York, NY: Broadway Books.

Richardson, M., Abraham, C., and Bond, R. (2012). Psychological correlates of university students' academic performance: a systematic review and meta-analysis. Psychol. Bull. 138, 353–387. doi: 10.1037/a0026838

Richey, J. E., Andres-Bray, J. M. L., Mogessie, M., Scruggs, R., Andres, J. M. A. L., Star, J. R., et al. (2019). More confusion and frustration, better learning: the impact of erroneous examples. Comput. Educ. 139, 173–190. doi: 10.1016/j.compedu.2019.05.012

*Ricketts, S. (2015). Academic resilience in mathematics (Publication No. 3736854) (Doctoral dissertation). Atlanta, GA: Emory University. ProQuest Dissertations and Theses Global.

*Ricketts, S. N., Engelhard, G. Jr., and Chang, M. L. (2017). Development and validation of a scale to measure academic resilience in mathematics. Eur. J. Psychol. Assess. 33, 79–86. doi: 10.1027/1015-5759/a000274

*Rikoon, S. H., McDermott, P. A., and Fantuzzo, J. W. (2012). Approaches to learning among head start alumni: structure and validity of the Learning Behaviors Scale. Sch. Psych. Rev. 41, 272–294. doi: 10.1080/02796015.2012.12087509

Roberts, B. W., Chernyshenko, O. S., Stark, S., and Goldberg, L. R. (2005). The structure of conscientiousness: an empirical investigation based on seven major personality questionnaires. Pers. Psychol. 58, 103–139. doi: 10.1111/j.1744-6570.2005.00301.x

Rojas, J. P., Reser, J. A., Usher, E. L., and Toland, M. D. (2012). Psychometric Properties of the Academic Grit Scale. Lexington, KY: University of Kentucky. Available online at: http://sites.education.uky.edu/motivation/files/2013/08/PojasPeserTolandUsher.pdf

Rudd, G., Meissel, K., and Meyer, F. (2021). Measuring academic resilience in quantitative research: a systematic review of the literature. Educ. Res. Rev. 34:100402. doi: 10.1016/j.edurev.2021.100402

Rutter, M. (1984). Resilient children. Why some disadvantaged children overcome their environments, and how we can help. Psychol. Today, 57–65.

Rutter, M. (2000). “Resilience reconsidered: conceptual considerations, empirical findings, and policy implications,” in Handbook of Early Intervention, 2nd Edn, eds. J. P. Shonkoff and S. J. Meisels (New York, NY: Cambridge University Press), 651–681.

Ryan, R. M., and Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am. Psychol. 55, 68–78. doi: 10.1037/0003-066X.55.1.68

Ryans, D. G. (1938). The meaning of persistence. J. Gen. Psychol. 19, 79–96.

*Sari, P. R. (2022). The influence of individual internal protective factors on student's academic resilience. J. Positive Sch. Psychol. 6, 1236–1255. Available online at: https://journalppw.com/index.php/jpsp/article/view/12415/8042

*Sarwar, M., Inamullah, H., Khan, N., and Anwar, N. (2010). Resilience and academic achievement of male and female secondary level students in Pakistan. J. Coll. Teach. Learn. 7, 19–24. doi: 10.19030/tlc.v7i8.140

Savickas, M. L., Nota, L., Rossier, J., Dauwalder, J. P., Duarte, M. E., Guichard, J., et al. (2009). Life designing: a paradigm for career construction in the 21st century. J. Vocat. Behav. 75, 239–250. doi: 10.1016/j.jvb.2009.04.004

Schaufeli, W. B., Salanova, M., González-Romá, V., and Bakker, A. B. (2002b). The measurement of engagement and burnout: a two sample confirmatory factor analytic approach. J. Happiness Stud. 3, 71–92. doi: 10.1023/A:1015630930326

Schaufeli, W. B., Martinez, I. M., Pinto, A. M., Salanova, M., and Bakker, A. B. (2002a). Burnout and engagement in university students: a cross-national study. J. Cross Cult. Psychol. 33, 464–481.

Schunk, D. H. (1991). Self-efficacy and academic motivation. Educ. Psychol. 26, 207–231. doi: 10.1207/s15326985ep2603&4_2

Schunk, D. H., and Pajares, F. (2002). “The development of academic self-efficacy,” in Development of Achievement Motivation, eds. A. Wigfield and J. S. Eccles (Academic Press), 15–31. doi: 10.1016/B978-012750053-9/50003-6

Schunk, D. H., Pintrich, P. R., and Meece, J. L. (2002). Motivation in Education: Theory, Research, and Applications, 2nd Edn. Pearson.

Schunk, D. H., and Zimmerman, B. J. (2008). Motivation and Self-Regulated Learning: Theory, Research, and Applications. New York, NY: Lawrence Erlbaum Associates.

Schwarzer, R. (1998). Optimism, goals, and threats: how to conceptualize self-regulatory processes in the adoption and maintenance of health behaviors. Psychol. Health 13, 759–766. doi: 10.1080/08870449808407430

*Sewell, M. N., Yoon, H. J., Lechner, C. M., Napolitano, C. M., Rammstedt, B., Roberts, B. W., et al. (2024). Assessing social, emotional, and behavioral skills in just a few minutes: 96-, 45-, and 20-item short forms of the BESSI. Assessment 32, 501–520. doi: 10.1177/10731911241256434

*Skinner, E., Pitzer, J., and Steele, J. (2013). Coping as part of motivational resilience in school: a multidimensional measure of families, allocations, and profiles of academic coping. Educ. Psychol. Meas. 20, 1–33. doi: 10.1177/0013164413485241

Skinner, E. A., Kindermann, T. A., Vollet, J. W., and Rickert, N. P. (2022). Complex social ecologies and the development of academic motivation. Educ. Psychol. Rev. 34, 2129–2165. doi: 10.1007/s10648-022-09714-0

Skinner, E. A., and Pitzer, J. R. (2012). “Developmental dynamics of student engagement, coping, and everyday resilience,” in Handbook of Research on Student Engagement, eds. S. Christenson, A. Reschly, and C. Wylie (Boston, MA: Springer), 21–44. doi: 10.1007/978-1-4614-2018-7_2

Skinner, E. A., and Wellborn, J. G. (1997). “Children's coping in the academic domain,” in Handbook of Children's Coping With Common Stressors: Linking Theory and Intervention, eds. S. A. Wolchik and I. N. Sandler (New York, NY: Plenum), 387–422.

Soto, C. J., Napolitano, C. M., and Roberts, B. W. (2021). Taking skills seriously: toward an integrative model and agenda for social, emotional, and behavioral skills. Curr. Dir. Psychol. Sci. 30, 26–33. doi: 10.1177/0963721420978613

*Soto, C. J., Napolitano, C. M., Sewell, M. N., Yoon, H. J., and Roberts, B. W. (2022). An integrative framework for conceptualizing and assessing social, emotional, and behavioral skills: the BESSI. J. Pers. Soc. Psychol. 123, 192–222. doi: 10.1037/pspp0000401

Stevens, A., Garritty, C., Hersi, M., and Moher, D. (2018). Developing PRISMA-RR, a reporting guideline for rapid reviews of primary studies (Protocol). Equator Netw. Available online at: http://www.equator-network.org/wp-content/uploads/2018/02/PRISMA-RR-protocol.pdf

Sulea, C., Virga, D., Maricutoiu, L. P., Schaufeli, W., Dumitru, C. Z., and Sava, F. A. (2012). Work engagement as mediator between job characteristics and positive and negative extra-role behaviors. Career Dev. Int. 17, 188–207. doi: 10.1108/13620431211241054

*Sun, J., and Stewart, D. (2007). Development of population-based resilience measures in the primary school setting. Health Educ. 7, 575–599. doi: 10.1108/09654280710827957

*Sunny, C. E. (2018). Stakeholders' conceptualization of students' attitudes and persistence towards STEM: a mixed methods instrument development and validation study (Doctoral dissertation). Cincinnati, OH: University of Cincinnati.

*Susanto, H., Wedyaswari, M., and Dalimunthe, K. L. (2023). A content validity and cognitive interview to develop the HARMONI items: instrument measuring student well-being in West Java, Indonesia. Front. Educ. 8:1132031. doi: 10.3389/feduc.2023.1132031

Sweller, J. (2012). “Human cognitive architecture: why some instructional procedures work and others do not,” in APA Educational Psychology Handbook, Vol. 1: Theories, Constructs, and Critical Issues, eds. K. R. Harris et al. (American Psychological Association), 295–325. doi: 10.1037/13273-011

Sweller, J., Ayres, P., and Kalyuga, S. (2011). Cognitive Load Theory. Springer. doi: 10.1007/978-1-4419-8126-4

*Tan, L., Sun, X., and Khoo, S. T. (2014). “Can engagement be compared? Measuring academic engagement for comparison,” in International Conference on Educational Data Mining, 213–216. Available online at: http://educationaldatamining.org/EDM2014/uploads/procs2014/short~papers/213_EDM-2014-Short.pdf (Accessed January 26, 2024).

*Thornton, B., Collins, M., and Daugherty, R. (2006). A study of resilience of American Indian high school students. J. Am. Indian Educ. 45, 4–16. Available online at: https://jaie.asu.edu/sites/default/files/451_2006_1_thomton_et_al.pdf

Tough, P. (2016). How kids learn resilience. Atlantic 317, 56–66. Available online at: https://www.theatlantic.com/magazine/archive/2016/06/how-kids-really-succeed/480744/

Tudor, K. E., and Spray, C. M. (2018). Approaches to measuring academic resilience: a systematic review. Int. J. Res. Stud. Educ. 7, 41–61. doi: 10.5861/ijrse.2017.1880

Tugade, M. M., and Fredrickson, B. L. (2004). Resilient individuals use positive emotions to bounce back from negative emotional experiences. J. Pers. Soc. Psychol. 86, 320–333. doi: 10.1037/0022-3514.86.2.320

Ungar, M. (2008). Resilience across cultures. Br. J. Soc. Work 38, 218–235. doi: 10.1093/bjsw/bcl343

Ungar, M., Ghazinour, M., and Richter, J. (2013). Annual research review: what is resilience within the social ecology of human development? J. Child Psychol. Psychiatry 54, 348–366. doi: 10.1111/jcpp.12025

*Ungar, M., and Liebenberg, L. (2011). Assessing resilience across cultures using mixed-methods: construction of the child and youth resilience measure. J. Mix. Methods Res. 5, 126–149. doi: 10.1177/1558689811400607

*Uslu, N. A. (2023). How do computational thinking self-efficacy and performance differ according to secondary school students' profiles? The role of computational identity, academic resilience, and gender. Educ. Inf. Technol. 28, 6115–6139. doi: 10.1007/s10639-022-11425-6

Vallerand, R. J. (2015). The Psychology of Passion: A Dualistic Model. Oxford University Press. doi: 10.1093/acprof:oso/9780199777600.001.0001

Vannest, K. J., Ura, S. K., Lavadia, C., and Zolkoski, S. (2019). Self-report measures of resilience in children and youth. Contemp. Sch. Psychol. 25, 406–415. doi: 10.1007/s40688-019-00252-1

*Ventura, M., and Shute, V. (2013). The validity of a game-based assessment of persistence. Comput. Hum. Behav. 29, 2568–2572. doi: 10.1016/j.chb.2013.06.033

Ventura, M., Shute, V., and Zhao, W. (2013). The relationship between video game use and a performance-based measure of persistence. Comput. Educ. 60, 52–58. doi: 10.1016/j.compedu.2012.07.003

*Victor-Aigboidion, V., Onyishi, C. N., and Ngwoke, D. U. (2018). Predictive power of academic self-efficacy on academic resilience among secondary school students. J. Nigerian Coun. Educ. Psychol. 12, 294–306. Available online at: https://journals.ezenwaohaetorc.org/index.php/NCEP/article/view/1144

Wagnild, G. M., and Collins, J. A. (2009). Assessing resilience. J. Psychosoc. Nurs. Ment. Health Serv. 47, 28–33. doi: 10.3928/02793695-20091103-01

Wang, M. C., Haertel, G. D., and Walberg, H. J. (1994). “Educational resilience in inner cities,” in Educational Resilience in Inner-city America: Challenges and Prospects, eds. M. C. Wang and E. W. Gordon (New York, NY: Routledge), 45–72.

Wang, M.-T., Degol, J. L., and Henry, D. A. (2019). An integrative development-in-sociocultural-context model for children's engagement in learning. Am. Psychol. 74, 1086–1102. doi: 10.1037/amp0000522

Wang, M. C., Haertel, G. D., and Walberg, H. J. (1997). “Educational resilience in inner-city schools,” in Children and Youth, Vol. 7, eds. M. C. Wang, G. D. Haertel, and H. J. Walberg (Sage Publications), 119–140. Available online at: http://files.eric.ed.gov/fulltext/ED419856.pdf

Waxman, H. C., Gray, J. P., and Padron, Y. N. (2003). Review of Research on Educational Resilience. UC Berkeley; Center for Research on Education, Diversity and Excellence. Available online at: https://escholarship.org/uc/item/7x695885

*Wei, X., Zhuang, M., and Xue, L. (2023). Father presence and resilience of Chinese adolescents in middle school: psychological security and learning failure as mediators. Front. Psychol. 13:1042333. doi: 10.3389/fpsyg.2022.1042333

Wenger, E. (2000). Communities of practice and social learning systems. Organization 7, 225–246. doi: 10.1177/135050840072002

Werner, E. E. (1992). The children of Kauai: resiliency and recovery in adolescence and adulthood. J. Adolesc. Health 13, 262–268. doi: 10.1016/1054-139x(92)90157-7

Werner, E. E. (2000). “Protective factors and individual resilience,” in Handbook of Early Childhood Intervention, 2nd Edn, eds. J. P. Shonkoff and S. J. Meisels (New York, NY: Cambridge University Press), 115–132.

West, M. R., Buckley, K., Krachman, K. B., and Bookman, N. (2018). Development and implementation of student social-emotional surveys in the CORE districts. J. Appl. Dev. Psychol. 55, 119–129. doi: 10.1016/j.appdev.2017.06.001

West, M. R., Kraft, M. A., Finn, A. S., Martin, R. E., Duckworth, A. L., Gabrieli, C. F., et al. (2016). Promise and paradox: measuring students' non-cognitive skills and the impact of schooling. Educ. Eval. Policy Anal. 38, 148–170. doi: 10.3102/0162373715597298

WestEd. (2002). Resilience and Youth Development Module: Aggregated California Data Fall 1999 to Spring 2002. Los Alamitos, CA: Author.

Windle, G. (2010). What is resilience? A review and concept analysis. Rev. Clin. Gerontol. 21, 1–18. doi: 10.1017/S0959259810000420

Windle, G., Bennett, K. M., and Noyes, J. (2011). A methodological review of resilience measurement scales. Health Qual. Life Outcomes 9:8. doi: 10.1186/1477-7525-9-8

*Yang, R. (2014). The role of non-cognitive skills in students' academic performance and life satisfaction: a longitudinal study of resilience (Doctoral dissertation). Philadelphia, PA: University of Pennsylvania.

Yang, S., and Wang, W. (2022). The role of academic resilience, motivational intensity, and their relationship in EFL learners' academic achievement. Front. Psychol. 12:823537. doi: 10.3389/fpsyg.2021.823537

Yaure, R. G., Murowchick, E., Schwab, J. E., and Jacobson-McConnell, L. (2021). How grit and resilience predict successful academic performance. J. Access Retention Inclusion Higher Educ. 3, 1–15. Available online at: https://digitalcommons.wcupa.edu/jarihe/vol3/iss1/6/

Yeager, D. S., and Dweck, C. S. (2012). Mindsets that promote resilience: when students believe that personal characteristics can be developed. Educ. Psychol. 47, 302–314. doi: 10.1080/00461520.2012.722805

*Yeung, A. S., Mooney, M., Barker, K. L., and Dobia, B. (2009). Does school-wide positive behaviour system improve learning in primary schools? Preliminary findings. New Horizons Educ. 57, 17–32. Available online at: http://files.eric.ed.gov/fulltext/EJ860815.pdf

Zhang, S., Bergner, Y., DiTrapani, J., and Jeon, M. (2021). Modeling the interaction between resilience and ability in assessments with allowances for multiple attempts. Comput. Human Behav. 122:106847. doi: 10.1016/j.chb.2021.106847

Zimmerman, B. J. (2002). Becoming a self-regulated learner: an overview. Theory Pract. 41, 64–70. doi: 10.1207/s15430421tip4102_2

Zimmerman, B. J., and Moylan, A. R. (2009). “Self-regulation: where metacognition and motivation intersect,” in Handbook of Metacognition in Education, eds. D. J. Hacker, J. Dunlosky, and A. C. Graesser (New York, NY: Routledge/Taylor & Francis), 299–315.

Zimmerman, B. J., and Schunk, D. H. (2001). “Theories of self-regulated learning and academic achievement: an overview and analysis,” in Self-Regulated Learning and Academic Achievement: Theoretical Perspectives, 2nd Edn, eds. B. J. Zimmerman and D. H. Schunk (Mahwah, NJ: Lawrence Erlbaum Associates), 1–37. doi: 10.4324/9781410601032

*Zulfikar, Z., Hidayah, N., Triyono, T., and Hitipeuw, I. (2020). Development study of academic resilience scale for gifted young scientists education. J. Educ. Gifted Young Scientists 8, 343–358. doi: 10.17478/jegys.664116

Keywords: persistence, academic resilience, social-emotional learning (SEL), self-report surveys, behavioral measures

Citation: Sparks JR, Lehman B, Gladstone JR, Zhang S, Schroeder NL and Israel M (2025) Measuring persistence and academic resilience of K−12 students: systematic review and operational definitions. Front. Educ. 10:1673500. doi: 10.3389/feduc.2025.1673500

Received: 26 July 2025; Accepted: 22 September 2025;
Published: 13 November 2025.

Edited by:

Wei Shin Leong, Ministry of Education, Singapore

Reviewed by:

Hui Yong Tay, Nanyang Technological University, Singapore
Mona Gamal Mohamed, RAK Medical and Health Sciences University, United Arab Emirates
Moch. Syihabudin Nuha, State University of Malang, Indonesia

Copyright © 2025 Sparks, Lehman, Gladstone, Zhang, Schroeder and Israel. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Jesse R. Sparks, jsparks@ets.org

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.