
ORIGINAL RESEARCH article

Front. Psychol., 01 July 2021
Sec. Educational Psychology
This article is part of the Research Topic "Recent Approaches for Assessing Cognitive Load from a Validity Perspective."

Assessing Instructional Cognitive Load in the Context of Students' Psychological Challenge and Threat Orientations: A Multi-Level Latent Profile Analysis of Students and Classrooms

  • 1University of New South Wales, Kensington, NSW, Australia
  • 2The University of Sydney, Sydney, NSW, Australia
  • 3Macquarie University, Sydney, NSW, Australia
  • 4The King's School, North Parramatta, NSW, Australia

To better understand instructional cognitive load, it is important to operationalize and assess it in novel ways that can reveal how different students perceive and experience this load as either challenging or threatening. The present study administered a recently developed instruction assessment tool—the Load Reduction Instruction Scale-Short (LRIS-S)—to N = 2,071 students in 188 high school science classrooms. Multilevel latent profile analysis (LPA) was used to identify student and classroom profiles based on students' reports of instructional cognitive load (load reduction instruction, LRI; using the LRIS-S) and their accompanying psychological challenge orientations (self-efficacy and growth goals), and psychological threat orientations (anxiety and failure avoidance goals). In phase 1 of analyses (investigating students; Level 1), we identified 5 instructional-psychological student profiles that represented different presentations of instructional load, challenge orientation, and threat orientation, ranging from the most maladaptive profile (the Instructionally-Overburdened & Psychologically-Resigned profile) to the most adaptive profile (Instructionally-Optimized & Psychologically-Self-Assured profile). The derived profiles revealed that similar levels of perceived instructional load can be accompanied by different levels of perceived challenge and threat. For example, we identified two profiles that were both instructionally-supported but who varied in their accompanying psychological orientations. Findings also identified profiles where students were dually motivated by both challenge and threat. In turn, these profiles (and their component scores) were validated through their significant associations with persistence, disengagement, and achievement. 
In phase 2 of analyses (investigating students and classrooms; Levels 1 and 2), we identified 3 instructional-psychological classroom profiles that varied in instructional cognitive load, challenge orientations, and threat orientations: Striving classrooms, Thriving classrooms, and Struggling classrooms. These three classroom profiles (and their component scores) were also validated through their significant associations with classroom-average persistence, disengagement, and achievement—with Struggling classrooms reflecting the most maladaptive outcomes and Thriving classrooms reflecting the most adaptive outcomes. Taken together, findings show that considering instructional cognitive load (and new approaches to empirically assessing it) in the context of students' accompanying psychological orientations can reveal unique insights about students' learning experiences and about important differences between classrooms in terms of the instructional load that is present.

Introduction

In this study, we propose that instructional cognitive load is likely to be perceived and experienced in different ways by different students. Following cognitive appraisal theory (e.g., Lazarus and Folkman, 1984; for reviews, see Roseman and Smith, 2001; Moors et al., 2013), we suggest that some students will perceive cognitive load in an approach- and challenge-oriented way, while other students will perceive cognitive load in an avoidant- and threat-oriented way. This being the case, we suggest that we can better understand instructional cognitive load when we consider it in the context of students' accompanying psychological challenge and threat orientations. The present study does so from a person-centered perspective (using latent profile analysis; LPA) based on students' reports of instructional load (load reduction instruction, LRI; using the Load Reduction Instruction Scale-Short, LRIS-S) and their accompanying psychological challenge orientations (self-efficacy and growth goals) and psychological threat orientations (anxiety and failure avoidance goals).

In addressing these issues, we adopt a construct validation approach. Researchers in educational psychology have long emphasized the importance of evaluating instruments within a construct validation framework (e.g., Marsh, 2002; Martin, 2007, 2009). Construct validation can be classified in terms of within-network and between-network approaches. Within-network approaches explore the internal structure of a construct or network and between-network approaches typically seek to establish a logical, theoretically meaningful pattern of relations between constructs and networks. Both approaches tend to employ variable-centered analyses such as reliability and factor analysis (for within-network validity) and correlation and regression (for between-network validity). Indeed, this study's central measurement tool (the Load Reduction Instruction Scale; LRIS) has been validated using these variable-centered within- and between-network approaches (Martin and Evans, 2018).

However, as discussed below, variable-centered approaches to construct validation can mask important phenomena among subpopulations of the wider population. Person-centered approaches, on the other hand, are well-placed for validation at a more granular subpopulation level. Therefore, the present investigation applies a person-centered approach to the assessment of within- and between-network validity. Using multilevel LPA and integrating and synthesizing cognitive load theory (Sweller, 2012) with cognitive appraisal theory (Lazarus and Folkman, 1984), we test within-network validity by identifying a network of theoretically plausible student and classroom profiles that are based on student reports of instructional load (via the LRIS) and students' accompanying psychological challenge and threat orientations. We then test between-network validity by exploring the links between the network of student and classroom profiles and a theoretically plausible network of outcome variables (persistence, disengagement, achievement). In essence, then, the present study contributes a novel multilevel construct validity approach to person-centered analysis in a bid to understand the nomological network of instructional cognitive load. Figure 1 shows the hypothesized model.


Figure 1. Hypothesized models tested in the study at the student- and classroom-level.

This approach to construct validity draws on guidelines advanced by methodologists in the measurement space—very much inspired by the seminal work of Campbell and Fiske (1959). Psychological research (including educational psychology research) typically involves hypothetical constructs that are unobservable, conceptual, or theoretical abstractions (Marsh et al., 2006). Constructs are often inferred indirectly via observable indicators. A vital question, then, is how well these indicators represent the hypothetical construct, including the extent to which the construct is well-represented by derived scores, well-defined, and related to variables to which it should be theoretically connected (Marsh, 2002; Marsh et al., 2006). In light of these critical questions, it is recommended that construct validity research comprise multiple perspectives based on multiple methods. According to Marsh et al. (2006), this involves the use of "multiple indicators of each construct, multiple constructs and tests of their a priori relations, multiple outcome measures, multiple independent/manipulated variables, multiple methodological approaches … the multiple perspectives provide a foundation for evaluating construct validity based on appropriate patterns of convergence and divergence and for refining measurement instruments, hypotheses, theory, and research agendas" (p. 442). Accordingly, the present construct validation study includes latent factors (composed of multiple indicators), multiple factors and tests of their relations, numerous outcome measures, and the integration of two analytical methods by way of LPA and multilevel modeling. In these ways, we claim to provide a unique perspective on multilevel construct validity.

Cognitive Load Theory

Cognitive load theory has identified principles of instruction that are aimed at easing the cognitive burden on students as they learn (Sweller et al., 2011; Sweller, 2012). According to cognitive load theory, there are two kinds of cognitive load that can be imposed during instruction and that impede learning: intrinsic and extraneous load (Sweller et al., 2011). Intrinsic cognitive load is a function of the inherent complexity of instructional activity and material, given the learner's prior knowledge. Teachers can manage intrinsic cognitive load by presenting instructional material that is appropriate to the level of knowledge of students (Sweller et al., 2011). Extraneous cognitive load emanates from the way that instructional material is structured and presented (Sweller et al., 2011). Teachers can manage this latter form of load by presenting instructional material sequentially, clearly, and explicitly to students; in doing so, students are guided through learning in a structured and linear fashion, leading to low extraneous load. However, extraneous load is high when instructional material is presented such that students need to figure out the informational structure, have to decide between a range of potential solutions, and/or apply information about which they have low prior knowledge (Sweller et al., 2011). Extraneous cognitive load is identified as an unnecessary burden on students as it does not contribute to learning (Sweller et al., 2011).

To date, the bulk of research into cognitive load theory has been experimental, with cognitive load induced through the presentation and manipulation of instructional/learning material to elicit conditions of low or high cognitive load (see Sweller, 2012 for a review). There is significant research assessing cognitive load through self-reports of cognitive burden (e.g., Leppink et al., 2013; Krell, 2017), and via other approaches such as electrodermal activity, neurological activity, eye tracking, blood flow, and physical pressure exerted on a computer mouse (e.g., Paas et al., 1994; Ikehara and Crosby, 2005; Wang et al., 2014; Howard et al., 2015; Ghaderyan et al., 2018). This research has assessed the presence of cognitive load, as well as the instructional techniques aimed at managing or reducing cognitive load. Much of this research has taken place under experimental conditions. This experimental work has significant internal validity and has been critical to rigorously measuring cognitive load and identifying some major effects that are now well-established in the cognitive load tradition (e.g., the expertise reversal effect, modality effect, and split attention effect). Alongside this experimental work, it is also important to extend assessment to attend to other matters of validity—such as conducting classroom-based assessment research that has the potential to inform the ecological validity of cognitive load assessment. As one significant step in this direction, recent work has (a) articulated a multi-factor instructional framework—load reduction instruction (LRI)—aimed at reducing instructional cognitive load on learners, (b) developed a multi-factor survey instrument to assess this instructional framework, and (c) administered and assessed this instrument in the classroom context (Martin, 2016; Martin and Evans, 2018, 2019; Martin et al., 2020a).

Load Reduction Instruction (LRI) and Its Measurement

Harnessing key cognitive load theory principles, Martin (2016; see also Martin and Evans, 2018, 2019) proposed LRI as an instructional means to manage the cognitive burden students can experience as they learn. According to cognitive load theory, there is a need for instruction that accommodates the reality of the limits of working memory and helps students transfer knowledge between working and long-term memory (Paas et al., 2003; Sweller, 2004). A key means by which this can occur is by developing students' fluency and automaticity in skill and knowledge. This frees up working memory resources, reduces cognitive burden, and better enables students to transfer novel information into long-term memory (Rosenshine, 2009). Based on the Martin (2016) LRI framework, fluency and automaticity are developed through its first four principles: (principle #1) reducing the difficulty of instruction in the initial stages of learning, as appropriate to the learner's level of prior knowledge (see also Pollock et al., 2002; Mayer and Moreno, 2010); (principle #2) providing appropriate support and scaffolding to learn the relevant skill and knowledge (see also Renkl and Atkinson, 2010; Renkl, 2014); (principle #3) allowing sufficient opportunity for practice (see also Purdie and Ellis, 2005; Rosenshine, 2009; Nandagopal and Ericsson, 2012); and (principle #4) providing appropriate feedback-feedforward (a combination of corrective information and specific improvement-oriented guidance) as needed (see also Shute, 2008; Hattie, 2009; Mayer and Moreno, 2010). Through these four principles, students develop fluency and automaticity and are then well-positioned to apply their skill and knowledge more independently—including through novel tasks, higher order reasoning and thinking, problem solving, and guided discovery (Martin, 2016; Martin and Evans, 2019)—which is important to guard against potential expertise reversal effects (Kalyuga et al., 2012; Chen et al., 2017). This represents principle #5: guided independent learning.

Having articulated the five key principles of LRI (Martin, 2016), Martin and Evans (2018) developed a novel tool, the Load Reduction Instruction Scale (LRIS), administered to students to report on the instructional practices of their teacher. The LRIS comprises five factors to assess the five LRI principles (difficulty reduction, support and scaffolding, practice, feedback-feedforward, guided independence). Each factor is composed of five items (yielding a 25-item instrument). In the first empirical study of the LRIS, students in 40 high school mathematics classrooms completed the instrument (Martin and Evans, 2018). Findings revealed that the scores for each factor were reliable, that the factor structure was sound, and that there were significant bivariate correlations with intrinsic and extraneous cognitive load in the predicted directions. In a follow-up investigation that linked the Martin and Evans (2018) data with a previous survey, results showed that LRI was associated with gains in mathematics motivation, engagement, and achievement (Evans and Martin, Submitted). In another study, Martin et al. (2020a) introduced and explored a brief form of the LRIS (the LRIS-Short; LRIS-S) that was designed to capture a unidimensional latent factor in keeping with the higher order LRIS factor in the Martin and Evans (2018) research. Martin et al. (2020a) used this LRIS-S in a multilevel study of more than 180 science classrooms, finding that the link between LRI in science and science achievement (at student- and classroom-levels) was mediated by class participation, future aspirations, and enjoyment in science. However, all these studies (including their construct validity aspects) have been variable-centered. As is now discussed, to even better understand the nomological network of LRI, a construct validity approach from a multilevel person-centered perspective has much to offer.

Person-Centered and Multilevel Assessment

Person-Centered Assessment

As noted above, the bulk of research assessing cognitive load in students' academic lives (including LRI research) has been variable-centered. Variable-centered approaches aim to assess relations between factors across individuals (Collie et al., 2020). This has been important for conducting classic construct validity work with cognitive load factors and for identifying what cognitive load factors predict or are predicted by other factors (e.g., Leppink et al., 2013; Klepsch and Seufert, 2020). However, variable-centered approaches can mask important phenomena among subpopulations of the wider population. In contrast, person-centered research utilizes the factors in a study to identify distinct subpopulations (or profiles) of individuals (Bauer and Curran, 2004; Collie et al., 2015; Morin et al., 2017). For example, different subpopulations of students may reflect different patterns in how LRI manifests and relates to other educational factors in their classroom and academic experience. Thus, LRI may not function in the same way for all students. To the extent this is the case, it is important to assess LRI in ways that can capture distinct subpopulations of students, if such subpopulations exist. Doing so also has practical yield: it can identify specific student profiles that teachers can attend to in their pedagogy.

Person-centered analyses (in this investigation, latent profile analysis—LPA) are ideal for addressing these issues. Specifically, by revealing the way LRI co-occurs with other factors among subpopulations of students, person-centered assessment helps identify the different types of students that reside within the classroom and the distinct ways LRI manifests in these students' academic lives. As we describe below, we seek to further assess the construct validity of LRI from a person-centered perspective by including psychological challenge and threat orientation measures alongside LRI measures to better understand the nomological network of instructional-psychological profiles. LPA enables us to see whether there might be subpopulations of students with similar LRI levels, but who differ on psychological factors (and vice versa). As we describe below, it is theoretically plausible that students who have different psychological challenge and threat orientations will experience instructional load in different ways and our person-centered construct validity approach seeks to confirm this. The LPA represents the within-network validity aspect of our study by way of its assessment of a target network of instructional-psychological profiles.
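To make the person-centered logic concrete: an LPA with conditionally independent indicators is statistically equivalent to a diagonal-covariance Gaussian mixture model, with the number of profiles chosen by information criteria such as the BIC. Studies of this kind are typically fitted in specialized software (e.g., Mplus); the following is only an illustrative sketch using scikit-learn, with simulated data standing in for the five study indicators (all variable names here are hypothetical).

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical standardized scores for 500 students on the five
# indicators: LRI, self-efficacy, growth goals, anxiety, failure avoidance.
X = rng.normal(size=(500, 5))

# Fit 1- to 6-profile solutions. covariance_type="diag" imposes the
# usual LPA assumption of conditional independence of indicators
# within each profile; n_init guards against local maxima.
bics = {}
for k in range(1, 7):
    gm = GaussianMixture(n_components=k, covariance_type="diag",
                         n_init=10, random_state=0).fit(X)
    bics[k] = gm.bic(X)

# Retain the solution with the lowest BIC and assign each student
# to their most likely profile.
best_k = min(bics, key=bics.get)
best = GaussianMixture(n_components=best_k, covariance_type="diag",
                       n_init=10, random_state=0).fit(X)
profiles = best.predict(X)
print(best_k, np.bincount(profiles))
```

In the real analysis, the profile means on each indicator (here, `best.means_`) are what license substantive labels such as "Instructionally-Optimized & Psychologically-Self-Assured."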

Multilevel Assessment

It is also the case that cognitive load research has been conducted almost exclusively at the student-level, without appropriate regard for the classrooms to which students belong (however, LRI research is an exception—described below). There is now broad recognition of the importance of analyzing hierarchical data in appropriate ways (Marsh et al., 2012), especially when the variables of interest include references to class-wide phenomena, such as instruction (as LRI does). Single-level research designs are subject to statistical biases (e.g., within-group dependencies; confounding of variables within and between groups), and multilevel analyses seek to resolve such biases (see Raudenbush and Bryk, 2002; Goldstein, 2003; Marsh et al., 2009). Multilevel analysts have identified the reciprocity of group and individual dynamics: the group can affect the individuals and individuals can affect the group (Raudenbush and Bryk, 2002; Goldstein, 2003; Marsh et al., 2009). To date, most LRI research has recognized this and conducted multilevel analyses at student- and classroom-levels—indeed, demonstrating the validity of LRI at student- and classroom-levels (Martin and Evans, 2018; Evans and Martin, Submitted; Martin et al., 2020a). But, as noted, these studies have involved variable-centered analyses, which may mask differences between subpopulations of students and classrooms. The present study extends this research by conducting multilevel LPA that not only identifies student instructional-psychological profiles, but also classroom instructional-psychological profiles. Given LRI is about classroom instruction, its construct validity must be reflected in theoretically logical classroom profiles.
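The within-group dependency at issue can be quantified by the intraclass correlation (ICC), the share of total variance in an indicator attributable to classrooms; a non-trivial ICC is what motivates modeling the classroom level at all. As an illustration only (simulated data, not the study's), a one-way ANOVA estimator of ICC(1) can be sketched as follows:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Hypothetical data: 40 classrooms of 12 students, one indicator score.
class_ids = np.repeat(np.arange(40), 12)
class_effect = rng.normal(0, 0.5, size=40)[class_ids]  # between-class variation
score = class_effect + rng.normal(0, 1.0, size=class_ids.size)
df = pd.DataFrame({"classroom": class_ids, "score": score})

# One-way ANOVA estimator of ICC(1):
# ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), with k = cluster size.
k = df.groupby("classroom").size().mean()
msb = df.groupby("classroom")["score"].mean().var(ddof=1) * k
msw = df.groupby("classroom")["score"].var(ddof=1).mean()
icc = (msb - msw) / (msb + (k - 1) * msw)
print(round(icc, 3))
```

With the simulated variance components above (between-class SD 0.5, within-class SD 1.0), the estimate lands near the true value of 0.2, i.e., roughly a fifth of the variance lies between classrooms; single-level analyses of such data would understate standard errors.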

LRI and Accompanying Psychological Orientations

As described earlier, it is plausible that cognitive load will be perceived and experienced by students in different ways. Theories of cognitive appraisal (e.g., Lazarus and Folkman, 1984; for reviews, see Roseman and Smith, 2001; Moors et al., 2013) suggest that when presented with a task, an individual subjectively appraises its demands and their capacity to meet the demands. The individual perceives challenge when they believe they can meet the demands; the individual perceives threat when they believe they cannot meet the demands (see also Putwain and Symes, 2014, 2016, Uphill et al., 2019). Thus, when cognitive load is unacceptably high there may be a greater likelihood that anxiety and fear are present, and when cognitive load is effectively managed, more positive appraisals occur. At the same time, it is also the case that different students can appraise the same stimuli in different ways. For example, following cognitive appraisal theory (Lazarus and Folkman, 1984), some students will perceive cognitive load in an approach- and challenge-oriented way, while other students will perceive cognitive load in an avoidant- and threat-oriented way. Indeed, recent reviews of challenge and threat suggest that there may even be the dual presence of both challenge and threat (Uphill et al., 2019; see also Rogat and Linnenbrink-Garcia, 2019 for dual goals under approach-avoidance goal frameworks)—for example, in the event of cognitive load there may be students who perceive it as a challenge and opportunity for learning, but who also fear failing and see it as a potential threat. This idea had been previously raised by Covington (2000) and Martin and Marsh (2003) in the form of “overstrivers” who are students investing effort (in a challenge-like way), but driven in large part by a fear of poor performance. 
This brings into consideration the extent to which there exist student profiles based on different combinations of LRI and psychological challenge and threat orientations. LPA is designed to explore such possibilities and thus represents an important (and novel) means of assessing the study's contended network of instructional-psychological profiles (i.e., the within-network validity aspect of the study).

Consistent with Martin et al. (2021), recent challenge-threat frameworks (e.g., Putwain and Symes, 2014, 2016; Putwain et al., 2015; Uphill et al., 2019), and the latest approach-avoidance perspectives that have introduced growth goals (e.g., Elliot et al., 2011, 2015), we propose psychological challenge orientation can be inferred through students' self-efficacy and growth goals. Self-efficacy refers to a belief in one's capacity to meet or exceed a task demand or task challenge (Bandura, 1997; Schunk and DiBenedetto, 2014). This being the case, self-efficacy has been inferred as an analog of perceived challenge (Uphill et al., 2019) and operationalized as an indicator of perceived challenge in LPA (Martin et al., 2021). Perceived challenge has also been linked to approach orientations in motivational psychology. For example, Elliot defines approach motivation as the “energization of behavior by, or the direction of behavior toward, positive stimuli (objects, events, possibilities)” (2006, p. 112). According to Elliot, goals are a major means by which individuals' approach (and avoidance) orientations are manifested. Recent research has identified growth goals (i.e., self-improvement, or personal best goals) as one example of an approach motivation orientation (Martin and Liem, 2010; Elliot et al., 2011, 2015; Martin and Elliot, 2016a,b; Burns et al., 2018, 2019, 2020b). We recognize mastery goals are also approach-oriented, but it has previously been argued that growth goals represent a particularly ambitious and challenge-oriented goal striving (in keeping with our intent to capture challenge orientation) and are shown to explain variance in engagement beyond the effects of mastery goals (Yu and Martin, 2014; Martin and Elliot, 2016a). Growth goals are thus inferred as having an underlying psychological challenge dimension to them, along the lines of Uphill et al.'s (2019) review.

Also following Martin et al. (2021) and recent challenge-threat and approach-avoidance frameworks (e.g., Elliot et al., 2011, 2015; Putwain and Symes, 2014, 2016; Putwain et al., 2015; Uphill et al., 2019), we propose psychological threat orientation can be inferred through students' anxiety and failure avoidance goals. Anxiety reflects an individual's perception that a task demand exceeds their personal resources and capacity and poses a self-relevant threat to them in some way (Britton et al., 2011). Anxiety has thus been associated with threat appraisals (e.g., see Putwain et al., 2015, 2017; see also Uphill et al., 2019) and has been used as an indicator of perceived threat in LPA (Martin et al., 2021). Perceived threat has also been linked to avoidance orientations in motivational psychology. Elliot defines avoidance motivation as the “energization of behavior by, or the direction of behavior away from, negative stimuli (objects, events, possibilities)” (2006, p. 112). In goal theory research, avoidance goals are a salient example of avoidance motivation (Elliot, 2006; Van Yperen et al., 2015). Avoidance goals are those where the individual strives to avoid poor performance and failure or the implications of poor performance (Covington, 2000; Elliot, 2006; Martin, 2007, 2009), and being an element of an avoidance orientation, are suggested as analogs of an inherent psychological threat orientation, along the lines of Uphill et al.'s (2019) review.

To summarize, the present study assesses students' self-efficacy and growth goals (to infer challenge) and anxiety and failure avoidance goals (to infer threat) as key psychological orientations to include as indicators alongside LRI in LPA. Importantly, however, although we conceptually categorize these as challenge and threat indicators, they are modeled as separate indicators so that these conceptual groupings can be tested empirically. We also point out that this study is a multilevel one that assesses these issues at the classroom-level. It is feasible that psychological orientations of challenge and threat manifest at the classroom-level such that some classrooms are relatively higher or lower in these orientations. Indeed, motivation research has suggested that classrooms can vary in the extent to which they are challenge- and approach-oriented vs. threat- and avoidance-oriented (Lau and Nie, 2008). Furthermore, these classroom-level psychological orientations may co-vary with different levels of LRI in different ways. This being the case, the present study assesses classroom-level psychological orientations alongside classroom-level LRI to identify classroom-level instructional-psychological profiles.

Assessing Links Between Instructional-Psychological Profiles and Academic Outcomes

This research is conducted in the educational domain of science. Science is a challenging subject for many students (Coe et al., 2008) and there are worrying trends in students' science achievement and science participation (especially in “Western” nations). For example, in the Programme for International Student Assessment (PISA), the long-term change in the average science performance of Australia (the site of the present study) demonstrates one of the largest declines among participating nations (OECD., 2020). There are also long-term declines in science participation and enrolments among senior school students (Office of the Chief Scientist, 2014) as well as declining interest in science in high school (Tröbst et al., 2016). Inherent in these trends are three major concerns. First, students do not seem to be orienting toward science in a participatory and persistent way—Martin et al. (2012) referred to this as “switching on.” Second, there are unacceptable numbers of students disengaging from science—Martin et al. (2012) referred to this phenomenon as “switching off.” Third, science achievement is in decline in many nations. This being the case, it is important to: (a) initiate and foster more positive persistence (“switching on”) in science, (b) arrest students' disengagement (“switching off”) in science, and (c) boost students' science achievement. Therefore, we sought to explore the association between the derived network of student- and classroom-level instructional-psychological profiles and a network of student- and classroom-level outcomes in the form of science persistence, disengagement, and achievement. From a construct validation perspective (Marsh, 2002), this represents the between-network validity aspect of our investigation.

In variable-centered research, Martin et al. (2012) found that approach-oriented predictors such as self-efficacy were positively associated with “switching on” in mathematics (positive future intentions and engagement) and negatively associated with “switching off” in mathematics (disengagement). The inverse pattern of associations was found for avoidance-oriented predictors such as anxiety. Also, in variable-centered research, LRI is positively associated with achievement in mathematics (Martin and Evans, 2018). In recent person-centered research, Martin et al. (2021) found that approach (challenge)-oriented students had higher science test scores, while avoidance (threat)-oriented students had lower scores. Interestingly, and in keeping with the potential dual presence of challenge and threat (Uphill et al., 2019), Martin et al. (2021) also identified a third profile reflecting both approach (challenge) and avoidance (threat)—students in this profile scored midway between the former two profiles on the science test. Taking these recent student-level findings together, we tentatively suggest at least three student-level profiles that will be associated with our outcomes (persistence, disengagement, achievement) in a descending adaptive pattern: high approach-low avoidance-high LRI, high approach-high avoidance-high LRI, and low approach-high avoidance-low LRI. From a construct validity perspective, demonstrating associations in a predicted manner is important (Marsh et al., 2006). Regarding classroom-level findings, we believe there is not a sufficient evidence base to guide predictions and so this is a more exploratory aspect of the study.

Aims of the Present Study

We argue that to fully understand instructional cognitive load, it is important to operationalize and assess it in novel ways that can provide unique validity insights into how different students perceive and experience this load. We further suggest it is important to consider these novel assessment approaches using appropriate cutting-edge analytic models. Accordingly, we adopted a within- and between-network construct validity approach and used multilevel LPA to identify instructional-psychological profiles among students and classrooms based on students' reports of instructional cognitive load (via load reduction instruction; LRI) and their accompanying psychological challenge orientations (self-efficacy, growth goals) and psychological threat orientations (anxiety, failure avoidance goals). In phase 1 of analyses, we sought to identify a network of instructional-psychological profiles among students (student-level within-network validity). In phase 1, we also tested the links between the derived profile network and a network of three academic outcomes (persistence, disengagement, achievement) (student-level between-network validity). In phase 2 of analysis, we extended our examination to the classroom-level where we sought to identify the network of classroom profiles based on the relative frequency of student profiles identified in phase 1 (classroom-level within-network validity). We also tested whether the derived network of different classroom profiles was associated with different levels of classroom-average persistence, disengagement, and achievement (classroom-level between-network validity). Figure 1 demonstrates the model and parameters under investigation.

Method

Participants and Procedure

Participants comprised 2,071 Australian high school students from 188 science classrooms in eight schools. The schools were independent (non-systemic) schools, in a major capital city on the east coast of Australia. Four of these schools were single-sex boys' schools and four were single-sex girls' schools. Just over half (60%) of the sample comprised girls. Participants were in Year 7 (29%), Year 8 (22%), Year 9 (24%), and Year 10 (25%). The average age was 14.02 years (SD = 1.27 years). Just under one in ten (8%) students spoke a language other than English at home. Socioeconomic status (SES) varied (range 846–1,181, M = 1,138, SD = 41, on the Australian Bureau of Statistics Index of Relative Socio-Economic Advantage and Disadvantage classification), but in aggregate was a higher SES than the Australian mean of 1,000. On average, classrooms had about eleven students (adequate for group-level effect estimation and not disproportionate to the ratio of staff to students in the independent school sector when accounting for student absences, non-participation, and students who had not received participation consent from their parents; also see Australian Bureau of Statistics, 2019). The lead researcher's university provided human ethics approval for the study. Approval was then provided by school principals for their school to participate. Then, participating students (and their parents/carers) provided consent. The online survey (which also comprised an achievement test) was administered during a regularly scheduled science class in 2018. Students completed the instrument on their own. Teachers were informed they could help students with procedural aspects of the survey, but not provide assistance in answering specific questions. The data in this investigation are shared with Martin et al.
(2020a), who conducted a variable-centered study exploring the extent to which class participation, educational aspirations, and enjoyment of school mediated the relationship between LRI and achievement.

Materials

Indicators for the network of instructional-psychological profiles were measured by way of instructional cognitive load (load reduction instruction), psychological challenge orientation (self-efficacy, growth goals), and psychological threat orientation (anxiety, failure avoidance goals). These indicators were the within-network validity variables for this study. The network of outcome measures was assessed by way of persistence, disengagement, and achievement. These were the between-network validity variables for the study.

Instructional Cognitive Load via Load Reduction Instruction Scale—Short (LRIS-S)

As described in Martin et al. (2020a), the LRIS-S was developed to measure student perceptions of their teacher's use of instructional strategies known to reduce extraneous cognitive load (and, because of this, some intrinsic cognitive load). In the LRIS-S, each of the five LRI factors is represented by a single item (the full LRIS has 5 items for each of the 5 factors; Martin and Evans, 2018). As detailed in Martin et al. (2020a), the factors and items (adapted to science) are: difficulty reduction (“When we learn new things in this science class, the teacher makes it easy at first”); support (“In this science class, the teacher is available for help when we need it”); practice (“In this science class, the teacher makes sure we practice important things we learn”); feedback-feedforward (“In this science class, the teacher provides frequent feedback that helps us learn”); and independence (“Once we know what we're doing in this science class, the teacher gives us a chance to work independently”). Responses were provided on a seven-point scale (1 = strongly disagree to 7 = strongly agree). Reliability for this scale was sound (see Table 1) and ICC = 0.16. Because the LRIS emphasizes the reduction of cognitive load, Martin and Evans (2018) conducted analyses showing that the scale was significantly negatively associated with intrinsic load (load referring to task difficulty and complexity) and extraneous load (load referring to difficulty and complexity of instruction; Chandler and Sweller, 1991). They concluded that the LRIS does assess aspects of instruction impacting distinct elements of cognitive load. Martin et al. (2020a) showed that the reliability of the LRIS-S at student- and classroom-levels was high, and their doubly-latent multi-level (student and classroom) factor analysis demonstrated sound fit and yielded strong factor loadings. Thus, we suggest that the student-reported LRIS-S offers valid insights into the instructional practices relevant to cognitive load.


Table 1. Descriptive statistics.

Psychological Challenge Orientation

Psychological challenge orientation was assessed via self-efficacy and growth goals. Self-efficacy (4 items; e.g., “If I try hard, I believe I can do well in this science class”) was measured via the domain-specific form of the Motivation and Engagement Scale—High School (MES-HS; Martin, 2007, 2009), validated by Green et al. (2007). Growth goals (4 items; e.g., “When I do my science schoolwork I try to do it better than I've done before”) were measured via the domain-specific form of the Personal Best Goal Scale (Martin and Liem, 2010), for which Martin and Elliot (2016a) provided evidence of validity. Responses were provided on a seven-point scale (1 = strongly disagree to 7 = strongly agree). Reliability for the scores of both scales was sound (see Table 1) and ICCs = 0.08 and 0.12 for self-efficacy and growth goals, respectively.

Psychological Threat Orientation

Psychological threat orientation was assessed via anxiety (4 items; e.g., “When exams and assignments are coming up for this science class, I worry a lot”) and failure avoidance goals (4 items; e.g., “Often the main reason I work in this science class is because I don't want people to think that I'm dumb”). Both were from the domain-specific form of the MES-HS (Martin, 2007, 2009), for which Green et al. (2007) provided evidence of validity. Responses were provided on a seven-point scale (1 = strongly disagree to 7 = strongly agree). Reliability for the scores of both scales was sound (see Table 1) and ICCs = 0.08 and 0.05 for anxiety and failure avoidance goals, respectively.

Persistence and Disengagement

In line with Martin et al. (2012), engagement was assessed via the dual dimensions of “switching on” and “switching off.” Switching on was operationalized through persistence (4 items; e.g., “If I can't understand something in this science class at first, I keep going over it until I do”). Switching off was operationalized through disengagement (4 items; e.g., “Each week I'm trying less and less in this science class”). Both were from the domain-specific form of the MES-HS (Martin, 2007, 2009), validated by Green et al. (2007). Responses were provided on a seven-point scale (1 = strongly disagree to 7 = strongly agree). Reliability for scores of both scales was sound (see Table 1) and ICCs = 0.12 and 0.17 for persistence and disengagement, respectively.

Achievement

Achievement was measured using 12 questions in an online test (part of the online survey). Instrument piloting and development are fully described in Martin et al. (2020a). The test aligned with the science syllabus applicable to our sample; therefore, two forms were developed, one based on the Stage 4 (years 7 and 8) state science syllabus and the other based on the Stage 5 (years 9 and 10) state science syllabus. Test questions were set within the contexts of Physical World, Earth and Space, Living World, and Chemical World and addressed the following skills: questioning and predicting, planning investigations, conducting investigations, processing and analyzing data and information, and problem solving. Each question was grounded within one of the abovementioned specific science contexts and there was an ~30/70 ratio of content-focused to skill-focused questions, with the easier questions focusing on content and the harder questions focusing on skill application. All multiple-choice test responses were recoded as dichotomous (0 = incorrect; 1 = correct). The correct answers were summed to a total achievement score (thus, a continuous scale), reflecting something of a formative construct. Achievement scores were then standardized by year level (M = 0; SD = 1). The test was reliable, as shown in Table 1 and ICC = 0.37.
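As a minimal sketch of the scoring and standardization steps described above, the following uses made-up item responses (the actual test data are not reproduced here): responses are scored dichotomously, summed to a total, and standardized within year level.

```python
import numpy as np
import pandas as pd

# Hypothetical data for illustration: 6 students x 12 dichotomously
# scored multiple-choice items (0 = incorrect, 1 = correct).
answers = np.zeros((6, 12), dtype=int)
n_correct = [3, 9, 5, 7, 2, 10]  # made-up raw scores, not study data
for i, k in enumerate(n_correct):
    answers[i, :k] = 1

df = pd.DataFrame(answers, columns=[f"q{j+1}" for j in range(12)])
df["year"] = [7, 7, 8, 8, 9, 9]

# Sum correct answers into a total achievement score (a continuous scale).
df["total"] = df.filter(like="q").sum(axis=1)

# Standardize within year level (M = 0, SD = 1), as done in the article.
df["z"] = df.groupby("year")["total"].transform(
    lambda s: (s - s.mean()) / s.std(ddof=0))
```

Standardizing within year level expresses each score relative to same-year peers, which is one way the two test forms (Stage 4 and Stage 5) can be placed on a common scale for analysis.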

Data Analysis

Analyses were conducted using Mplus 8.60 (Muthén and Muthén, 2017). The robust maximum likelihood (MLR) estimator was used in all models. Missing data were addressed using Full Information Maximum Likelihood (FIML; Arbuckle, 1996). Confirmatory factor analysis (CFA) was run at the student-level (and corrected for nesting within classrooms via the Mplus “COMPLEX” command) using the standardized factor approach to identification to obtain student-level factor scores for the five profile indicators and the three outcomes. The CFA also comprised background attributes as auxiliary variables—reported in analyses in Supplementary Materials. Factor scores were saved and used in the LPAs. The LPA analyses comprised two phases: single-level LPA (phase 1) and multi-level LPA (phase 2).
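For readers less familiar with factor scores, the sketch below shows the regression (Thomson) method of computing a score on a single latent factor from standardized items. The study itself obtained its factor scores from the CFA estimated in Mplus; the loadings and the student's responses below are purely hypothetical.

```python
import numpy as np

# Hypothetical standardized loadings for a 4-item, 1-factor model.
lam = np.array([0.75, 0.80, 0.70, 0.85])
theta = 1 - lam**2                           # residual (uniqueness) variances
Sigma = np.outer(lam, lam) + np.diag(theta)  # model-implied covariance matrix

# Regression-method factor score weights: w = Sigma^{-1} * lambda.
w = np.linalg.solve(Sigma, lam)

# One hypothetical student's standardized item responses.
x = np.array([0.5, 1.2, -0.3, 0.9])
fscore = w @ x  # that student's estimated factor score
```

The resulting scores are weighted composites in which better-measured items (higher loadings, smaller residual variances) count more, which is what distinguishes factor scores from simple scale means.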

Single-Level LPA

For the single-level LPA conducted at the student-level (Level 1; L1), we tested a range of solutions involving 1 through 9 profiles. Following Collie et al. (2020), variances and means were free to differ across profiles and indicator variables; models were estimated using at least 10,000 random start values, with 100 iterations and 1,000 final stage optimizations; and we confirmed that the best log-likelihood value was replicated for each model. Numerous indices were used to assess model fit: for the Consistent Akaike Information Criterion (CAIC), Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), and sample-size-adjusted Bayesian Information Criterion (SSA-BIC), smaller values reflect better fit. We created elbow plots of the CAIC, AIC, BIC, and SSA-BIC indices. In these plots, the point at which the slope noticeably flattens is another indicator of an appropriate solution (Morin et al., 2016). The p-value of the adjusted Lo–Mendell–Rubin Likelihood Ratio Test (pLMR) enabled comparison of a k-profile model against a k-1 profile model to determine whether the former yielded a better fit than the latter. We also provide entropy values in Table 2 as an indicator of classification accuracy. In addition to fit indices and where appropriate, we applied rules of parsimony, conceptual relevance, and statistical adequacy to further ascertain the optimal solution. After identifying the final network of profiles (the student-level within-network validity aspect), we then examined the network of academic outcomes (persistence, disengagement, achievement) as a function of profile membership (the student-level between-network validity aspect), controlling for background attributes. Outcomes were included using the direct approach and compared across profiles using the Mplus “MODEL CONSTRAINT” option, which relies on the multivariate delta method for tests of statistical significance (e.g., Raykov and Marcoulides, 2004).
As part of this, the outcomes were also regressed on participants' background attributes, which acted as covariate controls (McLarnon and O'Neill, 2018).
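The enumeration logic described above (fit k = 1, 2, … solutions; compare information criteria, where smaller is better; inspect entropy) can be approximated outside Mplus. The sketch below uses scikit-learn's GaussianMixture with diagonal, component-specific covariances as a rough stand-in for an LPA with profile-specific variances; the data, two-group structure, and the 1–4 range are simulated assumptions for illustration only.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# Simulated factor scores on five profile indicators (e.g., LRI, self-
# efficacy, growth goals, anxiety, failure avoidance) for two latent groups.
X = np.vstack([rng.normal(-1.0, 0.8, size=(300, 5)),
               rng.normal(1.0, 0.8, size=(300, 5))])
n, d = X.shape

results = {}
for k in range(1, 5):  # the study compared 1- through 9-profile solutions
    gm = GaussianMixture(n_components=k, covariance_type="diag",
                         n_init=20, random_state=0).fit(X)
    ll = gm.score(X) * n             # total log-likelihood
    p = 2 * k * d + (k - 1)          # means + variances + mixing weights
    results[k] = {
        "aic": -2 * ll + 2 * p,
        "bic": -2 * ll + p * np.log(n),
        "caic": -2 * ll + p * (np.log(n) + 1),         # consistent AIC
        "ssabic": -2 * ll + p * np.log((n + 2) / 24),  # sample-size-adjusted
    }
    if k > 1:  # relative entropy: classification accuracy (1 = perfect)
        post = gm.predict_proba(X)
        results[k]["entropy"] = 1 + (post * np.log(np.clip(post, 1e-12, 1))
                                     ).sum() / (n * np.log(k))
```

With clearly separated simulated groups, the information criteria drop sharply from the 1- to the 2-profile solution and entropy is high; in real data the criteria often keep declining, which is why the elbow-plot and substantive-interpretability criteria described above are needed.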


Table 2. Single-level LPA fit statistics.

Multi-Level LPA

In phase 2, we extended the student-level (Level 1, L1) findings to determine the extent to which classroom-level (Level 2; L2) profiles could be identified; or, put another way, the extent to which we could identify classroom profiles characterized by distinct combinations of the different student profiles. Thus, phase 2 identified classroom profiles based on the relative frequency of the various L1 latent profiles. To maintain the stability of the previously identified student-level profiles (L1), we used the manual 3-step approach detailed by Litalien et al. (2019; also see Vermunt, 2010; Morin and Litalien, 2017). Multi-level LPA solutions (1–9 classroom-level profiles) were assessed. Following Collie et al. (2020), each model was estimated using at least 10,000 random start values, 100 iterations, and 1,000 final stage optimizations; replication of the best log-likelihood value was sought for each model; and the best model was selected using the same criteria as the single-level LPA (phase 1), except the pLMR, which is not available for multi-level LPA. After identifying the network of classroom-level profiles (the classroom-level within-network validity aspect), we then examined the network of L2 outcomes (classroom-average persistence, disengagement, and achievement) as a function of profile membership by adding classroom-average outcome variables (using the Mplus cluster mean approach) to the final best-fitting model (the classroom-level between-network validity aspect). Outcomes were compared across profiles using the Mplus “MODEL CONSTRAINT” option (e.g., Raykov and Marcoulides, 2004).
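As a conceptual illustration only (not the manual 3-step Mplus procedure cited above), the core idea of phase 2 can be sketched as follows: characterize each classroom by the relative frequency of the student profiles it contains, then cluster classrooms on those frequency vectors. The profile mixes, sample sizes, and use of k-means here are all simplifying assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Simulate two kinds of classrooms with different mixes of the five
# student profiles (proportions are hypothetical, not study estimates).
mix_a = np.array([0.05, 0.10, 0.35, 0.15, 0.35])  # "thriving-like" mix
mix_b = np.array([0.20, 0.50, 0.10, 0.02, 0.18])  # "struggling-like" mix
freq = np.vstack([rng.dirichlet(mix_a * 40, size=30),
                  rng.dirichlet(mix_b * 40, size=30)])  # 60 classrooms x 5

# Cluster classrooms on their student-profile frequency vectors.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(freq)
labels = km.labels_
```

The manual 3-step approach differs from this sketch in important ways (it carries forward classification uncertainty and holds the student-level profiles fixed), but the input to the classroom-level model is the same in spirit: within-classroom composition of student profiles.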

Results

Preliminary Analyses

Table 1 shows reliability coefficients and descriptive statistics for the profile indicators and the outcome variables in the study. These data indicate acceptable reliability. The CFA used to generate factor scores yielded an excellent fit to the data, χ2 = 1,321, df = 378, p < 0.001, CFI = 0.964, TLI = 0.958, RMSEA = 0.035. Indeed, these preliminary variable-centered analyses demonstrate sound within-network validity properties (Marsh, 2002). The resulting correlation matrix is presented in Supplementary Table 1. These factor scores were then used in the subsequent LPAs.

Single-Level LPA

The fit statistics for the 1–9-profile solutions are shown in Table 2 and the elbow plot is shown in Supplementary Figure 1. In these it is evident that the CAIC, AIC, BIC, and SSA-BIC decline with each additional profile. There appear to be slight inflection points around 4, 5, and 6 profiles. Although we do not rely on the pLMR, it is interesting to note it supported the 6-profile solution, but it was significant at the p < 0.01 level, whereas the 4–5-profile solutions were significant at p < 0.001. In addition, although not relying on minimum profile size as a decision criterion, we note that the 6-profile solution had a minimum profile size of <2%, whereas the 5-profile solution had a size of 8%. Taken together, we felt that additional profiles were theoretically useful and well-differentiated up to 6 profiles, but the sixth profile presented a shape that was qualitatively similar (even if it differed quantitatively) to that of the 5-profile solution. We therefore proceeded with the 5-profile solution. A graphical representation of this 5-profile solution is presented in Figure 2. Students corresponding to profile 1 (8% of students) reported very low LRI, very low self-efficacy, very low growth goals, neutral anxiety, and neutral failure avoidance goals. This profile was labeled Instructionally-Overburdened & Psychologically-Resigned to reflect very high instructional cognitive load and very low challenge orientation; indeed, so much so that they are not particularly threatened, but rather resigned. Students corresponding to profile 2 (30% of students) reported low LRI, low self-efficacy, low growth goals, high anxiety, and high failure avoidance goals. This profile was labeled Instructionally-Burdened & Psychologically-Fearful to reflect elevated instructional load, low challenge orientation, and high threat orientation.
Students corresponding to profile 3 (31% of students) reported above average LRI, above average self-efficacy and growth goals, and below average anxiety and failure avoidance goals. We labeled this profile Instructionally-Supported & Psychologically-Composed to reflect low instructional load, above average challenge orientation, and below average threat orientation. Students corresponding to profile 4 (9% of students) reported very high LRI, very high self-efficacy, very high growth goals, very low anxiety, and very low failure avoidance goals. We labeled this profile Instructionally-Optimized & Psychologically-Self-Assured to reflect the very low instructional cognitive load, the very high challenge orientation, and the very low threat orientation. Students corresponding to profile 5 (22% of students) reported above average scores on each of LRI, self-efficacy, growth goals, anxiety, and failure avoidance goals. We labeled this profile Instructionally-Supported & Psychologically-Pressured to reflect low instructional load as well as dual high challenge and threat orientations.


Figure 2. Single-level LPA results: Instructional-psychological profile names, profile means, and % of sample.

We then assessed differences between profiles in persistence, disengagement, and achievement (adjusted for background attribute covariates—see Supplementary Tables 2A–E for the predictive relationships between background attributes and the latent profiles). Mean scores are shown in Table 3. For persistence, findings indicated that each profile was significantly different from the others. In ascending order of persistence were: Instructionally-Overburdened & Psychologically-Resigned (lowest persistence; M = −1.571), then Instructionally-Burdened & Psychologically-Fearful (M = −0.375), then Instructionally-Supported & Psychologically-Composed (M = 0.559), then Instructionally-Supported & Psychologically-Pressured (M = 0.857), then Instructionally-Optimized & Psychologically-Self-Assured (highest persistence; M = 1.407).


Table 3. Means (SEs and 95% CIs) on dependent variables for each instructional-psychological profile.

For disengagement, findings indicated that with one exception (Instructionally-Supported & Psychologically-Composed = Instructionally-Supported & Psychologically-Pressured), each profile was significantly different from the others. In descending order of disengagement were: Instructionally-Overburdened & Psychologically-Resigned (highest disengagement; M = 1.660), then Instructionally-Burdened & Psychologically-Fearful (M = 0.436), then Instructionally-Supported & Psychologically-Composed and also Instructionally-Supported & Psychologically-Pressured (M = −0.683 and M = −0.743, respectively), then Instructionally-Optimized & Psychologically-Self-Assured (lowest disengagement; M = −1.231).

For achievement, findings indicated that with one exception (Instructionally-Overburdened & Psychologically-Resigned = Instructionally-Burdened & Psychologically-Fearful), each profile was significantly different from the others. In ascending order of achievement were: Instructionally-Overburdened & Psychologically-Resigned and also Instructionally-Burdened & Psychologically-Fearful (lowest achievement; M = −0.590 and M = −0.409, respectively), then Instructionally-Supported & Psychologically-Pressured (M = 0.054), then Instructionally-Supported & Psychologically-Composed (M = 0.089), then Instructionally-Optimized & Psychologically-Self-Assured (highest achievement; M = 0.430).

Multi-Level LPA

The fit statistics for the multi-level LPA solutions are reported in Table 4 (the elbow plot is shown in Supplementary Figure 2). Here 1–9-profile solutions are presented. The 2-profile solution consistently yielded the lowest values on the fit indices, but there was some further flattening on other indices at the third profile. Also, the 3-profile solution yielded a group that separated classrooms in qualitatively distinct ways beyond what was possible in a 2-profile solution, which (as described below) could not differentiate a Striving profile from the Thriving and Struggling profiles. Moreover, this additional profile constituted a sizeable subpopulation (22%). Morin et al. (2017) emphasize the importance of ensuring that each profile adds conceptually and practically meaningful information to a solution. Thus, while recognizing that aspects of fit suggest a 2-profile solution, we concluded there was substantive and practical yield in the additional profile. Accordingly, a solution with 3 classroom-level profiles was selected as the final solution.


Table 4. Multi-level LPA fit statistics.

A graphical representation of this final 3-profile solution is presented in Figure 3. Examination of this 3-profile solution suggested the presence of one Struggling classroom profile (22% of the classrooms), one Striving classroom profile (36% of the classrooms), and one Thriving classroom profile (42% of the classrooms). The Struggling classroom had the highest proportion of students from the Instructionally-Overburdened & Psychologically-Resigned (19%) and Instructionally-Burdened & Psychologically-Fearful (49%) profiles. The Striving classroom included a high proportion of students from the Instructionally-Supported & Psychologically-Pressured (31%), Instructionally-Burdened & Psychologically-Fearful (33%), and Instructionally-Overburdened & Psychologically-Resigned (10%) profiles. The Thriving classroom had the highest proportion of students from the Instructionally-Optimized & Psychologically-Self-Assured (29%) profile, along with a high proportion of students from the Instructionally-Supported & Psychologically-Composed (14%) and Instructionally-Supported & Psychologically-Pressured (38%) profiles.


Figure 3. Multi-level LPA results showing the classroom-level instructional-psychological profiles.

We then assessed differences between classroom profiles in classroom-average persistence, disengagement, and achievement. Results are shown in Table 5. For classroom-average persistence, each classroom profile was significantly different from the others. In ascending order of classroom-average persistence were: the Struggling classroom (lowest persistence; M = −0.610), then the Striving classroom (M = −0.068), then the Thriving classroom (highest persistence; M = 0.422). For classroom-average disengagement, each classroom profile was significantly different from the others. In descending order of classroom-average disengagement were: the Struggling classroom (highest disengagement; M = 0.705), then the Striving classroom (M = 0.048), then the Thriving classroom (lowest disengagement; M = −0.427). For classroom-average achievement, with one exception (Striving = Thriving), each classroom profile was significantly different from the others. In ascending order of classroom-average achievement were: the Struggling classroom (lowest achievement; M = −0.498), then the Striving classroom (M = −0.032), then the Thriving classroom (highest achievement; M = 0.182)—but as noted, the latter two classroom profiles were not significantly different from each other in achievement (see Table 5).


Table 5. Means (SEs and 95% CIs) on dependent variables for each classroom-level instructional-psychological profile.

Discussion

To best understand instructional cognitive load, we have emphasized the importance of assessing it in novel ways to reveal how different students perceive and experience this load. We have further emphasized the importance of utilizing cutting-edge analytic approaches that are appropriate to assessing these novel instrumentations. Integrating cognitive load theory and cognitive appraisal theory, we hypothesized that some students are likely to perceive cognitive load in an approach- and challenge-oriented way, and other students are likely to perceive cognitive load in an avoidant- and threat-oriented way. To the extent this is the case, we suggested that to further understand instructional cognitive load (by way of load reduction instruction; LRI) it is important to do so by also assessing students' accompanying psychological challenge and threat orientations. Adopting a novel person-centered construct validity perspective, we used latent profile analysis (LPA) to identify the network of instructional-psychological profiles based on students' reports of instructional load (LRI) and their accompanying psychological challenge orientations (self-efficacy and growth goals) and psychological threat orientations (anxiety and failure avoidance goals)—student-level within-network validity. Moreover, because students in our study were nested within (science) classrooms, we expanded our analyses to also conduct multilevel LPA to identify a network of student- and classroom-level instructional-psychological profiles—classroom-level within-network validity. We assessed student- and classroom-level between-network validity by investigating associations between the network of derived profiles and the network of student- and classroom-level persistence, disengagement, and achievement outcome variables.

Summary of Findings

At the student-level, we identified five instructional-psychological profiles that represented different presentations of instructional cognitive load, challenge orientation, and threat orientation: Instructionally-Overburdened & Psychologically-Resigned students (8% of students), Instructionally-Burdened & Psychologically-Fearful students (30%), Instructionally-Supported & Psychologically-Composed students (31%), Instructionally-Optimized & Psychologically-Self-Assured students (9%), and Instructionally-Supported & Psychologically-Pressured students (22%). As we describe below, these conform to established theoretical perspectives and thus offer a student-level within-network validation perspective on the nomological network of instructional cognitive load in terms of underlying instructional-psychological orientations. We also demonstrated student-level between-network validity in that these profiles were significantly different in persistence, disengagement, and achievement (beyond the role of background attributes)—with the Instructionally-Overburdened & Psychologically-Resigned profile reflecting the most maladaptive outcomes and the Instructionally-Optimized & Psychologically-Self-Assured profile reflecting the most adaptive outcomes. In multilevel LPAs, we identified three instructional-psychological profiles among classrooms that varied in terms of instructional cognitive load, challenge orientations, and threat orientations: Striving classrooms (36% of the classrooms), Thriving classrooms (42%), and Struggling classrooms (22%). In terms of classroom-level between-network validity, we found that classroom profiles were significantly different in their levels of persistence, disengagement, and achievement—with Struggling classrooms reflecting the most maladaptive outcomes and Thriving classrooms reflecting the most adaptive outcomes, but, notably, equal to the Striving classrooms in achievement.

Findings of Particular Note

In numerous ways this study offers novel contributions to the assessment of instructional cognitive load, including: its person-centered perspective elucidating theoretically plausible student profiles based on their experience of cognitive load and their psychological orientations, the multilevel validity of the Load Reduction Instruction Scale-Short (LRIS-S) in person-centered analyses, and the validity of the links between profiles and academic outcomes. We suggest that findings hold implications for better assessing and understanding students and classrooms in terms of the cognitive load they experience through instruction. Specifically, the results show that assessing instructional load in the context of students' accompanying psychological orientations can reveal unique insights about students' learning experiences and about important differences between classrooms in terms of the instructional load that is present.

The findings supported one of the central premises of this study—namely, that similar levels of perceived instructional load can be accompanied by different levels of perceived challenge and threat. For example, at the student-level we identified two profiles that can be considered instructionally-supported but who varied in their accompanying psychological orientations. Specifically, the Instructionally-Supported & Psychologically-Composed profile experienced moderate levels of LRI, moderate challenge orientation and low threat orientation, whereas the Instructionally-Supported & Psychologically-Pressured profile experienced moderate LRI and challenge orientation but also moderate levels of threat orientation. These two profiles were also significantly different in persistence and achievement outcomes (but not disengagement), with the former profile scoring higher than the latter profile. This is notable because it shows that students with similar levels of instructional load can have different psychological experiences (i.e., differing levels of challenge and threat) that yield significant differences in academic outcomes. This underscores the yield of assessing instructional load in the context of other potentially influential accompanying factors. This requires assessment and analytic approaches that can disentangle students who perceive similar levels of instructional load but who vary on other factors (in our study, psychological challenge and threat orientations).

The Instructionally-Supported & Psychologically-Pressured profile was further illuminating in that it confirmed the existence of the contended dual challenge-threat orientation (or, approach-avoidance motive). As noted earlier, recent reviews of challenge-threat orientations have suggested the dual presence of both challenge and threat among some individuals (Uphill et al., 2019; see also Rogat and Linnenbrink-Garcia, 2019 for dual goals under approach-avoidance goal frameworks). In the case of our study, in the presence of instructional load there were some students who also reported dual challenge and threat orientations—that is, they believed they are up to the challenge of task burden but are also fearful of failure or poor performance, somewhat akin to the “overstrivers” described earlier (Covington, 2000; Martin and Marsh, 2003). Essentially, in the context of instructional load they perceive both an opportunity to succeed and a risk they may fail. Accordingly, we identified these students as Psychologically-Pressured because even though they reflected a challenge orientation, there was an accompanying fear and avoidance (threat) inclination. Moreover, despite their threat orientation, the presence of a concomitant challenge orientation meant they experienced higher academic outcomes relative to the Instructionally-Overburdened & Psychologically-Resigned students and the Instructionally-Burdened & Psychologically-Fearful students. Nonetheless, the dual presence of challenge-threat orientations experienced by the Psychologically-Pressured profile represented a tension that we contend held them back from the more optimal academic outcomes seen in the Psychologically-Composed and Psychologically-Self-Assured profiles; this aligns with recent research that similarly demonstrates that the benefits of challenge orientation can be thwarted when there are similarly high rates of threat (Burns et al., 2020a).

Another interesting finding was that the highest instructional cognitive load (i.e., the lowest LRI scores) was not accompanied by the highest levels of threat orientation. Specifically, the Instructionally-Overburdened & Psychologically-Resigned students reflected lower levels of anxiety and failure avoidance goals than the Instructionally-Burdened & Psychologically-Fearful students and the Instructionally-Supported & Psychologically-Pressured students. It seems that in conditions where the instructional load is most poorly managed (evidenced by the lowest LRI scores), students may abandon any investment in the lesson. According to self-worth theory (Covington, 2000), when students abandon motivationally aversive conditions there can be an alleviation of anxiety and fear as their competence and academic self-worth are no longer “on the line” and under threat. Importantly, however, as they abandon their investment in cognitively burdensome instruction, their academic outcomes also decline—as evidenced by their significantly lower levels of persistence and significantly higher levels of disengagement.

Interestingly, the Instructionally-Overburdened & Psychologically-Resigned students and the Instructionally-Burdened & Psychologically-Fearful students were not significantly different in achievement. Even though the latter profile did not experience the same depth of instructional burden, this did not yield an achievement advantage for them. Here we again point out the importance of assessing accompanying challenge and threat orientations to understand potentially counter-intuitive effects of instruction on achievement. The former profile was Psychologically-Resigned, whereas the latter was Psychologically-Fearful. Again drawing on self-worth theory (Covington, 2000), when students abandon investment in a task demand there can be a concomitant alleviation of anxiety and fear (discussed above) that may place their performance on a par with students who are still invested in the task demand but who are highly anxious and fearful. This is yet another example of how dually assessing instructional cognitive load and psychological orientations can help us better understand instructional effects: assessing concomitant challenge and threat orientations allowed us to understand why two profiles that differ in instructional load are similar in achievement.

Another novel contribution of this study involved the multilevel analyses that enabled us to identify distinct types of classrooms, differentiated in terms of their instructional load (LRI) and accompanying challenge and threat orientations. Here we unearthed three classroom profiles: Struggling, Striving, and Thriving classrooms. The Struggling classrooms were dominated by students experiencing significant instructional cognitive (over)load and psychological detachment or fear. In contrast, the Thriving classrooms had almost no students who were cognitively (over)loaded and a majority of students with adaptive challenge orientations. These two classroom profiles may be considered somewhat predictable from a binary perspective, but the third classroom profile (the Striving classroom) was more nuanced and represents both cautionary and aspirational possibilities: cautionary in the sense that, if not instructionally and psychologically supported, these Striving classrooms risk devolving into Struggling classrooms; aspirational in that, if they are better supported, they can rise to Thriving classrooms. Where the Striving and Thriving classrooms seemed to differ most was in the share of Psychologically-Self-Assured and Psychologically-Composed students (43% in Thriving classrooms; 26% in Striving classrooms). The implication is that educators would do well to shift students "up" from the Psychologically-Pressured profile to the Psychologically-Self-Assured and Psychologically-Composed profiles. How they can do this is the focus of the discussion that follows.

Implications for Instructional Assessment, Evaluation, and Practice

The findings of this investigation hold implications for instructional assessment, evaluation, and practice. For instructional assessment and evaluation, the study has further demonstrated the validity of instrumentation that enables students to report on the extent to which instruction manages the cognitive load placed on them as they learn. The Load Reduction Instruction Scale (LRIS; and its brief form, the Load Reduction Instruction Scale-Short, LRIS-S) is a student-report tool purposefully developed for in-class assessment of LRI. To date the LRIS has been usefully employed in variable-centered research, and the present study has now revealed its utility in person-centered analyses. Furthermore, because the LRIS is completed in class, if enough classrooms are present in a study (as there were here), it can be used in multilevel analyses to gain a sense of LRI at the whole-class level. We therefore encourage the use of tools that enable in-class assessment of load-reducing instruction by students. Indeed, as Martin and Evans (2018) suggested, the LRIS may also be adapted to have teachers reflect on and attend to their own instructional practice.

Also on the matter of instructional assessment and evaluation, person-centered analyses enabled insights into how different subpopulations of students may be similar in LRI but differ in their accompanying challenge-threat orientations, and how students may differ in LRI but be similar in challenge-threat orientations. We therefore recommend that more studies assess instructional cognitive load using person-centered approaches in order to elucidate important (and sometimes quite nuanced) subpopulations of students that would otherwise be masked in variable-centered research. This will require administering instrumentation that can assess accompanying aspects of the learner. We did so via measures of inferred challenge orientation (self-efficacy, growth goals) and threat orientation (anxiety, failure avoidance goals). However, there are other indicators of challenge and threat orientations, such as affective dimensions reflecting perceived challenge-threat (e.g., enjoyment, hope, frustration, depression, anger, boredom; Pekrun, 2006).
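As a concrete illustration of the person-centered approach recommended above, the sketch below enumerates latent profiles on simulated student scores. A latent profile model that assumes within-profile independence is statistically a Gaussian mixture with diagonal covariances, so scikit-learn's `GaussianMixture` can stand in for the specialized multilevel mixture software a study like this would actually require. The indicator set mirrors the five measures named above, but the data, subpopulation means, and profile structure are entirely hypothetical.

```python
# Minimal single-level latent profile analysis (LPA) sketch on hypothetical
# data. LPA with within-profile independence = Gaussian mixture with
# diagonal covariances; candidate profile counts are compared via BIC.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulated standardized scores on five indicators:
# [LRI, self-efficacy, growth goals, anxiety, failure avoidance goals].
# Two hypothetical subpopulations share a high LRI mean but diverge on
# the threat indicators (cf. the two "Instructionally-Supported" profiles).
n = 500
composed = rng.normal([0.8, 0.7, 0.7, -0.6, -0.6], 0.5, size=(n, 5))
pressured = rng.normal([0.8, 0.6, 0.6, 0.9, 0.8], 0.5, size=(n, 5))
X = np.vstack([composed, pressured])

# Enumerate 1-6 profiles and retain the BIC-minimizing solution.
models = {
    k: GaussianMixture(n_components=k, covariance_type="diag",
                       n_init=10, random_state=0).fit(X)
    for k in range(1, 7)
}
best_k = min(models, key=lambda k: models[k].bic(X))
best = models[best_k]

# Most likely profile membership for each simulated student.
labels = best.predict(X)
print("profiles retained:", best_k)
print("profile means (rows = profiles):")
print(np.round(best.means_, 2))
```

Because the two simulated subpopulations are identical in LRI but well separated on the threat indicators, BIC recovers two profiles, echoing the point that similar perceived instructional load can co-occur with different challenge-threat orientations.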

In terms of educational practice, because the LRIS is founded on (and assesses) an instructional framework comprising five key principles, educators can be quite specifically guided in professional learning targeting these instructional principles. Martin et al. (2020a; see also Martin, 2016; Martin and Evans, 2018, 2019) have described numerous pedagogical strategies that follow from the five principles of LRI. For example, to reduce difficulty in the initial stages of learning as appropriate to the learner's prior knowledge (principle #1), they suggest pre-testing to gain a sense of where to pitch content, pre-training, and segmenting (or, “chunking”) (Pollock et al., 2002; Mayer and Moreno, 2010; Delahay and Lovett, 2019). For support and scaffolding (principle #2), suggestions include structured templates, worked examples, prompting, and advance and graphic organizers (e.g., Renkl and Atkinson, 2010; Sweller, 2012; Berg and Wehby, 2013; Renkl, 2014; Hughes et al., 2019). For sufficient practice (principle #3), deliberate practice and mental rehearsal have been recommended (e.g., Ginns, 2005; Purdie and Ellis, 2005; Nandagopal and Ericsson, 2012; Sweller, 2012). For feedback-feedforward (principle #4), corrective and improvement-oriented information has been proposed (e.g., Basso and Belardinelli, 2006; Hattie and Timperley, 2007; Shute, 2008; Hattie, 2009; Martin and Evans, 2018). For more independent and self-directed learning (principle #5), guided discovery learning has been suggested (e.g., Mayer, 2004).

There are also strategies that can foster students' challenge orientations and reduce their threat orientations. For the former, self-efficacy and growth goals were the means through which we inferred challenge orientation, and these have distinct practice implications. For self-efficacy, educators might encourage students to challenge any negative self-beliefs, especially when they are faced with difficult academic tasks (Wigfield and Tonks, 2002). Reminding students of their strengths and reiterating what they have already learned can also enhance self-efficacy (Higgins et al., 2001; Martin et al., 2019). Regarding growth goals, intervention research has demonstrated that encouraging students to set self-improvement targets (personal best goals) and teaching them how to strive to meet these targets are successful strategies (e.g., Martin and Elliot, 2016b; Ginns et al., 2018).

Anxiety and failure avoidance goals were the means through which we inferred students' threat orientation, and these also have distinct practice implications. For anxiety, there are three types of programs that tend to be offered in schools: universal programs targeting all students, selective programs targeting students at risk of anxiety at clinical levels, and specific programs targeting students who have clinical levels of anxiety (Martin et al., 2021). Within each of these programs, cognitive-behavioral approaches tend to be successful (Neil and Christensen, 2009); here, students are specifically taught cognitive and behavioral strategies for anxiety reduction, especially for times and circumstances when anxiety is likely to strike. The use of mindfulness techniques by educators with students is another suggested strategy to reduce anxiety; indeed, mindfulness intervention benefits for students with negative self-beliefs have been highlighted in several studies and reviews (Weare, 2013; Sibinga et al., 2015; McKeering and Hwang, 2019). In a similar vein, growth mindset intervention has been found to improve individuals' stress and threat appraisals (Yeager et al., 2016). To address students' inclination to adopt failure avoidance goals, educators are urged to reduce students' fear of failure (Covington, 2000; Martin and Marsh, 2003; Martin, 2007, 2009). Practical strategies to do this include promoting the belief that effort underpins self-improvement and does not imply a lack of ability or intelligence, and making it clear that mistakes can be important ingredients for future success and do not reflect poorly on one's self-worth (Covington, 2000; Martin and Marsh, 2003).

Limitations and Future Directions

When interpreting the findings, some limitations are worth noting; these also carry implications for future research. First, this study relied on student reports of LRI, via the LRIS. Although the validity of this methodology has previously been demonstrated (e.g., Martin and Evans, 2018) and the psychometrics in the present study were acceptable, future research might include additional indicators such as observer ratings or self-reports by teachers to triangulate with student ratings. Second, we used the short form of the LRIS, which meant we could not estimate latent profiles on the basis of the 5 LRI principles considered separately. Future research should consider this possibility and also (using the long form) look to estimate classroom profiles (L2) characterized by different levels on these 5 principles (rather than reflecting different frequencies of the L1 profiles). For example, starting from multilevel CFA models, L2 factor scores (corrected for inter-rater disagreement) can be saved, enabling more objective ratings of the classroom. Then the L1 and L2 factor scores from this model can be used to separately estimate L1 and L2 profiles. Third, there may be instructional principles that effectively manage cognitive load on learners but that are not in the LRI framework. To the extent additional principles are identified and can be validly assessed, we recommend including them in future research. Fourth, our data were cross-sectional, which means, for example, that we were unable to determine causal ordering between the profiles and the outcomes, or whether student and classroom profile memberships change over time. Longitudinal data and modeling (e.g., latent transition analysis) will be an important avenue for future research (Collie et al., 2020). Fifth, our study included self-efficacy and growth goals to infer challenge orientation, and anxiety and failure avoidance goals to infer threat orientation.
There is a need for research that assesses other indicators of challenge and threat to test the generality of our findings. For example, testing affective dimensions of perceived challenge-threat such as enjoyment, frustration, anger, and boredom (Pekrun, 2006) and other challenge/approach-oriented goals such as mastery goals (Elliot, 2006) may be illuminating. There may also be potential gains in harnessing bio-psychological measures of challenge and threat in order to access real-time and more objective measures for further triangulation (Uphill et al., 2019; Martin et al., 2021). Neuro-psychological measures may additionally provide real-time indicators of experienced cognitive load. These have the potential to deepen evaluation and understanding of LRI and its associations with challenge and threat demonstrated in this research (Berka et al., 2007; Anderson et al., 2011; Delahunty et al., 2018). Sixth, our research took place in science, which is a challenging school subject (Coe et al., 2008) and one in which many students can struggle (Office of the Chief Scientist, 2014). Threat orientation may be disproportionately salient in such subjects. There is thus a need to explore the generalizability of our findings to other school subjects, not least because science may be more (or less) amenable to LRI than other subjects. Finally, when testing for profiles in which accompanying indicators are hypothesized to be present, future research might give greater attention to real-time research methodologies. The in-situ dimensions of students' science engagement have been emphasized by researchers (e.g., Schneider et al., 2016), and the empirical yield of real-time engagement research has been demonstrated in other STEM subjects such as mathematics (Martin et al., 2020b).
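The two-stage procedure proposed above (save L2 factor scores from a multilevel CFA, then estimate classroom profiles on them) can be sketched in simplified form. In place of doubly-latent factor scores corrected for inter-rater disagreement, the illustration below uses raw classroom means as crude, uncorrected stand-ins; the data, number of classrooms, and effect sizes are all hypothetical.

```python
# Simplified sketch of two-stage multilevel profiling on hypothetical data:
# (1) aggregate student scores to classroom means as crude stand-ins for
# the L2 factor scores a multilevel CFA would provide, then (2) estimate
# classroom (L2) profiles from those aggregates.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
n_classes, class_size = 60, 20

# Hypothetical classroom-level effects on [LRI, challenge, threat]:
# half the classrooms lean one way, half the other.
class_effects = np.vstack([
    rng.normal([0.7, 0.6, -0.5], 0.2, size=(n_classes // 2, 3)),
    rng.normal([-0.7, -0.5, 0.6], 0.2, size=(n_classes // 2, 3)),
])

# Student (L1) scores = classroom effect + individual deviation.
students = np.repeat(class_effects, class_size, axis=0) \
    + rng.normal(0.0, 0.5, size=(n_classes * class_size, 3))
class_ids = np.repeat(np.arange(n_classes), class_size)

# Stage 1: classroom means as stand-in L2 "factor scores".
l2_scores = np.vstack([
    students[class_ids == c].mean(axis=0) for c in range(n_classes)
])

# Stage 2: enumerate L2 profiles over the classroom-level scores via BIC.
models = {
    k: GaussianMixture(n_components=k, covariance_type="diag",
                       n_init=10, random_state=0).fit(l2_scores)
    for k in range(1, 5)
}
best_k = min(models, key=lambda k: models[k].bic(l2_scores))
class_profiles = models[best_k].predict(l2_scores)
print("classroom profiles retained:", best_k)
```

A full implementation would replace the class means with factor scores from a doubly-latent multilevel model, which also correct for sampling and measurement error, but the clustering logic at L2 is the same.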

Conclusion

Instructional cognitive load is perceived and experienced in different ways by different students. Some students perceive cognitive load in an approach- and challenge-oriented way, while others perceive it in an avoidance- and threat-oriented way. To better understand instructional cognitive load, it is important to assess students' experiences of this load in the context of their accompanying psychological challenge and threat orientations. The present study did so using multilevel latent profile analysis and identified salient instructional-psychological profiles among both students and classrooms. These profiles were further illuminated through their associations with student- and classroom-level persistence, disengagement, and achievement. The findings demonstrate that assessing instructional cognitive load in the context of students' accompanying psychological orientations can reveal unique insights about students' learning experiences and about important differences between classrooms in the instructional cognitive load that is present.

Data Availability Statement

The datasets presented in this article are not readily available because they are part of an industry research partnership; consent from participants to share the dataset is not available; summative data (e.g., correlation matrix with standard deviations) are available here and elsewhere to enable analyses. Requests to access the datasets should be directed to Andrew J. Martin, andrew.martin@unsw.edu.au.

Ethics Statement

The studies involving human participants were reviewed and approved by UNSW Human Ethics Committee. Written informed consent to participate in this study was provided by the participants' legal guardian/next of kin.

Author Contributions

AM led research design, data analysis, and report writing. PG and RC assisted with data analysis and report writing. EB and RK assisted with research design, interpretation of findings, and report writing. VM-S assisted with report writing. JP assisted with research design and report writing. All authors contributed to the article and approved the submitted version.

Funding

This study was funded by the Australian Research Council (Grant #LP170100253) and The Future Project at The King's School.

Conflict of Interest

It is appropriate to note that one of the measures in the study (the MES) is a published instrument attracting a small fee (approx. US$110 per 1,000 respondents), part of which is put toward its ongoing development and administration and part of which is donated to UNICEF. However, there was no fee involved for its use in this study.

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

The authors thank the participating schools for assisting with data collection and Carolyn Imre and Brad Papworth for advice on study design and analysis.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2021.656994/full#supplementary-material

Footnotes

1. ^There were 2,199 students in the original sample, but we removed students who did not identify their classroom (n = 90) as this information is necessary for multi-level analyses. Also removed were classes with fewer than 3 students, as we considered these class sizes too small to yield reliable estimates at the classroom level (viz. 38 students from 25 classes; see McNeish, 2014).

References

Anderson, E. W., Potter, K. C., Matzen, L. E., Shepherd, J. F., Preston, G. A., and Silva, C. T. (2011). A user study of visualization effectiveness using EEG and cognitive load. Comp. Graph. Forum 30, 791–800. doi: 10.1111/j.1467-8659.2011.01928.x

Arbuckle, J. L. (1996). "Full information estimation in the presence of incomplete data," in Advanced Structural Equation Modeling: Issues and Techniques, eds G. A. Marcoulides and R. E. Schumacker (Lawrence Erlbaum Associates), 243–278.

Australian Bureau of Statistics (2019). Schools Australia. ABS.

Bandura, A. (1997). Self-Efficacy: The Exercise of Control. Freeman.

Basso, D., and Belardinelli, M. O. (2006). The role of the feedforward paradigm in cognitive psychology. Cogn. Process. 7, 73–88. doi: 10.1007/s10339-006-0034-1

Bauer, D. J., and Curran, P. J. (2004). The integration of continuous and discrete latent variable models: potential problems and promising opportunities. Psychol. Methods 9, 3–29. doi: 10.1037/1082-989X.9.1.3

Berg, J. L., and Wehby, J. (2013). Preteaching strategies to improve student learning in content area classes. Interv. School Clinic 49, 14–20. doi: 10.1177/1053451213480029

Berka, C., Levendowski, D. J., Lumicao, M. N., Yau, A., Davis, G., Zivkovic, V. T., et al. (2007). EEG correlates of task engagement and mental workload in vigilance, learning, and memory tasks. Aviat. Space Environ. Med. 78, B231–B244.

Britton, J. C., Lissek, S., Grillon, C., Norcross, M. A., and Pine, D. S. (2011). Development of anxiety: the role of threat appraisal and fear learning. Depress. Anxiety 28, 5–17. doi: 10.1002/da.20733

Burns, E. C., Martin, A. J., and Collie, R. J. (2018). Adaptability, personal best (PB) goal setting, and gains in students' academic outcomes: a longitudinal examination from a social cognitive perspective. Contemp. Educ. Psychol. 53, 57–72. doi: 10.1016/j.cedpsych.2018.02.001

Burns, E. C., Martin, A. J., and Collie, R. J. (2019). Understanding the role of personal best (PB) goal setting in students' declining engagement: a latent growth model. J. Educ. Psychol. 111, 557–572. doi: 10.1037/edu0000291

Burns, E. C., Martin, A. J., Kennett, R. K., Pearson, J., and Munro-Smith, V. (2020a). Optimizing science self-efficacy: a multilevel examination of the moderating effects of anxiety on the relationship between self-efficacy and achievement in science. Contemp. Educ. Psychol. doi: 10.1016/j.cedpsych.2020.101937

Burns, E. C., Martin, A. J., Mansour, M., Anderson, M., Gibson, R., and Liem, G. A. D. (2020b). Motivational processes that support arts participation: An examination of goal orientations and aspirations. Psychol. Aesthet. Creat. Arts 14, 384–400. doi: 10.1037/aca0000242

Campbell, D. T., and Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychol. Bull. 56, 81–105. doi: 10.1037/h0046016

Chandler, P., and Sweller, J. (1991). Cognitive load theory and the format of instruction. Cogn. Instruct. 8, 293–332. doi: 10.1207/s1532690xci0804_2

Chen, O., Kalyuga, S., and Sweller, J. (2017). The expertise reversal effect is a variant of the more general element interactivity effect. Educ. Psychol. Rev. 29, 393–405. doi: 10.1007/s10648-016-9359-1

Coe, R., Searle, J., Barmby, P., Jones, K., and Higgins, S. (2008). Relative Difficulty of Examinations in Different Subjects. Report for SCORE (Science Community Supporting Education). Centre for Evaluation and Modelling; Durham University.

Collie, R. J., Malmberg, L.-E., Martin, A. J., Sammons, P., and Morin, A. (2020). A multilevel person-centered examination of teachers' workplace demands and resources: links with work-related well-being. Front. Psychol. 11:626. doi: 10.3389/fpsyg.2020.00626

Collie, R. J., Shapka, J. D., Perry, N. E., and Martin, A. J. (2015). Teachers' beliefs about social-emotional learning: identifying teacher profiles and their relations with job stress and satisfaction. Learn. Instruct. 39, 148–157. doi: 10.1016/j.learninstruc.2015.06.002

Covington, M. V. (2000). Goal theory, motivation, and school achievement: an integrative review. Annu. Rev. Psychol. 51, 171–200. doi: 10.1146/annurev.psych.51.1.171

Delahay, A. B., and Lovett, M. C. (2019). “Distinguishing two types of prior knowledge that support novice learners,” in Proceedings of the 41st Annual Meeting of the Cognitive Science Society, CogSci, eds A. Goel, C. M. Seifert, and C. Freska Goel (Montreal, QC).

Delahunty, T., Seery, N., and Lynch, R. (2018). Exploring the use of electroencephalography to gather objective evidence of cognitive processing during problem solving. J. Sci. Educ. Technol. 27, 114–130. doi: 10.1007/s10956-017-9712-2

Elliot, A., Murayama, K., Kobeisy, A., and Lichtenfeld, S. (2015). Potential-based achievement goals. Br. J. Educ. Psychol. 85, 192–206. doi: 10.1111/bjep.12051

Elliot, A. J. (2006). The hierarchical model of approach-avoidance motivation. Motiv. Emot. 30, 111–116. doi: 10.1007/s11031-006-9028-7

Elliot, A. J., Murayama, K., and Pekrun, R. (2011). A 3 × 2 achievement goal model. J. Educ. Psychol. 103, 632–648. doi: 10.1037/a0023952

Evans, P., and Martin, A. J. (Submitted). Load reduction instruction: multilevel effects on motivation, engagement, and achievement in mathematics.

Ghaderyan, P., Abbasi, A., and Ebrahimi, A. (2018). Time-varying singular value decomposition analysis of electrodermal activity: a novel method of cognitive load estimation. Measurement 126, 102–109. doi: 10.1016/j.measurement.2018.05.015

Ginns, P. (2005). Meta-analysis of the modality effect. Learn. Instruct. 15, 313–331. doi: 10.1016/j.learninstruc.2005.07.001

Ginns, P., Martin, A. J., Durksen, T. L., Burns, E. C., and Pope, A. (2018). Personal Best (PB) goal-setting enhances arithmetical problem-solving. Aust. Educ. Res. 45, 533–551. doi: 10.1007/s13384-018-0268-9

Goldstein, H. (2003). Multilevel Statistical Models, 3rd Edn. Hodder Arnold.

Green, J., Martin, A. J., and Marsh, H. W. (2007). Motivation and engagement in English, mathematics and science high school subjects: towards an understanding of multidimensional domain specificity. Learn. Ind. Diff. 17, 269–279. doi: 10.1016/j.lindif.2006.12.003

Hattie, J. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. Routledge.

Hattie, J., and Timperley, H. (2007). The power of feedback. Rev. Educ. Res. 77, 81–112. doi: 10.3102/003465430298487

Higgins, E. T., Friedman, R. S., Harlow, R. E., Idson, L. C., Ayduk, O. N., and Taylor, A. (2001). Achievement orientations from subjective histories of success: promotion pride versus prevention pride. Eur. J. Soc. Psychol. 31, 3–23. doi: 10.1002/ejsp.27

Howard, S. J., Burianová, H., Ehrich, J., Kervin, L., Calleia, A., Barkus, E., et al. (2015). Behavioral and fMRI evidence of the differing cognitive load of domain-specific assessments. Neuroscience 297, 38–46. doi: 10.1016/j.neuroscience.2015.03.047

Hughes, M. D., Regan, K. S., and Evmenova, A. (2019). A computer-based graphic organizer with embedded self-regulated learning strategies to support student writing. Intervent. School Clinic. 55, 13–22. doi: 10.1177/1053451219833026

Ikehara, C. S., and Crosby, M. E. (2005). “Assessing cognitive load with physiological sensors,” in Proceedings of the 38th Annual Hawaii International Conference on System Sciences (IEEE), 295a−295a.

Kalyuga, S., Rikers, R., and Paas, F. (2012). Educational implications of expertise reversal effects in learning and performance of complex cognitive and sensorimotor skills. Educ. Psychol. Rev. 24, 313–337. doi: 10.1007/s10648-012-9195-x

Klepsch, M., and Seufert, T. (2020). Understanding instructional design effects by differentiated measurement of intrinsic, extraneous, and germane cognitive load. Instruct. Sci. 48, 45–77. doi: 10.1007/s11251-020-09502-9

Krell, M. (2017). Evaluating an instrument to measure mental load and mental effort considering different sources of validity evidence. Cogent Educ. 4:1280256. doi: 10.1080/2331186X.2017.1280256

Lau, S., and Nie, Y. (2008). Interplay between personal goals and classroom goal structures in predicting student outcomes: a multilevel analysis of person-context interactions. J. Educ. Psychol. 100, 15–29. doi: 10.1037/0022-0663.100.1.15

Lazarus, R. S., and Folkman, S. (1984). Stress, Appraisal, and Coping. Springer.

Leppink, J., Paas, F., Van der Vleuten, C. P., Van Gog, T., and Van Merriënboer, J. J. (2013). Development of an instrument for measuring different types of cognitive load. Behav. Res. Methods 45, 1058–1072. doi: 10.3758/s13428-013-0334-1

Litalien, D., Gillet, N., Gagné, M., Ratelle, C. F., and Morin, A. J. S. (2019). Self-determined motivation profiles among undergraduate students: A robust test of profile similarity as a function of gender and age. Learn. Ind. Diff. 70, 39–52. doi: 10.1016/j.lindif.2019.01.005

Marsh, H. W. (2002). A multidimensional physical self-concept: a construct validity approach to theory, measurement, and research. Psychol. J. Hellen. Psychol. Soc. 9, 459–493.

Marsh, H. W., Lüdtke, O., Nagengast, B., Trautwein, U., Morin, A. J., Abduljabbar, A. S., et al. (2012). Classroom climate and contextual effects: conceptual and methodological issues in the evaluation of group-level effects. Educ. Psychol. 47, 106–124. doi: 10.1080/00461520.2012.670488

Marsh, H. W., Lüdtke, O., Robitzsch, A., Trautwein, U., Asparouhov, T., Muthén, B., et al. (2009). Doubly-latent models of school contextual effects: integrating multilevel and structural equation approaches to control measurement and sampling error. Multivariate Behav. Res. 44, 764–802. doi: 10.1080/00273170903333665

Marsh, H. W., Martin, A. J., and Hau, K. T. (2006). “A multiple method perspective on self-concept research in educational psychology: a construct validity approach,” in Handbook of Multimethod Measurement in Psychology, eds M. Eid and E. Diener (American Psychological Association Press), 441–456. doi: 10.1037/11383-030

Martin, A. J. (2007). Examining a multidimensional model of student motivation and engagement using a construct validation approach. Br. J. Educ. Psychol. 77, 413–440. doi: 10.1348/000709906X118036

Martin, A. J. (2009). Motivation and engagement across the academic lifespan: a developmental construct validity study of elementary school, high school, and university/college students. Educ. Psychol. Meas. 69, 794–824. doi: 10.1177/0013164409332214

Martin, A. J. (2016). Using Load Reduction Instruction (LRI) to boost motivation and engagement. Br. Psychol. Soc.

Martin, A. J., Anderson, J., Bobis, J., Way, J., and Vellar, R. (2012). Switching on and switching off in mathematics: an ecological study of future intent and disengagement amongst middle school students. J. Educ. Psychol. 104, 1–18. doi: 10.1037/a0025988

Martin, A. J., and Elliot, A. J. (2016a). The role of personal best (PB) and dichotomous achievement goals in students' academic motivation and engagement: a longitudinal investigation. Educ. Psychol. 36, 1285–1302. doi: 10.1080/01443410.2015.1093606

Martin, A. J., and Elliot, A. J. (2016b). The role of personal best (PB) goal setting in students' academic achievement gains. Learn. Ind. Diff. 45, 222–227. doi: 10.1016/j.lindif.2015.12.014

Martin, A. J., and Evans, P. (2018). Load reduction instruction: exploring a framework that assesses explicit instruction through to independent learning. Teach. Teach. Educ. 73, 203–214. doi: 10.1016/j.tate.2018.03.018

Martin, A. J., and Evans, P. (2019). “Load reduction instruction: Sequencing explicit instruction and guided discovery to enhance students' motivation, engagement, learning, and achievement,” in Advances in Cognitive Load Theory: Rethinking Teaching, eds S. Tindall-Ford, S. Agostinho, and J. Sweller (Routledge). 15–29.

Martin, A. J., Ginns, P., Burns, E., Kennett, R., and Pearson, J. (2020a). Load reduction instruction in science and students' science engagement and science achievement. J. Educ. Psychol. doi: 10.1037/edu0000552

Martin, A. J., Kennett, R., Pearson, J., Mansour, M., Papworth, B., and Malmberg, L.-E. (2021). Challenge and threat appraisals in high school science: investigating the roles of psychological and physiological factors. Educ. Psychol. doi: 10.1080/01443410.2021.1887456

Martin, A. J., and Liem, G. A. (2010). Academic personal bests (PBs), engagement, and achievement: a cross-lagged panel analysis. Learn. Ind. Diff. 20, 265–270. doi: 10.1016/j.lindif.2010.01.001

Martin, A. J., Malmberg, L.-E., Kennett, R., Mansour, M., Papworth, B., and Pearson, J. (2019). What happens when students reflect on their self-efficacy during a test? Exploring test experience and test outcome in science. Learn. Ind. Diff. 73, 59–66. doi: 10.1016/j.lindif.2019.05.005

Martin, A. J., Mansour, M., and Malmberg, L.-E. (2020b). What factors influence students' real-time motivation and engagement? An experience sampling study of high school students using mobile technology. Educ. Psychol. 40, 1113–1135. doi: 10.1080/01443410.2018.1545997

Martin, A. J., and Marsh, H. W. (2003). Fear of failure: friend or foe? Aust. Psychol. 38, 31–38. doi: 10.1080/00050060310001706997

Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? The case for guided methods of instruction. Am. Psychol. 59, 14–19. doi: 10.1037/0003-066X.59.1.14

Mayer, R. E., and Moreno, R. (2010). “Techniques that reduce extraneous cognitive load and manage intrinsic cognitive load during multimedia learning,” in Cognitive Load Theory, eds J. L. Plass, R. Moreno, and R. Brunken (Cambridge University Press), 131–152. doi: 10.1017/CBO9780511844744.009

McKeering, P., and Hwang, Y. S. (2019). A systematic review of mindfulness-based school interventions with early adolescents. Mindfulness 10, 593–610. doi: 10.1007/s12671-018-0998-9

McLarnon, M. J. W., and O'Neill, T. A. (2018). Extensions of auxiliary variable approaches for the investigation of mediation, moderation, and conditional effects in mixture models. Organ. Res. Methods 21, 955–982. doi: 10.1177/1094428118770731

McNeish, D. M. (2014). Modeling sparsely clustered data: design-based, model-based, and single-level methods. Psychol. Methods 19, 552–563. doi: 10.1037/met0000024

Moors, A., Ellsworth, P. C., Scherer, K. R., and Frijda, N. H. (2013). Appraisal theories of emotion: state of the art and future development. Emot. Rev. 5, 119–124. doi: 10.1177/1754073912468165

Morin, A. J. S., Boudrias, J. S., Marsh, H. W., McInerney, D. M., Dagenais-Desmarais, V., and Litalien, D. (2017). Complementary variable- and person-centered approaches to the dimensionality of psychometric constructs: Approaches to psychological wellbeing at work. J. Bus. Psychol. 32, 395–419. doi: 10.1007/s10869-016-9448-7

Morin, A. J. S., and Litalien, D. (2017). Longitudinal Tests of Profile Similarity and Latent Transition Analyses. Montreal, QC: Substantive Methodological Synergy Research Laboratory.

Morin, A. J. S., Meyer, J. P., Creusier, J., and Biétry, F. (2016). Multiple-group analysis of similarity in latent profile solutions. Organ. Res. Methods 19, 231–254. doi: 10.1177/1094428115621148

Muthén, L. K., and Muthén, B. O. (2017). Mplus [computer software]. Muthén & Muthén.

Nandagopal, K., and Ericsson, K. A. (2012). “Enhancing students' performance in traditional education: implications from the expert performance approach and deliberate practice,” in APA Educational Psychology Handbook, eds K. R. Harris, S. Graham, and T. Urdan (American Psychological Association), 257–293. doi: 10.1037/13273-010

Neil, A. L., and Christensen, H. (2009). Efficacy and effectiveness of school-based prevention and early intervention programs for anxiety. Clin. Psychol. Rev. 29, 208–215. doi: 10.1016/j.cpr.2009.01.002

OECD (2020). Education GPS. OECD.

Office of the Chief Scientist (2014). Benchmarking Australian Science. Canberra: Australian Government.

Paas, F., Renkl, A., and Sweller, J. (2003). Cognitive load theory and instructional design: recent developments. Educ. Psychol. 38, 1–4. doi: 10.1207/S15326985EP3801_1

Paas, F. G., Van Merriënboer, J. J., and Adam, J. J. (1994). Measurement of cognitive load in instructional research. Percept. Motor Skills 79, 419–430. doi: 10.2466/pms.1994.79.1.419

Pekrun, R. (2006). The control-value theory of achievement emotions: assumptions, corollaries, and implications for educational research and practice. Educ. Psychol. Rev. 18, 315–341. doi: 10.1007/s10648-006-9029-9

Pollock, E., Chandler, P., and Sweller, J. (2002). Assimilating complex information. Learn. Instruct. 12, 61–86. doi: 10.1016/S0959-4752(01)00016-0

Purdie, N., and Ellis, L. (2005). A Review of the Empirical Evidence Identifying Effective Interventions and Teaching Practices for Students With Learning Difficulties in Years 4, 5, and 6. Australian Council for Educational Research. Available online at: https://research.acer.edu.au/tll_misc/7

Putwain, D. W., Remedios, R., and Symes, W. (2015). Experiencing fear appeals as a challenge or a threat influences attainment value and academic self-efficacy. Learn. Instruct. 40, 21–28. doi: 10.1016/j.learninstruc.2015.07.007

Putwain, D. W., and Symes, W. (2014). Subjective value and academic self-efficacy: the appraisal of fear appeals used prior to a high-stakes test as threatening or challenging. Soc. Psychol. Educ. 17, 229–248. doi: 10.1007/s11218-014-9249-7

Putwain, D. W., and Symes, W. (2016). The appraisal of value-promoting messages made prior to a high-stakes mathematics examination: the interaction of message-focus and student characteristics. Soc. Psychol. Educ. 19, 325–343. doi: 10.1007/s11218-016-9337-y

Putwain, D. W., Symes, W., and Wilkinson, H. M. (2017). Fear appeals, engagement, and examination performance: the role of challenge and threat appraisals. Br. J. Educ. Psychol. 87, 16–31. doi: 10.1111/bjep.12132

Raudenbush, S. W., and Bryk, A. S. (2002). Hierarchical Linear Models: Applications and Data Analysis Methods, 2nd Edn. Sage.

Raykov, T., and Marcoulides, G. A. (2004). Using the delta method for approximate interval estimation of parameter functions in SEM. Struct. Equat. Model. 11, 621–637. doi: 10.1207/s15328007sem1104_7

Renkl, A. (2014). Toward an instructionally oriented theory of example-based learning. Cogn. Sci. 38, 1–37. doi: 10.1111/cogs.12086

Renkl, A., and Atkinson, R. K. (2010). “Learning from worked-out examples and problem solving,” in Cognitive Load Theory, eds J. L. Plass, R. Moreno, and R. Brunken (Cambridge University Press), 91–108. doi: 10.1017/CBO9780511844744.007

Rogat, T. K., and Linnenbrink-Garcia, L. (2019). Demonstrating competence within one's group or in relation to other groups: a person-oriented approach to studying achievement goals in small groups. Contemp. Educ. Psychol. 59:101781. doi: 10.1016/j.cedpsych.2019.101781

Roseman, I. J., and Smith, C. A. (2001). “Appraisal theory: overview, assumptions, varieties, controversies,” in Series in Affective Science. Appraisal Processes in Emotion: Theory, Methods, Research, eds K. R. Scherer, A. Schorr, and T. Johnstone (Oxford University Press), 3–19.

Rosenshine, B. V. (2009). “The empirical support for direct instruction,” in Constructivist Instruction: Success or Failure?, eds S. Tobias and T. M. Duffy (Routledge), 201–220.

Schneider, B., Krajcik, J., Lavonen, J., Salmela-Aro, K., Broda, M., Spicer, J., et al. (2016). Investigating optimal learning moments in US and Finnish science classes. J. Res. Sci. Teach. 53, 400–421. doi: 10.1002/tea.21306

Schunk, D. H., and DiBenedetto, M. K. (2014). “Academic self-efficacy,” in Handbook of Positive Psychology in Schools, eds M. J. Furlong, R. Gilman, and E. S. Huebner (Elsevier), 115–521.

Shute, V. J. (2008). Focus on formative feedback. Rev. Educ. Res. 78, 153–189. doi: 10.3102/0034654307313795

Sibinga, E. M., Webb, L., Ghazarian, S. R., and Ellen, J. M. (2015). School-based mindfulness instruction: An RCT. Pediatrics 137, 1–8. doi: 10.1542/peds.2015-2532

Sweller, J. (2004). Instructional design consequences of an analogy between evolution by natural selection and human cognitive architecture. Instruct. Sci. 32, 9–31. doi: 10.1023/B:TRUC.0000021808.72598.4d

Sweller, J. (2012). “Human cognitive architecture: why some instructional procedures work and others do not,” in APA Educational Psychology Handbook, eds K. R. Harris, S. Graham, and T. Urdan (American Psychological Association), 295–325. doi: 10.1037/13273-011

Sweller, J., Ayres, P., and Kalyuga, S. (2011). Cognitive Load Theory. Springer. doi: 10.1007/978-1-4419-8126-4

Tröbst, S., Kleickmann, T., Lange-Schubert, K., Rothkopf, A., and Möller, K. (2016). Instruction and students' declining interest in science: an analysis of German fourth- and sixth-grade classrooms. Am. Educ. Res. J. 53, 162–193. doi: 10.3102/0002831215618662

Uphill, M. A., Rossato, C., Swain, J., and O'Driscoll, J. M. (2019). Challenge and threat: a critical review of the literature and an alternative conceptualization. Front. Psychol. 10:1255. doi: 10.3389/fpsyg.2019.01255

Van Yperen, N. W., Blaga, M., and Postmes, T. (2015). A meta-analysis of the impact of situationally induced achievement goals on task performance. Hum. Perf. 28, 165–182. doi: 10.1080/08959285.2015.1006772

Vermunt, J. K. (2010). Latent class modeling with covariates: two improved three-step approaches. Polit. Anal. 18, 450–469. doi: 10.1093/pan/mpq025

Wang, Q., Yang, S., Liu, M., Cao, Z., and Ma, Q. (2014). An eye-tracking study of website complexity from cognitive load perspective. Decis. Support Syst. 62, 1–10. doi: 10.1016/j.dss.2014.02.007

Weare, K. (2013). Developing mindfulness with children and young people: a review of the evidence and policy context. J. Child. Serv. 8, 141–153. doi: 10.1108/JCS-12-2012-0014

Wigfield, A., and Tonks, S. (2002). “Adolescents' expectancies for success and achievement task values during middle and high school years,” in Academic Motivation of Adolescents, eds F. Pajares and T. Urdan (Information Age Publishing), 58–82.

Yeager, D. S., Lee, H. Y., and Jamieson, J. P. (2016). How to improve adolescent stress responses: insights from integrating implicit theories of personality and biopsychosocial models. Psychol. Sci. 27, 1078–1091. doi: 10.1177/0956797616649604

Yu, K., and Martin, A. J. (2014). Personal best (PB) and “classic” achievement goals in the Chinese context: their role in predicting academic motivation, engagement, and buoyancy. Educ. Psychol. 34, 635–658. doi: 10.1080/01443410.2014.895297

Keywords: cognitive load, load reduction instruction, cognitive appraisal, engagement, achievement, latent profile analysis, multilevel, construct validity

Citation: Martin AJ, Ginns P, Burns EC, Kennett R, Munro-Smith V, Collie RJ and Pearson J (2021) Assessing Instructional Cognitive Load in the Context of Students' Psychological Challenge and Threat Orientations: A Multi-Level Latent Profile Analysis of Students and Classrooms. Front. Psychol. 12:656994. doi: 10.3389/fpsyg.2021.656994

Received: 21 January 2021; Accepted: 27 May 2021;
Published: 01 July 2021.

Edited by:

Kate M. Xu, Open University of the Netherlands, Netherlands

Reviewed by:

Alexandre Morin, Concordia University, Canada
André Tricot, Epsylon Laboratory EA 4556, France

Copyright © 2021 Martin, Ginns, Burns, Kennett, Munro-Smith, Collie and Pearson. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Andrew J. Martin, andrew.martin@unsw.edu.au

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.