
ORIGINAL RESEARCH article

Front. Psychol., 21 January 2026

Sec. Health Psychology

Volume 16 - 2025 | https://doi.org/10.3389/fpsyg.2025.1739300

This article is part of the Research Topic "Well-being and Cognitive Science in Higher Education: Measures and Intervention."

The impact of system interaction quality on learning outcomes in online virtual experiment teaching: the mediating role of extraneous cognitive load

  • 1Office of the President, Shandong Management University, Shandong, China
  • 2Department of Labor Relations, Shandong Management University, Shandong, China

This study investigates the effect of system interaction quality on learning outcomes in online virtual experiment teaching, with extraneous cognitive load serving as a mediating variable. Based on Cognitive Load Theory, a structural equation model was constructed to examine the relationships among user interface quality, communication quality, extraneous cognitive load, and learning outcomes. In a cross-sectional questionnaire study, data from 610 valid responses were analyzed with SPSS 27.0 and AMOS 29.0. The results revealed that both user interface quality and communication quality significantly and positively predicted learning outcomes. Moreover, extraneous cognitive load partially mediated these relationships, indicating that high system interaction quality enhances learning outcomes not only directly but also indirectly by reducing unnecessary cognitive burdens. These findings extend the application of Cognitive Load Theory to virtual teaching contexts and provide empirical evidence for the "technology–cognition–learning" mechanism. Practically, the study offers actionable guidance for optimizing user interface design, improving communication performance, and enhancing instructional strategies to promote effective learning in online virtual experiment environments.

1 Introduction

Learning outcomes are a core indicator for measuring teaching effectiveness, a crucial reflection of educational quality, and a manifestation of the effectiveness of instructional system design and the implementation of teaching strategies (Hussey and Smith, 2008). As a multidimensional concept, learning outcomes—viewed from the perspective of educational objective taxonomies—encompass a progression from the memorization and comprehension of basic knowledge to application, analysis, synthesis, evaluation, and creation, and they serve as fundamental criteria for verifying whether teaching activities have achieved their preset goals (Anderson and Krathwohl, 2001). Hussey and Smith (2008) defined learning outcomes as verifiable learning achievements aligned with objectives, pointing out that learners' cognitive engagement and course quality are core influencing variables. Owing to the significance of learning outcomes, existing studies have explored this concept extensively. Through a systematic review of studies conducted between 2009 and 2018, Martin et al. (2020) identified teaching interaction, technical support, and learning design as key external factors affecting learning outcomes. Additional studies have examined the predictive factors of learning outcomes in online courses, revealing that factors such as autonomous motivation, course design, and interaction with teachers and peers are associated with students' perceived learning outcomes (Wei et al., 2023).

Online virtual experiment teaching combines online teaching with virtual technology (Li et al., 2018). Its application has become increasingly widespread in the global education field in recent years, and scholars have explored its learning outcomes. In virtual experiment teaching, learning outcomes are specifically reflected in an in-depth understanding of experimental principles, proficiency in operating virtual instruments, and the ability to transfer problem-solving skills from simulated to real-world environments (Chang and Liu, 2025). Learning outcomes not only embody the attainment of teaching objectives but also serve as a core basis for evaluating the quality of virtual experimental environment design (Nasrallah, 2014). Online virtual experiment teaching not only significantly reduces experimental costs and safety risks but also breaks the constraints of time, space, and resources, thereby enabling the widespread sharing of high-quality experimental teaching and enhancing both teaching efficiency and equity (Hamilton et al., 2021). However, the achievement of learning outcomes in online virtual experiment teaching still faces numerous challenges. Many online virtual experiment teaching platforms are designed with a focus on realizing technical functions while neglecting learner-centered interaction experiences. Complex operation interfaces, unnatural feedback mechanisms, and unguided exploration processes increase learners' extraneous cognitive load, diverting their attention from understanding core concepts to coping with operational difficulties and thereby inhibiting deep learning (Mayer and Moreno, 2003). Therefore, enhancing the learning outcomes of online virtual experiment teaching is crucial for guiding the development and iteration of online virtual experimental resources, as well as for improving teaching quality and learning efficiency.

Although existing studies have discussed the impact of virtual technologies such as Augmented Reality (Dhar et al., 2021; Yusa et al., 2023) and Virtual Reality (Villena-Taranilla et al., 2022) on teaching effectiveness, most treat the technology itself as an independent variable rather than delving into which key features or mechanisms of the technology truly influence learning. Notably, the innovation of technical carriers does not equate to the success of online virtual experiment teaching. What matters more is whether teaching content, experimental resources, and other elements are effectively transmitted through technology and translated into learning outcomes. In online virtual experiment teaching, virtual experiment systems serve as the carrier of teaching. Learners can acquire knowledge through human-computer interaction or human-human interaction, both of which are conducted via virtual systems (Zhang et al., 2017). Against this backdrop, system interaction quality has become a common characteristic variable that transcends technology types and is inherent to all digital learning systems. Studies have indicated that system interaction is a key mediating link between learners and the virtual experimental environment: its quality directly determines learners' information acquisition efficiency, operational experience, and depth of cognitive processing (Makransky and Mayer, 2022), and it therefore exerts a significant impact on learning outcomes. However, existing studies have neither linked system interaction quality with learning outcomes nor analyzed in depth the mechanism connecting the two. Our study aims to investigate how system interaction quality affects learning outcomes in online virtual experiment teaching and to uncover the underlying mechanisms, shifting the research focus from "technical effects" to "mechanism explanation." This not only helps uncover the common mechanism of action behind different educational technologies but also provides a more universal theoretical basis for the design and optimization of educational technology systems.

To investigate the impact of system interaction quality on learning outcomes and the underlying mechanism, this study draws on Cognitive Load Theory. First systematically proposed by Sweller (1988), the theory is grounded in the cognitive principle that human working memory has limited capacity. It categorizes cognitive load during learning into three types: intrinsic cognitive load, extraneous cognitive load, and germane cognitive load (Sweller, 2011). Defects in the design of learning environments can trigger excessive extraneous cognitive load, which occupies limited working memory resources, thereby inhibiting knowledge encoding and transfer and ultimately reducing learning outcomes (Skulmowski and Xu, 2022). Additional studies have pointed out that optimizing the interaction design of learning environments can effectively reduce extraneous cognitive load; by alleviating cognitive burden, such optimization creates conditions for deep learning and thereby significantly improves learning outcomes (Klepsch and Seufert, 2020). In the context of online virtual experiment teaching, system interaction quality is directly related to learners' level of extraneous cognitive load, while the reasonable regulation of extraneous cognitive load affects the ultimate achievement of learning outcomes. Therefore, this study explores whether extraneous cognitive load plays a mediating role between system interaction quality and learning outcomes in virtual experiment teaching, aiming to clarify the path connecting the three.

This study aims to reveal the mechanism linking system interaction quality and learning outcomes, helping educational technology designers and teachers optimize system interaction design more effectively and improve learning performance. Accordingly, the purpose of this study is to examine the impact of system interaction quality on learning outcomes and to verify the mediating role of extraneous cognitive load, thereby supporting the optimization of educational informatization and intelligent learning environments. The main research questions are:

1. Can system interaction quality significantly predict learning outcomes?

2. Can system interaction quality affect learning outcomes through the mediating role of extraneous cognitive load?

2 Theoretical background and research hypotheses

2.1 System interaction quality and learning outcomes

2.1.1 Learning outcomes (LO)

Learning outcomes—often referred to as learning performance, learning efficiency, or academic achievement—comprehensively reflect the knowledge, skills, and attitudes that learners acquire over a period of study. They serve as key indicators for evaluating educational quality (Hussey and Smith, 2008). Previous research has classified learning outcomes into three categories: cognitive outcomes, behavioral outcomes, and affective outcomes (Wei et al., 2023). Cognitive outcomes refer to the knowledge and intellectual skills acquired by learners; behavioral outcomes pertain to the degree of engagement in learning activities; and affective outcomes relate to learners' satisfaction with and perceptions of the course (Wei et al., 2021). In the context of online virtual experiment teaching, learning outcomes manifest in two main ways. On the one hand, 3D modeling and interactive operations enable learners to intuitively grasp abstract concepts and complex principles. On the other hand, virtual experiments offer opportunities for repeated practice, reinforcing understanding and skill acquisition (Chang and Liu, 2025).

2.1.2 System interaction quality (SIQ)

Interaction and communication among individuals constitute one of the fundamental characteristics of human society (Hier, 2005). In the information age, interactivity remains a key predictor of users' adoption of technologies and tools (Sun et al., 2024). System interaction quality was originally introduced as a component of the Information Systems Success Model, assessing users' interactive experience, fluency, and effectiveness during system use (Delone and McLean, 2003). In the educational context, virtual experiment teaching facilitates knowledge acquisition through the interactive use of computer simulations (Zhang et al., 2017). Notably, interaction among educational participants is a critical determinant of educational effectiveness. However, with the advancement of educational informatization and online teaching models, although learning convenience has improved, the interpersonal interaction characteristic of traditional classrooms has weakened (Xiao, 2017). Consequently, within the domain of modern educational technology, system interaction quality encompasses not only the quality of information exchange between learners and learning systems but also the extent of online communication among participants. It therefore comprises two dimensions: user interface quality and communication quality (Alhendawi and Baharudin, 2014). Among these, user interface quality (UIQ) primarily reflects the human–computer interaction experience between learners and the system, emphasizing factors such as intuitiveness and ease of use in interface design. Communication quality (CQ), on the other hand, represents the online interaction experience between learners and others (including instructors and peers), focusing on the timeliness and effectiveness of information exchange.

2.1.3 System interaction quality and learning outcomes

The positive impact of system interaction quality has been extensively validated across multiple dimensions. Research indicates that the quality of a system's interactive design is positively correlated with perceived usefulness and employee performance (Lin, 2010). Furthermore, factors such as resource availability, feedback, and communication significantly enhance user motivation and satisfaction (Lawson-Body et al., 2010), which are, in turn, regarded as precursors to improved learning outcomes. In online learning environments, interactivity serves as a strong predictor of students' satisfaction with the learning system and their intention to continue using it (Cheng, 2020). High-quality interactivity not only optimizes users' emotional experiences and sense of immersion but also indirectly facilitates knowledge construction and learning performance by enhancing learners' cognitive engagement and social presence (Makransky and Mayer, 2022). Additionally, studies have confirmed that both dimensions of interaction quality—user interface quality and communication quality—positively influence system effectiveness and contribute to greater user satisfaction (Alhendawi and Baharudin, 2014).

Therefore, this study posits that system interaction quality in online virtual experiment teaching positively influences learning outcomes. Considering that the two dimensions of system interaction quality represent two primary types of interaction—human–computer interaction and human–human interaction—this study examines the effects of user interface quality and communication quality on learning outcomes separately. Based on this rationale, the following hypotheses are proposed to address Research Question 1:

H1: The user interface quality of the system in online virtual experiment teaching positively predicts learning outcomes.

H2: The communication quality of the system in online virtual experiment teaching positively predicts learning outcomes.

2.2 Cognitive load theory and the mediating role of extraneous cognitive load

2.2.1 Cognitive Load Theory (CLT)

Cognitive Load Theory (CLT) was developed based on the structure of human cognition (Sweller, 2011), emphasizing the influence of the limited capacity of working memory (Cowan, 2014) on the learning process. When individuals encounter new information, it must first be processed through working memory—which has restricted capacity and duration—before being stored in long-term memory for future use (Chen et al., 2023). The theory highlights that the constraints of working memory are a key determinant of how effectively information can be presented (Haryana et al., 2022). Cognitive load refers to the amount of working memory resources expended by an individual when processing information during a learning task (Sweller, 2011).

One of the core concepts in CLT is element interactivity, which describes situations in which information consists of multiple interacting elements that must be processed simultaneously (Sweller, 2024). Based on the effects of element interactivity on working and long-term memory, cognitive load was originally categorized into three types: intrinsic, extraneous, and germane cognitive load. Intrinsic cognitive load refers to the working memory demands imposed by the inherent complexity of the learning content. Extraneous cognitive load represents the additional demands arising from the way information is presented or organized. Germane cognitive load, meanwhile, refers to the working memory resources required for learning itself (Sweller et al., 1998). Over the past two decades, considerable debate has emerged concerning the nature and validity of germane cognitive load (Skulmowski and Xu, 2022). In response, Sweller et al. (2019) proposed a revised version of CLT, replacing “germane cognitive load” with the term “germane processing” and excluding it as a constituent component of total cognitive load.

2.2.2 The mediating role of extraneous cognitive load (ECL)

Extraneous cognitive load pertains to the way information is presented and processed (Sweller et al., 1998). Hollender et al. (2010) defined it as the combined cognitive load generated by instructional design and software usability. Specifically, it refers to the cognitive burden that is unrelated to learning objectives and arises from poorly designed instruction or inappropriate methods of information presentation (Sweller, 2011). In online virtual teaching environments, extraneous cognitive load is influenced not only by instructions and explanations but also by interactive technologies and the broader learning environment. Accordingly, in online virtual experiment teaching, extraneous cognitive load can be classified into three dimensions: extraneous load instruction, extraneous load interaction, and extraneous load environment (Andersen and Makransky, 2021).

Factors such as the format of task presentation or the learning environment can serve as sources of extraneous cognitive load (Schnotz and Kürschner, 2007). A key feature of virtual experiment teaching is its reliance on system interactions to deliver information (Haryana et al., 2022). In this context, the quality of system interaction forms part of the presentation format of learning tasks and may therefore induce extraneous cognitive load in learners. Learners' perception and processing of new information or knowledge vary depending on the medium through which the information is transmitted (Daghestani et al., 2012). Previous studies have shown that when a system interface is poorly designed, learners must expend additional cognitive resources to understand the interface logic, locate relevant information, or process redundant content, thereby increasing extraneous cognitive load (Makransky and Mayer, 2022). Similarly, communication barriers—such as ineffective interaction or delayed feedback during the learning process—can also heighten extraneous cognitive load (Costley and Fanguy, 2021). Therefore, deficiencies in either the user interface quality or the communication quality of a system may increase extraneous cognitive load. Since extraneous cognitive load does not facilitate knowledge construction but instead consumes learners' limited cognitive resources, it can hinder deep processing and comprehension (Skulmowski and Xu, 2022). Prior research has confirmed that excessive extraneous load reduces instructional effectiveness; thus, instructional design should aim to minimize or eliminate this type of cognitive load (Sweller, 2023). Consequently, this study posits that system interaction quality in online virtual experiment teaching indirectly influences learning outcomes through extraneous cognitive load. Based on this rationale, the following hypotheses are proposed to address Research Question 2:

H3: Extraneous cognitive load mediates the effect of communication quality on learning outcomes.

H4: Extraneous cognitive load mediates the effect of user interface quality on learning outcomes.

Based on the theoretical background and the hypotheses outlined above, this study develops a hypothesized model illustrating the relationships among system interaction quality, extraneous cognitive load, and learning outcomes, as shown in Figure 1.


Figure 1. Research model. SIQ, System Interaction Quality; CQ, Communication Quality; UIQ, User Interface Quality; ECL, Extraneous Cognitive Load; LO, Learning Outcome.

3 Methods

3.1 Participants and procedure

This study adopted a convenience sampling method. In October 2025, an online survey was conducted using the Questionnaire Star platform as the data collection tool. The participants were students from higher education institutions in Shandong Province who had participated in online virtual experiment teaching during the current semester. After reading the informed consent form, participants could voluntarily choose whether to access the survey link and complete the questionnaire. A total of 678 questionnaires were collected during the survey period. To ensure the quality of the data, responses with logical inconsistencies, patterned answering, or excessively short completion times were excluded. Ultimately, 610 valid questionnaires remained, yielding an effective response rate of 89.9%. All participants were current university students, with a mean age of 20.1 years. Among them, 321 were from undergraduate institutions and 289 from vocational colleges, accounting for 52.6% and 47.4%, respectively. Regarding academic disciplines, 189 students (31%) were from natural sciences, 246 (40.3%) from engineering sciences, 124 (20.3%) from medical sciences, and 51 (8.4%) from humanities and social sciences. In terms of gender, there were 355 male students (58.2%) and 255 female students (41.8%). Details are shown in Table 1.


Table 1. General demographic characteristics (n = 610).

3.2 Measures

3.2.1 Learning Outcome (LO)

This study employed the Perceived Learning Outcome Scale developed by DiLoreto et al. (2022) to measure learning outcomes. The scale consists of four items (e.g., "The learning tasks deepened my understanding of the course content"). A six-point Likert scale (1 = "strongly disagree," 6 = "strongly agree") was used, with higher scores indicating higher levels of perceived learning outcomes. The Cronbach's α coefficient for this scale was 0.793. Confirmatory factor analysis (CFA) showed that the model fit indices (χ2/df = 2.101, RMSEA = 0.043, CFI = 0.997, GFI = 0.997, NFI = 0.994) met the standard criteria for good model fit, indicating that the scale demonstrated satisfactory structural validity.

3.2.2 System Interaction Quality (SIQ)

System interaction quality was measured with the 10-item scale developed by Alhendawi and Baharudin (2014). It includes two dimensions: Communication Quality (CQ, 5 items, e.g., "The system provides discussion boards") and User Interface Quality (UIQ, 5 items, e.g., "The design of system services facilitates access"). All items were rated on a 7-point Likert scale (1 = "strongly disagree," 7 = "strongly agree"), with higher scores indicating better system interaction quality. The scale demonstrated a Cronbach's α coefficient of 0.832. Confirmatory factor analysis revealed that the structural fit indices of the scale (χ2/df = 1.444, RMSEA = 0.027, CFI = 0.993, GFI = 0.986, NFI = 0.979) met the recommended criteria, indicating good structural validity of the scale.

3.2.3 Extraneous Cognitive Load (ECL)

Extraneous cognitive load was measured with the 11-item scale developed by Andersen and Makransky (2021). It includes three dimensions: extraneous load instruction (3 items, e.g., "The instructions and/or explanations used during the simulation were very unclear"), extraneous load interaction (4 items, e.g., "The interactive technology used in the simulation made learning more difficult"), and extraneous load environment (4 items, e.g., "Various elements in the virtual environment made learning very confusing"). All items were rated on a 9-point Likert scale ranging from "strongly disagree" to "strongly agree," with higher scores indicating a higher level of extraneous cognitive load. The scale demonstrated a Cronbach's α coefficient of 0.944. Confirmatory factor analysis revealed that the structural fit indices of the scale (χ2/df = 1.363, RMSEA = 0.024, CFI = 0.997, GFI = 0.984, NFI = 0.988) met the recommended criteria, indicating good structural validity of the scale.

3.2.4 Analytical strategy

This study employed SPSS 27.0 to conduct descriptive analysis, correlation analysis, and common method bias tests on the data. AMOS 29.0 was used to perform confirmatory factor analysis and to estimate a structural equation model examining the impact of system interaction quality on learning outcomes in online virtual experiment teaching. The bootstrap method was applied to test the mediating effect of extraneous cognitive load in these relationships.
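
The analyses reported here were run in SPSS 27.0 and AMOS 29.0. For readers who want to reproduce the general modeling logic outside AMOS, the following is a minimal sketch using the open-source Python package semopy; the data file name, the item column names, and the use of three parcel indicators for ECL (mirroring Figure 2) are assumptions of this sketch rather than part of the original analysis.

```python
import pandas as pd
import semopy

# Hypothetical item-level data file; column names are placeholders.
data = pd.read_csv("responses.csv")

# Measurement model (=~) and structural paths (~), mirroring Figure 1:
# UIQ and CQ predict ECL and LO; ECL predicts LO.
model_desc = """
UIQ =~ UIQ1 + UIQ2 + UIQ3 + UIQ4 + UIQ5
CQ  =~ CQ1 + CQ2 + CQ3 + CQ4 + CQ5
ECL =~ ECL1 + ECL2 + ECL3
LO  =~ LO1 + LO2 + LO3 + LO4

ECL ~ UIQ + CQ
LO  ~ ECL + UIQ + CQ
"""

model = semopy.Model(model_desc)
model.fit(data)                   # maximum likelihood estimation by default
print(model.inspect())            # path coefficients and standard errors
print(semopy.calc_stats(model))   # fit indices (chi-square, CFI, RMSEA, ...)
```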

4 Results

4.1 Common method biases

To minimize common method bias, we first implemented procedural controls in the survey, such as ensuring anonymity and confidentiality, including reverse-scored items, avoiding ambiguous or leading language, and using different numbers of Likert scale points across instruments. Second, Harman's single-factor test was employed to assess common method bias in the collected data. An unrotated exploratory factor analysis of all items yielded a Kaiser-Meyer-Olkin (KMO) value of 0.949, and Bartlett's test of sphericity was significant (p < 0.001), indicating suitability for factor analysis. The first factor accounted for 36.286% of the variance, below the critical threshold of 40%, suggesting no significant common method bias (Podsakoff et al., 2003).
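
As an illustration of how Harman's single-factor check can be computed, the sketch below approximates the variance share of the first unrotated factor by the leading eigenvalue of the item correlation matrix; the data file and column layout are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical item-level data frame containing all scale items.
items = pd.read_csv("responses.csv")

# Harman's single-factor check: proportion of total variance captured by the
# first unrotated factor, approximated here by the leading eigenvalue of the
# item correlation matrix.
corr = np.corrcoef(items.to_numpy(), rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]
first_factor_share = eigenvalues[0] / eigenvalues.sum()

print(f"Variance explained by first factor: {first_factor_share:.1%}")
# A value below the conventional 40% threshold is read as no serious
# common method bias (Podsakoff et al., 2003).
```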

4.2 Descriptive statistics and correlations

Descriptive statistics and Pearson correlation analysis were conducted on the data, and the results are presented in Table 2. Significant correlations (p < 0.01) were observed among the four core variables: user interface quality showed significant positive correlations with communication quality (r = 0.300) and learning outcome (r = 0.399), and a significant negative correlation with extraneous cognitive load (r = −0.389). Communication quality was significantly positively correlated with learning outcome (r = 0.415) and significantly negatively correlated with extraneous cognitive load (r = −0.317). Extraneous cognitive load demonstrated a significant negative correlation with learning outcome (r = −0.393). The correlations among the variables provide preliminary support for subsequent hypothesis testing.


Table 2. Descriptive statistics, correlations and AVE values.

4.3 Convergent and discriminant validity tests

The tests revealed that all standardized factor loadings (β) for the measurement items exceeded 0.6 (Costello and Osborne, 2005), with values ranging from 0.682 to 0.769 for user interface quality, 0.672 to 0.759 for communication quality, 0.915 to 0.934 for extraneous cognitive load, and 0.653 to 0.739 for learning outcome. All loadings were statistically significant (p < 0.001), indicating strong associations and representativeness between the measurement items and their respective constructs. The calculation of average variance extracted (AVE) and composite reliability (CR) showed that only the AVE for learning outcome was slightly below 0.5, though values above 0.4 are generally considered acceptable (Verhoef et al., 2002). All constructs demonstrated CR values exceeding 0.7, confirming high internal consistency and convergent validity (Fornell and Larcker, 1981). Detailed results are presented in Table 3.
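
The AVE and CR values in Table 3 follow the standard Fornell and Larcker (1981) formulas. The sketch below shows the computation from standardized loadings; the example loadings are placeholders, not the exact values reported in Table 3.

```python
import numpy as np

def ave_cr(loadings):
    """Average variance extracted (AVE) and composite reliability (CR)
    computed from standardized factor loadings (Fornell and Larcker, 1981)."""
    loadings = np.asarray(loadings, dtype=float)
    error_var = 1.0 - loadings**2                     # unique variance per item
    ave = np.mean(loadings**2)                        # mean squared loading
    cr = loadings.sum()**2 / (loadings.sum()**2 + error_var.sum())
    return ave, cr

# Hypothetical standardized loadings for a four-item construct.
ave, cr = ave_cr([0.65, 0.70, 0.72, 0.74])
print(f"AVE = {ave:.3f}, CR = {cr:.3f}")
```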


Table 3. Results of convergent validity.

Discriminant validity among the constructs was tested, with the results shown in Table 2. The values on the diagonal represent the square roots of the AVE for each variable, while the off-diagonal values are the correlation coefficients between variables. The results show that every correlation coefficient between constructs was lower than the square roots of the AVE in its row and column, indicating good discriminant validity (Fornell and Larcker, 1981).
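
A minimal sketch of the Fornell-Larcker comparison is shown below, using the inter-construct correlations reported in Section 4.2; the AVE values are illustrative placeholders rather than the exact figures from Table 3.

```python
import numpy as np
import pandas as pd

constructs = ["UIQ", "CQ", "ECL", "LO"]

# Inter-construct correlations reported in Section 4.2.
corr = pd.DataFrame(
    [[1.000, 0.300, -0.389, 0.399],
     [0.300, 1.000, -0.317, 0.415],
     [-0.389, -0.317, 1.000, -0.393],
     [0.399, 0.415, -0.393, 1.000]],
    index=constructs, columns=constructs,
)

# Placeholder AVE values (the exact figures appear in Table 3 of the paper).
ave = pd.Series({"UIQ": 0.52, "CQ": 0.51, "ECL": 0.85, "LO": 0.48})

# Fornell-Larcker criterion: sqrt(AVE) of each construct should exceed its
# absolute correlations with every other construct.
sqrt_ave = np.sqrt(ave)
for c in constructs:
    others = corr.loc[c].drop(c).abs()
    print(c, "passes" if (sqrt_ave[c] > others).all() else "fails")
```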

4.4 Hypothesis testing and path analysis

To test the aforementioned hypotheses, a structural equation model was constructed with communication quality and user interface quality as independent variables, learning outcomes as the dependent variable, and extraneous cognitive load as the mediating variable, while controlling for the demographic variables of gender and major. The model structure is shown in Figure 2. The model fit indices were as follows: χ2/df = 1.165, RMSEA = 0.016, NFI = 0.975, CFI = 0.996, TLI = 0.996, indicating a good model fit (Kline, 2023).


Figure 2. Structural equation model. SIQ, System Interaction Quality; CQ, Communication Quality; UIQ, User Interface Quality; ECL, Extraneous Cognitive Load; LO, Learning Outcome.

To further examine the role of the mediating variable, the bias-corrected percentile bootstrap method was employed to test the significance of the mediating effects. Using a 95% confidence interval, the data were resampled 5,000 times. Significance was indicated if the confidence interval did not include 0. The test results are presented in Table 4.
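
The published mediation test used the bias-corrected percentile bootstrap on the latent-variable model in AMOS. As a simplified illustration of the percentile-bootstrap logic only, the sketch below bootstraps an indirect effect at the composite-score level; the data file name, and the omission of the bias correction and the measurement model, are assumptions of this sketch.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Hypothetical composite scores (columns CQ, UIQ, ECL, LO); the published
# analysis used latent variables in AMOS with a bias-corrected bootstrap,
# which this simplified sketch does not reproduce.
df = pd.read_csv("composite_scores.csv")

def indirect_effect(d):
    # a-path: communication quality -> extraneous load (controlling for UIQ)
    a = sm.OLS(d["ECL"], sm.add_constant(d[["CQ", "UIQ"]])).fit().params["CQ"]
    # b-path: extraneous load -> learning outcome (controlling for CQ and UIQ)
    b = sm.OLS(d["LO"], sm.add_constant(d[["ECL", "CQ", "UIQ"]])).fit().params["ECL"]
    return a * b

boot = np.empty(5000)
for i in range(5000):
    idx = rng.integers(0, len(df), len(df))            # resample cases with replacement
    boot[i] = indirect_effect(df.iloc[idx].reset_index(drop=True))

ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(f"CQ -> ECL -> LO indirect effect, 95% CI: [{ci_low:.3f}, {ci_high:.3f}]")
# The indirect effect is judged significant when the interval excludes zero.
```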


Table 4. Bootstrap mediation effect.

Regarding the total effects of system interaction quality on learning outcomes, both communication quality (β = 0.351, p < 0.001) and user interface quality (β = 0.379, p < 0.001) had significant effects on learning outcomes when the mediating variable was not included, supporting Hypotheses H1 and H2.

After extraneous cognitive load was introduced as a mediating variable, the effect of communication quality on extraneous cognitive load was −0.353 (p < 0.001), with a 95% confidence interval (CI) of [−0.432, −0.270] that excluded 0, indicating that communication quality significantly reduces extraneous cognitive load. The effect of extraneous cognitive load on learning outcomes was −0.212 (p < 0.001), with a 95% CI of [−0.299, −0.117], indicating that extraneous cognitive load significantly decreases learning outcomes. The indirect path from communication quality through extraneous cognitive load to learning outcomes had an effect of 0.075 (p < 0.001), with a 95% CI of [0.042, 0.115] that excluded 0. Furthermore, after the mediator was included, the direct effect of communication quality on learning outcomes remained significant, with an effect of 0.276 (p < 0.001) and a 95% CI of [0.178, 0.371]. These results indicate that extraneous cognitive load partially mediates the relationship between communication quality and learning outcomes. The direct and indirect effects accounted for 78.7% and 21.3% of the total effect, respectively, supporting Hypothesis H3.

Similarly, the effect of user interface quality on extraneous cognitive load was −0.231 (p < 0.001), with a 95% confidence interval (CI) of [−0.315, −0.139] that excluded 0, indicating that user interface quality significantly reduces extraneous cognitive load. The indirect path from user interface quality through extraneous cognitive load to learning outcomes had an effect of 0.049 (p < 0.001), with a 95% CI of [0.025, 0.082] that excluded 0. After the mediator was included, the direct effect of user interface quality on learning outcomes remained significant, with an effect of 0.331 (p < 0.001) and a 95% CI of [0.229, 0.420]. These results indicate that extraneous cognitive load partially mediates the relationship between user interface quality and learning outcomes. The direct and indirect effects accounted for 87.1% and 12.9% of the total effect, respectively, supporting Hypothesis H4.
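
For reference, these proportions follow directly from dividing each effect by the total effect (direct + indirect): for communication quality, 0.276/(0.276 + 0.075) ≈ 0.787 and 0.075/0.351 ≈ 0.213; for user interface quality, 0.331/(0.331 + 0.049) ≈ 0.871 and 0.049/0.380 ≈ 0.129.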

5 Discussion

This study aimed to investigate how system interaction quality affects learning outcomes in online virtual experiment teaching, with extraneous cognitive load functioning as a mediating mechanism. Grounded in Cognitive Load Theory, the study focused on two key aspects of system interaction quality—user interface quality and communication quality—and examined their respective impacts on learners' performance. The following discussion interprets the empirical findings and elaborates on their theoretical and practical implications within the context of online virtual experiment learning environments.

First, both communication quality and user interface quality, as components of system interaction quality, significantly and positively predict learning outcomes.

From the perspective of user interface quality, this result aligns with previous research suggesting that the ease of use of digital tools reduces technological anxiety and enhances teaching quality (Ndirangu and Udoto, 2011; Stefanovic and Klochkova, 2021); it also supports the view that the adaptability of digital technology promotes teaching effectiveness (Fütterer et al., 2023). User interface quality not only affects the convenience of learning operations but also influences learners' emotional experiences and self-efficacy (Bollini, 2017). Moreover, the aesthetic appeal and logical design of the interface can enhance learners' trust and sense of control over the system, thereby stimulating their learning motivation and independent exploration behaviors. This, in turn, leads to higher levels of concentration and engagement during virtual experiments (Hui and See, 2015).

From the perspective of communication quality, previous research has identified interaction as crucial for learning outcomes and satisfaction (Tsang et al., 2021; Zhang et al., 2017). Our results confirm that interaction remains equally important in online virtual experiment teaching, with communication quality serving as the foundation for effective interaction in digital environments. Considering the unique requirements of online virtual experiment teaching scenarios, learners' ability to deepen knowledge comprehension and master experimental skills depends on reliable communication quality. This quality facilitates real-time teacher-student Q&A and peer collaborative feedback, enabling timely and effective interactive support (Quadir et al., 2022). If the system's communication quality is compromised, the online interaction experience suffers directly. Delays in resolving learning queries may trigger negative emotions such as frustration and discouragement among learners, ultimately reducing their engagement and concentration (Rubio-Tamayo et al., 2017; Shang et al., 2022).

Unlike previous studies, this study does not focus on a single technological platform but instead adopts a structural perspective on system interaction quality, allowing a comprehensive examination of the roles of two distinct interaction dimensions. This multidimensional analysis addresses the limitation of prior studies that treated "interaction" as a unitary concept, providing a clearer theoretical delineation of system interaction quality. The results demonstrate that both interface quality at the human-computer interaction level and communication quality at the interpersonal level significantly enhance learning outcomes.

Second, system interaction quality enhances learning outcomes by reducing extraneous cognitive load.

This study further reveals that both user interface quality and communication quality not only directly improve learning outcomes but also indirectly improve them by reducing extraneous cognitive load. This finding elucidates, from a cognitive processing perspective, the psychological mechanism through which interaction quality influences learning outcomes, aligning with the core premise of cognitive load theory (Sweller, 2024; Sweller et al., 1998): learning system design should minimize cognitive burdens unrelated to learning objectives, enabling learners to allocate their limited cognitive resources to core learning tasks. Given learners' finite cognitive resources, extraneous cognitive load acts as an "inefficient cognitive consumption." Excessive extraneous cognitive load can divert resources away from essential learning tasks, leading to diminished learning outcomes (Skulmowski and Xu, 2022). Cognitive load theory posits that extraneous cognitive load can be regulated through environmental design. When extraneous cognitive load is effectively managed via system interaction quality, learners' cognitive resources are optimally allocated, facilitating deeper learning and thereby improving learning outcomes (Klepsch and Seufert, 2020). This further validates the mediating role of extraneous cognitive load as a "regulator of cognitive resource allocation."

The results of this study address the research gap concerning the relationships among system interaction quality, extraneous cognitive load, and learning outcomes. The study shifts the understanding of system interaction quality from a technical perspective to a cognitive dimension, while validating the mediating pathway from system interaction quality through extraneous cognitive load to learning outcomes. This not only extends the applicability of cognitive load theory in virtual teaching contexts but also highlights the critical role of technical experience variables in cognitive processing. These insights provide a theoretical foundation for future research on the technology-cognition-learning mechanism.

Third, unlike previous studies that focused on specific technology types, this research adopts a system interaction quality perspective to explore universal mechanisms across different technologies. The main contributions of this study are as follows:

At the theoretical level, this study reveals the psychological mechanism through which system interaction quality influences learning outcomes via extraneous cognitive load, thus extending the application of cognitive load theory to online virtual teaching. At the empirical level, this study categorizes system interaction quality into two dimensions—user interface quality and communication quality—and constructs and validates a model linking system interaction quality, extraneous cognitive load, and learning outcomes. This uncovers the distinct pathways through which different interaction dimensions operate, providing a new measurement and analytical framework for related research.

At the practical level, the findings offer actionable directions for optimizing interface design, communication performance, and instructional activities in online virtual teaching systems, thereby enhancing online learning experiences and improving learning outcomes. Firstly, developers should prioritize reducing extraneous cognitive load by optimizing interface information architecture, minimizing redundant animations and complex operations, and incorporating intelligent design features (such as automatic prompts and task progress guidance) to reduce learners' cognitive and operational load (Yan et al., 2021). For instance, the learning management system (LMS) developed by Suryani et al. (2024) incorporates an adaptive interface design that dynamically captures learners' cognitive states—including prior knowledge, working memory capacity, and perceived task complexity—and uses this data to personalize interface elements in real time. Secondly, the performance and interaction quality of communication modules should be enhanced to ensure smooth information flow and immediate feedback during the learning process, thereby preventing cognitive disruptions caused by delays, lags, or misunderstandings (Skowronek and Raake, 2015). As an example, Kerimbayev et al. (2020) constructed a multi-layered, multimodal interactive virtual learning environment based on the Moodle platform, integrating teacher–student, student–student, and human–computer interactions, while effectively combining synchronous and asynchronous communication modes. At the teacher level, it is essential to appropriately pace online interactions and avoid excessive communication and information overload, to help students maintain focus on core learning content (Xie et al., 2023).

6 Conclusion

This study, grounded in Cognitive Load Theory, constructed a structural equation model with the two subdimensions of system interaction quality—communication quality and user interface quality—as independent variables, extraneous cognitive load as the mediating variable, and learning outcomes as the dependent variable. The results demonstrate that system interaction quality plays a crucial role in shaping learning outcomes in online virtual experiment teaching, with both user interface quality and communication quality significantly enhancing learning performance. Furthermore, extraneous cognitive load partially mediates the relationship between system interaction quality and learning outcomes. This research extends the application of Cognitive Load Theory to the field of online virtual teaching and provides practical guidance for improving interface design, optimizing communication, and designing instructional activities in online virtual teaching systems.

7 Limitations and future directions

Despite its valuable contributions, this study has several limitations.

First, the data in this study were collected via self-reported questionnaires. Although certain procedural controls were implemented, common method bias and the influence of subjective perceptions cannot be entirely ruled out. Future research could incorporate objective measures such as test performance or physiological data. For instance, physiological measures (e.g., eye-tracking, heart rate variability) or experimental task performance (e.g., immediate test scores) could be used to cross-validate self-reported responses, while multi-source data triangulation (e.g., combining teacher evaluations, student self-reports, and system logs) could enhance construct validity.

Second, the cross-sectional design restricts causal interpretation. To enhance causal inference, future research could adopt the following designs: a longitudinal approach, measuring system interaction quality, extraneous cognitive load, and learning outcomes at multiple time points to analyze temporal relationships through cross-lagged panel modeling; and an experimental approach, in which students are randomly assigned to virtual learning environments with either high or low interface/communication quality while controlling for other variables. This would allow direct observation of changes in extraneous cognitive load and learning outcomes, thereby helping to establish a causal chain.

Third, this study focused on system-level and cognitive-level variables and did not include individual learner characteristics as predictors in the model. The absence of these individual factors implies that potential confounding effects cannot be fully discounted. Future studies could incorporate learner characteristics—such as prior knowledge, technical proficiency, or working memory capacity—to provide a more comprehensive understanding of the interplay between system design and learner differences in virtual learning environments.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Author contributions

PY: Investigation, Data curation, Writing – review & editing, Writing – original draft. TS: Writing – original draft, Conceptualization, Writing – review & editing, Methodology.

Funding

The author(s) declared that financial support was not received for this work and/or its publication.

Conflict of interest

The author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declared that generative AI was not used in the creation of this manuscript.

Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Alhendawi, K. M., and Baharudin, A. S. (2014). The impact of interaction quality factors on the effectiveness of Web-based information system: the mediating role of user satisfaction. Cognit. Technol. Work 16, 451–465. doi: 10.1007/s10111-013-0272-9

Andersen, M. S., and Makransky, G. (2021). The validation and further development of a multidimensional cognitive load scale for virtual environments. J. Comput. Assist. Learn. 37, 183–196. doi: 10.1111/jcal.12478

Anderson, L. W., and Krathwohl, D. R. (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, Complete Edition. Boston, MA: Addison Wesley Longman, Inc. https://eduq.info/xmlui/handle/11515/18824

Bollini, L. (2017). Beautiful interfaces. From user experience to user interface design. Design J. 20, S89–S101. doi: 10.1080/14606925.2017.1352649

Chang, J., and Liu, D. (2025). Optimising learning outcomes: a comprehensive approach to virtual simulation experiment teaching in higher education. Int. J. Hum. Comput. Interact. 41, 2114–2134. doi: 10.1080/10447318.2024.2314825

Chen, O., Paas, F., and Sweller, J. (2023). A cognitive load theory approach to defining and measuring task complexity through element interactivity. Educ. Psychol. Rev. 35:63. doi: 10.1007/s10648-023-09782-w

Cheng, Y.-M. (2020). Students' satisfaction and continuance intention of the cloud-based e-learning system: roles of interactivity and course quality factors. Educ. Train. 62, 1037–1059. doi: 10.1108/ET-10-2019-0245

Costello, A. B., and Osborne, J. (2005). Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Pract. Assess. Res. Eval. 10, 1–9. doi: 10.7275/jyj1-4868

Costley, J., and Fanguy, M. (2021). Collaborative note-taking affects cognitive load: the interplay of completeness and interaction. Educ. Technol. Res. Dev. 69, 655–671. doi: 10.1007/s11423-021-09979-2

Cowan, N. (2014). Working memory underpins cognitive development, learning, and education. Educ. Psychol. Rev. 26, 197–223. doi: 10.1007/s10648-013-9246-y

Daghestani, L., Hana, A.-N., Xu, Z., and Ragab, A. H. M. (2012). Interactive virtual reality cognitive learning model for performance evaluation of math manipulatives. J. King Abdulaziz Univ. Comput. Inform. Technol. Sci. 1, 31–52. doi: 10.4197/Comp.1-1.2

Delone, W. H., and McLean, E. R. (2003). The DeLone and McLean model of information systems success: a ten-year update. J. Manag. Inform. Syst. 19, 9–30. doi: 10.1080/07421222.2003.11045748

Dhar, P., Rocks, T., Samarasinghe, R. M., Stephenson, G., and Smith, C. (2021). Augmented reality in medical education: students' experiences and learning outcomes. Med. Educ. Online 26:1953953. doi: 10.1080/10872981.2021.1953953

DiLoreto, M., Gray, J. A., and Schutts, J. (2022). Student satisfaction and perceived learning in online learning environments: an instrument development and validation study. Educ. Leadersh. Rev. 23, 115–134.

Fornell, C., and Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. J. Market. Res. 18, 39–50. doi: 10.1177/002224378101800104

Fütterer, T., Hoch, E., Lachner, A., Scheiter, K., and Stürmer, K. (2023). High-quality digital distance teaching during COVID-19 school closures: does familiarity with technology matter? Comput. Educ. 199:104788. doi: 10.1016/j.compedu.2023.104788

Hamilton, D., McKechnie, J., Edgerton, E., and Wilson, C. (2021). Immersive virtual reality as a pedagogical tool in education: a systematic literature review of quantitative learning outcomes and experimental design. J. Comput. Educ. 8, 1–32. doi: 10.1007/s40692-020-00169-2

Haryana, M. R. A., Warsono, S., Achjari, D., and Nahartyo, E. (2022). Virtual reality learning media with innovative learning materials to enhance individual learning outcomes based on cognitive load theory. Int. J. Manag. Educ. 20:100657. doi: 10.1016/j.ijme.2022.100657

Hier, S. P. (2005). Contemporary Sociological Thought: Themes and Theories. Toronto, ON: Canadian Scholars' Press.

Hollender, N., Hofmann, C., Deneke, M., and Schmitz, B. (2010). Integrating cognitive load theory and concepts of human–computer interaction. Comput. Human Behav. 26, 1278–1288. doi: 10.1016/j.chb.2010.05.031

Hui, S. L. T., and See, S. L. (2015). Enhancing user experience through customisation of UI design. Proced. Manuf. 3, 1932–1937. doi: 10.1016/j.promfg.2015.07.237

Hussey, T., and Smith, P. (2008). Learning outcomes: a conceptual analysis. Teach. High. Educ. 13, 107–115. doi: 10.1080/13562510701794159

Kerimbayev, N., Nurym, N., Akramova, A., and Abdykarimova, S. (2020). Virtual educational environment: interactive communication using LMS Moodle. Educ. Inf. Technol. 25, 1965–1982. doi: 10.1007/s10639-019-10067-5

Klepsch, M., and Seufert, T. (2020). Understanding instructional design effects by differentiated measurement of intrinsic, extraneous, and germane cognitive load. Instr. Sci. 48, 45–77. doi: 10.1007/s11251-020-09502-9

Kline, R. B. (2023). Principles and Practice of Structural Equation Modeling. New York, NY: Guilford Publications.

Lawson-Body, A., Willoughby, L., and Logossah, K. (2010). Developing an instrument for measuring e-commerce dimensions. J. Comput. Inform. Syst. 51, 2–13. doi: 10.1080/08874417.2010.11645463

Li, L., Chen, Y., Li, Z., Li, D., Li, F., and Huang, H. (2018). "Online virtual experiment teaching platform for database technology and application," in 2018 13th International Conference on Computer Science and Education (ICCSE), 1–5. doi: 10.1109/ICCSE.2018.8468849

Lin, H.-F. (2010). An investigation into the effects of IS quality and top management support on ERP system usage. Total Qual. Manag. Bus. Excell. 21, 335–349. doi: 10.1080/14783360903561761

Makransky, G., and Mayer, R. E. (2022). Benefits of taking a virtual field trip in immersive virtual reality: evidence for the immersion principle in multimedia learning. Educ. Psychol. Rev. 34, 1771–1798. doi: 10.1007/s10648-022-09675-4

Martin, F., Sun, T., and Westine, C. D. (2020). A systematic review of research on online teaching and learning from 2009 to 2018. Comput. Educ. 159:104009. doi: 10.1016/j.compedu.2020.104009

Mayer, R. E., and Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia learning. Educ. Psychol. 38, 43–52. doi: 10.1207/S15326985EP3801_6

Nasrallah, R. (2014). Learning outcomes' role in higher education teaching. Educ. Bus. Soc. Contemp. Middle East. Issues 7, 257–276. doi: 10.1108/EBS-03-2014-0016

Ndirangu, M., and Udoto, M. O. (2011). Quality of learning facilities and learning environment: challenges for teaching and learning in Kenya's public universities. Qual. Assur. Educ. 19, 208–223. doi: 10.1108/09684881111158036

Podsakoff, P. M., MacKenzie, S. B., Lee, J.-Y., and Podsakoff, N. P. (2003). Common method biases in behavioral research: a critical review of the literature and recommended remedies. J. Appl. Psychol. 88, 879–903. doi: 10.1037/0021-9010.88.5.879

Quadir, B., Yang, J. C., and Chen, N.-S. (2022). The effects of interaction types on learning outcomes in a blog-based interactive learning environment. Interact. Learn. Environ. 30, 293–306. doi: 10.1080/10494820.2019.1652835

Rubio-Tamayo, J. L., Gertrudix Barrio, M., and García García, F. (2017). Immersive environments and virtual reality: systematic review and advances in communication, interaction and simulation. Multimodal Technol. Interact. 1:21. doi: 10.3390/mti1040021

Schnotz, W., and Kürschner, C. (2007). A reconsideration of cognitive load theory. Educ. Psychol. Rev. 19, 469–508. doi: 10.1007/s10648-007-9053-4

Shang, H., Sivaparthipan, C. B., and ThanjaiVadivel (2022). Interactive teaching using human-machine interaction for higher education systems. Comput. Electr. Eng. 100:107811. doi: 10.1016/j.compeleceng.2022.107811

Skowronek, J., and Raake, A. (2015). Assessment of cognitive load, speech communication quality and quality of experience for spatial and non-spatial audio conferencing calls. Speech Commun. 66, 154–175. doi: 10.1016/j.specom.2014.10.003

Skulmowski, A., and Xu, K. M. (2022). Understanding cognitive load in digital and online learning: a new perspective on extraneous cognitive load. Educ. Psychol. Rev. 34, 171–196. doi: 10.1007/s10648-021-09624-7

Stefanovic, S., and Klochkova, E. (2021). Digitalisation of teaching and learning as a tool for increasing students' satisfaction and educational efficiency: using smart platforms in EFL. Sustainability 13:4892. doi: 10.3390/su13094892

Sun, S., Jiang, L., and Zhou, Y. (2024). Associations between perceived usefulness and willingness to use smart healthcare devices among Chinese older adults: the multiple mediating effect of technology interactivity and technology anxiety. Digital Health 10:20552076241254194. doi: 10.1177/20552076241254194

Suryani, M., Sensuse, D. I., Santoso, H. B., Aji, R. F., Hadi, S., Suryono, R. R., et al. (2024). An initial user model design for adaptive interface development in learning management system based on cognitive load. Cogn. Technol. Work 26, 653–672. doi: 10.1007/s10111-024-00772-8

Sweller, J. (1988). Cognitive load during problem solving: effects on learning. Cogn. Sci. 12, 257–285. doi: 10.1207/s15516709cog1202_4

Sweller, J. (2011). "Cognitive load theory," in Psychology of Learning and Motivation, Vol. 55, eds J. P. Mestre and B. H. Ross (Cambridge, MA: Academic Press), 37–76. doi: 10.1016/B978-0-12-387691-1.00002-8

Sweller, J. (2023). The development of cognitive load theory: replication crises and incorporation of other theories can lead to theory expansion. Educ. Psychol. Rev. 35:95. doi: 10.1007/s10648-023-09817-2

Sweller, J. (2024). Cognitive load theory and individual differences. Learn. Individ. Differ. 110:102423. doi: 10.1016/j.lindif.2024.102423

Sweller, J., van Merriënboer, J. J. G., and Paas, F. (2019). Cognitive architecture and instructional design: 20 years later. Educ. Psychol. Rev. 31, 261–292. doi: 10.1007/s10648-019-09465-5

Sweller, J., van Merrienboer, J. J. G., and Paas, F. G. W. C. (1998). Cognitive architecture and instructional design. Educ. Psychol. Rev. 10, 251–296. doi: 10.1023/A:1022193728205

Tsang, J. T. Y., So, M. K. P., Chong, A. C. Y., Lam, B. S. Y., and Chu, A. M. Y. (2021). Higher education during the pandemic: the predictive factors of learning effectiveness in COVID-19 online learning. Educ. Sci. 11:446. doi: 10.3390/educsci11080446

Verhoef, P. C., Franses, P. H., and Hoekstra, J. C. (2002). The effect of relational constructs on customer referrals and number of services purchased from a multiservice provider: does age of relationship matter? J. Acad. Mark. Sci. 30, 202–216. doi: 10.1177/00970302030003002

Villena-Taranilla, R., Tirado-Olivares, S., Cózar-Gutiérrez, R., and González-Calero, J. A. (2022). Effects of virtual reality on learning outcomes in K-6 education: a meta-analysis. Educ. Res. Rev. 35:100434. doi: 10.1016/j.edurev.2022.100434

Wei, X., Saab, N., and Admiraal, W. (2021). Assessment of cognitive, behavioral, and affective learning outcomes in massive open online courses: a systematic literature review. Comput. Educ. 163:104097. doi: 10.1016/j.compedu.2020.104097

Wei, X., Saab, N., and Admiraal, W. (2023). Do learners share the same perceived learning outcomes in MOOCs? Identifying the role of motivation, perceived learning support, learning engagement, and self-regulated learning strategies. Int. High. Educ. 56:100880. doi: 10.1016/j.iheduc.2022.100880

Xiao, J. (2017). Learner-content interaction in distance education: the weakest link in interaction research. Dist. Educ. 38, 123–135. doi: 10.1080/01587919.2017.1298982

Xie, Y., Huang, Y., Luo, W., Bai, Y., Qiu, Y., and Ouyang, Z. (2023). Design and effects of the teacher-student interaction model in the online learning spaces. J. Comput. High. Educ. 35, 69–90. doi: 10.1007/s12528-022-09348-9

Yan, K., Shao, J., Zhu, Z., Zhang, K., Yao, J., and Tian, F. (2021). Display interface design for rollers based on cognitive load of operator. J. Soc. Inf. Disp. 29, 659–672. doi: 10.1002/jsid.1009

Yusa, I., Wulandari, A., Tamam, B., Rosidi, I., Yasir, M., and Setiawan, A. B. (2023). Development of Augmented Reality (AR) learning media to increase student motivation and learning outcomes in science. J. Inovasi Pendidikan IPA 9, 127–145. doi: 10.21831/jipi.v9i2.52208

Zhang, X., Jiang, S., Ordóñez de Pablos, P., Lytras, M. D., and Sun, Y. (2017). How virtual reality affects perceived learning effectiveness: a task–technology fit perspective. Behav. Inform. Technol. 36, 548–556. doi: 10.1080/0144929X.2016.1268647

Keywords: communication quality, extraneous cognitive load, learning outcomes, online virtual experiment teaching, system interaction quality, user interface quality

Citation: Yin P and Sun T (2026) The impact of system interaction quality on learning outcomes in online virtual experiment teaching: the mediating role of extraneous cognitive load. Front. Psychol. 16:1739300. doi: 10.3389/fpsyg.2025.1739300

Received: 04 November 2025; Revised: 23 December 2025;
Accepted: 26 December 2025; Published: 21 January 2026.

Edited by:

Prisla Ücker Calvetti, Federal University of Health Sciences of Porto Alegre, Brazil

Reviewed by:

Boróka Gács, University of Pécs, Hungary
Mira Suryani, Universitas Padjadjaran, Indonesia

Copyright © 2026 Yin and Sun. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: TingYu Sun, 510256720@qq.com
