
ORIGINAL RESEARCH article

Front. Educ., 18 August 2025

Sec. Higher Education

Volume 10 - 2025 | https://doi.org/10.3389/feduc.2025.1555923

Using blended and online learning to increase appreciation of learning outcomes: case of a problematic game design unit

  • 1Education Futures, University of Western Australia, Perth, WA, Australia
  • 2School of Media and Communication, College of Law, Arts and Social Sciences, Murdoch University, Perth, WA, Australia
  • 3Independent Scholar, Perth, WA, Australia

Introduction: A key challenge for many academics is designing learning activities that are constructively aligned and effectively enhance students’ appreciation of their learning outcomes. This study investigates the impact of integrating active and blended learning strategies into a game design unit that had historically suffered from low student engagement and poor alignment with unit outcomes.

Methods: To address this issue, we introduced a series of active and blended learning activities, including an interactive design project, group work, the use of an online peer assessment tool, and online assessments requiring independent critical reflection and feedback on students’ learning experiences. A mixed-methods approach was employed. An online survey was administered via the LMS to 137 enrolled students, with 101 responses collected over a four-week period. Quantitative data were analyzed using ANOVA and multiple linear regression to assess the impact of these interventions.

Results and discussion: The findings suggest that the introduction of active and blended learning strategies—particularly those that increased student participation in lectures and group discussions—enhanced overall engagement and improved student satisfaction with the unit. Students appreciated the availability of online tools and resources; however, online engagement alone did not consistently lead to improved learning experiences. The data indicated that the effectiveness of online learning was significantly influenced by the presence of consistent and clear feedback.

Conclusion: Active and blended learning strategies, when supported by thoughtful learning design and timely feedback, are effective in engaging students with their learning outcomes and enhancing the overall student experience. These findings underscore the importance of integrating interactive and reflective components into course design to foster deeper student engagement.

1 Introduction

University students are more motivated to study when they understand how learning outcomes prepare them for professional roles (Drysdale and McBeath, 2018; Rahm et al., 2021). Online learning tools promise to enhance student engagement by making learning activities more interactive and relevant to their future careers (Olsson and Granberg, 2019; Race, 2010). Active participation in creating and supporting a “community of inquiry” can further highlight the value of their educational experiences (Baepler et al., 2014; Garrison and Kanuka, 2004).

The COVID-19 pandemic has underscored the importance of flexible learning environments (Singh et al., 2021). Blended learning, which combines online and offline resources, allows students to access materials anytime, anywhere, and at their own pace (Francis and Shannon, 2013; Poon, 2013). Well-designed blended and active learning experiences can significantly improve student engagement, especially for those who may initially be indifferent to the unit content but can gain valuable skills through their participation.

This study aims to identify and analyze activities that enhance student engagement and appreciation of learning outcomes through innovative learning designs in a game design unit. We explored how active and blended learning activities, such as group projects, peer assessments, online resources, and feedback mechanisms, impact student engagement and achievement of learning outcomes.

The primary research question guiding this study is: “What are the effects of active and blended learning activities on students’ learning experiences?” To address this, we formulated four sub-questions:

1. Design Project and Group Work: How do these activities enhance students’ knowledge and understanding to achieve learning outcomes? We hypothesize that design projects will improve students’ comprehension and appreciation of their learning outcomes (Jahnke et al., 2022; Pappas and Giannakos, 2021).

2. Online Peer Assessment: How do these activities encourage effective group work? We assume that using a peer assessment tool (SPARKplus) will foster more effective and reflective group collaboration (Cacciamani et al., 2021; Iglesias Pérez et al., 2022; Stančić, 2021).

3. Online Resources and Lectures: How do these activities promote online engagement? We expect that online resources and lectures will increase student engagement with their learning (Brown et al., 2022; Capone and Lepore, 2021).

4. Critical Reflection and Feedback: How does feedback on critical reflection and essays improve students’ appreciation of learning outcomes? We anticipate that feedback will help students recognize the professional value of their learning and enhance their writing skills (Jensen et al., 2021).

Section 2 of this paper provides the background and literature supporting the rationale for using a student active and blended learning experience (SABLE) approach in the unit. Section 3 outlines our study’s methodology, followed by an analysis of the results in Section 4. The discussion and recommendations are presented in Sections 5 and 6, respectively.

2 Literature review

2.1 Interactive design for motivation and online resources for active learning

The COVID-19 pandemic has necessitated the adoption of online and blended learning approaches in universities worldwide. Artificial Intelligence (AI) and digital technologies have become essential tools for enhancing student learning (Awidi, 2024), making learning more flexible and interactive and making access to information and learning resources more effective (Awidi and Paynter, 2024). Students can benefit from a mix of online and in-person learning. In a team-based flipped classroom context, Kang and Kim (2021) found that a blended educational strategy significantly improved students’ knowledge, problem-solving abilities, and learning satisfaction compared to traditional lectures. They concluded that a flipped classroom with team-based learning enhances learning outcomes. However, Müller and Mildenberger (2021) conducted a systematic review and found only a small difference between blended and conventional classroom learning, suggesting equivalent learning outcomes. This indicates that while blended learning can be beneficial, its impact may vary depending on the context and implementation. Grønlien et al. (2021) explored a blended learning approach in a challenging course, integrating short narrated online resources and digital metacognitive evaluations. Their results showed that students using blended learning performed better than those in conventional settings. They emphasized that students’ attitudes toward learning outcomes are critical to the success of blended learning and can be strengthened through motivation and engagement strategies such as gamification (Jayalath and Esichaikul, 2022).

2.2 Prior knowledge and learning outcomes

Designing a game project requires students to have prior knowledge and learn new concepts and techniques. Oakley and Sejnowski (2021) argue that students need to practice these skills with instructors and independently to develop new products. Ambrose et al. (2010) highlighted that prior knowledge can significantly affect learning, influencing students’ interest, perceived difficulty, effort estimation, and solution accuracy (Pozas et al., 2020). Dong et al. (2020) found that prior knowledge is positively associated with learning engagement. Project design activities, especially in group settings, can activate students’ prior knowledge and enhance engagement. Intentional course design is essential, as instructors need to introduce learning concepts and help students organize new knowledge for better encoding and retrieval (Garcia et al., 2021). Consistent integration and explanation of learning activities linked to outcomes can encourage positive and engaged attitudes.

2.3 Group work and collaborative learning

Group projects help students organize personal experiences and textbook knowledge to explore and analyse problems collaboratively. Hwang (2020) and Wu and Wu (2020) found that group work develops cognitive and critical thinking abilities through data collection, analysis, organization, and discussions. Jahnke et al. (2022) studied artifact-generated learning in student groups, identifying three levels of active learning: active, constructive, and interactive. They found that group projects can enhance engagement and performance, but overcoming resistance to active learning may require new assessment formats to encourage students to become co-designers.

Research suggests that active and blended learning environments encourage collaboration, improve interaction and learning gains, develop problem-solving skills, improve class attendance, overall performance, and attitudes toward learning (Adams et al., 2018; Asarta and Schmidt, 2020; Meltzer and Thornton, 2012). Beichner (2008) examined years of active learning research in different universities, assessing the impact of pedagogical approaches on student learning. Evidence from student interviews, focus groups, classroom videos, and audio recordings showed significant gains from the student-centered active learning environment with upside-down pedagogy (SCALE-UP). Beichner’s research found that collaborative design, where students work in groups on hands-on activities, simulations, or interesting questions, develops cooperative learning skills, shares valued experiences, and deepens subject understanding. This finding is confirmed by Soetanto and MacDonald (2017).

2.4 Motivation and engagement in learning design

Drawing on the principles of Ambrose et al. (2010), effective active and blended learning requires student motivation. Students are less motivated if they cannot see the relevance of the subject to their future careers or personal interests (Drysdale and McBeath, 2018; Kember et al., 2008). To address this, the first 6 weeks of our study focused on the contemporary significance of games as persuasive media, including their use in procedural rhetoric and the “gamification” of work, marketing, and social interaction. Research indicates that engagement can be stimulated by explaining the link between learning activities and unit outcomes (Biggs and Tang, 2011). These weeks also introduced extrinsic motivations by outlining the value of the intended outcomes of the design assessment and the unit more broadly.

2.5 Design and implementation of learning activities

Initial lectures and topics were delivered face-to-face and supported by online recordings and optional topics available online for active learners. The final 7 weeks were devoted to active learning experiences, primarily around the summative game design, complemented by formative critical reflection assessments and various online learning opportunities. To free up time for the design project, unit topics covered in weeks 7 to 13 were recorded as formal lectures and made available online. Each lecture covered a topic that students could further research for their essay assignments. Students were asked to listen to the online lectures with learning objectives and intended outcomes in mind, devoting class time to the design project, including formal testing of design prototypes. Formative online critical reflections further scaffolded reflection on unit topics, asking students to reflect on their engagement with the material, including other game designs and group work activities.

This learning design was informed by Baepler et al. (2016), who describe four combinations of learning space and pedagogical approach: first, a traditional classroom used for lectures; second, an Active Learning Classroom (ALC) used for lectures; third, a traditional classroom used primarily for active learning; and fourth, an ALC used primarily for active learning activities. By blending traditional delivery with online material for active engagement and following up with an ALC for active learning, the unit was designed to pivot from the second to the fourth quadrant. While an entirely ALC-based pedagogy would maximize practice, feedback, and reflection, an initial period of traditional orientation was necessary due to motivation barriers in this unit.

The design should provide students with opportunities to prepare before class (pre-reading activities, videos, quizzes) (Awidi and Paynter, 2024; Baepler et al., 2016; Sun and Xie, 2020). Pre-class activities, in-class activities, and assignments should model real-world practices, providing students with a deep understanding of the subject (Awidi and Paynter, 2024; Bayley and Hurst, 2018). Online learning can enhance blended pedagogy and active learning by facilitating prompt, direct, and timely responses to students. Drinkwater et al. (2014) detailed the successful use of pre-class readings where students engaged with material before class and completed online quizzes; instructors could then review quiz results and address common difficulties. In-class learning should be collaborative and engaging, with lecturers and tutors available to respond to students’ questions or concerns while working on practical problems (Bayley and Hurst, 2018; Drinkwater et al., 2014). This design ensures students have adequate time to engage, receive feedback, and reflect on their learning through active learning activities.

In summary, while blended and active learning strategies show promise, their effectiveness depends on careful design and implementation. This literature review highlights the importance of prior knowledge, motivation, and engagement in enhancing learning outcomes through interactive and online resources.

2.6 Employment and scaffolding of online peer assessment software for group work

Effective active and blended learning environments require timely and personalized feedback from peers, which supports a “community of inquiry” (Shea et al., 2022). Beichner (2008) emphasizes that such feedback must be formative to improve student learning. To highlight the acquisition of versatile skills like teamwork and task management, group work activities were integrated into the unit’s learning activities. This included the periodic use of SPARKplus (Self and Peer Assessment Resource Kit—SPARK) to assess group progress throughout the semester. SPARK, a third-party software, facilitates self and peer assessment in group work, allowing reflection on contributions and progress (Willey and Gardner, 2010).

Using SPARK, students evaluated their own and their peers’ performance anonymously. Groups then reflected on their scores and considered ways to improve their functioning and progress as part of their critical reflection assessment. The goal was for students to approach teamwork more earnestly, manage their tasks effectively, and communicate expectations clearly. Regular assessment and feedback are aimed at ensuring active engagement with the unit’s versatile outcomes.

The success of these activities depends on students’ confidence, motivation, and engagement (Awidi and Paynter, 2019; Godlewska et al., 2019). Achieving an effective, active, and blended learning environment requires careful planning and quality feedback (Smith, 1996). To enhance the learning experience, we followed Boettcher and Conrad’s (2016) suggestion that instructors, rather than students, should form teams. Reflecting on the peer assessment process was made part of the unit’s assessment to encourage reflection and allow teaching staff to provide feedback on team performance.

2.7 Summary of changes to improve students’ appreciation of learning outcomes

The active and blended learning approach in this study used design projects, critical reflections, and essays as extrinsic motivations to engage students. Initially, face-to-face pedagogy helped students understand the value of learning outcomes. Subsequently, online resources and formative feedback were provided to enhance engagement and appreciation of these outcomes. Group work activities were reflected upon through online critical reflections and peer-assessment tools, as well as in-class activities. The following sections describe the research design process, challenges faced by lecturers and students, and present students’ perceptions of the active and blended learning activities, peer assessment, and feedback.

3 Materials and methods

3.1 Research model and procedure

This study adopted an action research approach, informed by Ivankova and Wingo (2018), who describe a four-phase cycle: reflecting, planning, acting, and observing. We employed mixed-methods throughout these phases to identify a relevant problem, reflect on possible solutions, plan a workable action plan, and evaluate it.

We used a mixed-method approach at both the design and data collection levels. This involved a quantitative approach with a student questionnaire containing 46 questions (both closed and open-ended). The instrument design was guided by the assumption that students construct their own learning knowledge when they can link their learning objectives with active learning activities and formative feedback (Biggs and Tang, 2015). According to Awidi and Paynter (2019), this occurs effectively when students have access to clear, informative resources and motivational support, with assessments providing formative feedback to help them construct knowledge through reflection. When all these processes are well aligned, students’ learning outcomes are achieved.

The mixed-method approach with the qualitative data (open-ended responses) provided an in-depth understanding of students’ perceptions of their learning experiences and how the learning activities affected their overall learning. Quantitative data gathered information on students’ project and group work activities, experiences with resources, assessments, and feedback. Both parametric and non-parametric statistics were used to analyze this data. Qualitative results provided explanations for the quantitative findings.

3.2 Research context and sample

3.2.1 The unit and its problems

The communication and media unit in this case study highlights how interactive systems or “games” are used in media applications as immersive communication tools. Students are required to consider how the design of interactive communication and media technologies affects social practices, including work, marketing, and engagement in the digital landscape. For the central assessment, students work in groups to create a playable analog game prototype. The unit also aims to develop very important skills in team building, project management, and product design and testing.

As a final-year unit focused on practical tasks and skills, many students approach it with skepticism, viewing games as distractions from real work. This lack of affinity has historically undermined their engagement and appreciation of the unit’s outcomes. However, end-of-program feedback revealed that students’ sentiments generally improve by the semester’s end and after graduation, as they realize the relevance of interactive system design and the importance of teamwork and project management skills. Students who understand these outcomes tend to gain more from the unit. For example, one student commented in an alumni survey, “The board game project was surprisingly one of the most helpful projects, teaching me a lot about teamwork, troubleshooting, and creative communication” (Student A, 2014 Cohort). Conversely, students who do not see the unit’s value tend not to engage constructively, leading to polarized opinions about the unit’s value (Alumni Survey, 2018).

3.2.2 The redesign and its justification

The challenge was to encourage student engagement regardless of their opinion of the “game-centered” content. Our review indicated that students appreciated the unit when they experienced its outcomes. After a course design workshop with learning designers and librarians, the unit was redesigned around an active and blended learning approach, considering four principles: motivation, practice, feedback, and reflection (Race, 2010). The goal was to help students understand the unit’s learning outcomes and their transferable value, regardless of their initial attitudes.

Educational researchers were engaged to re-evaluate the innovation and its impact on student learning. The redesign aimed to enhance the learning experience by incorporating active (authentic) and blended learning for small and large classes (Baepler et al., 2016; Godlewska et al., 2019; Wright et al., 2019). The principles of active and blended learning are rooted in social constructivism (Vygotsky and Cole, 1978), which posits that collaborative group work leads to higher-order learning and in-depth knowledge. As students work toward learning outcomes, they share and receive critical feedback, promoting teamwork and management (Shea et al., 2022).

3.2.3 Implementation of interactive design and group work

The interactive design involved collaborative activities such as discussions, problem-solving tasks, and peer reviews, integrated into the curriculum to enhance engagement and learning outcomes. Group work was structured around projects and assignments requiring collaboration and idea-sharing. Groups of 4–5 members were formed to ensure effective participation. The primary objectives were to encourage critical thinking, improve communication skills, and foster teamwork, providing practical experience in applying theoretical concepts.

The interactive design aimed to create a dynamic learning environment that promotes active participation and deeper understanding. It included interactive lectures, online discussion forums, and hands-on projects, all aligned with learning objectives to provide real-world application opportunities.

3.2.4 Implementation of active and blended online learning activities

The classroom design was an innovative mix of active and blended learning. To provide intrinsic motivation, the first few weeks of teaching highlighted the value of understanding games and introduced the versatile skills required for the game design project. Subsequently, students engaged with online resources to supplement their learning, applying these to their design projects and reflection assessments. This shift meant transitioning from primarily face-to-face learning to mostly online learning, supported by outcome-oriented scaffolding activities.

The study focused on the first implementation of the redesigned unit, surveying students after a 13-week semester. In collaboration with the Unit Coordinator (UC), researchers mapped out learning objectives and intended outcomes, designing SABLE activities around the design project, group work, peer assessment tool engagement, critical reflection, online resources, lectures, and feedback.

In a class of 137 enrolled students, 101 responded to the survey, resulting in a 74% response rate. This sample was considered representative of the student cohort. Two incomplete cases were deleted, leaving 99 cases for data cleaning and analysis. Participants were informed about the study’s aim and purpose and consented to the anonymous use of their data. Data collection was anonymous, ensuring no harm to participants. Anonymity was maintained through informed consent, emphasizing confidentiality and the absence of personally identifiable information. The university ethics committee approved the instruments, ensuring compliance with ethical standards.

3.2.5 Sampling

The sample comprised the entire class enrolled in the Media and Communication Unit at the University of Western Australia. This approach ensured comprehensive coverage and inclusivity, capturing diverse perspectives and experiences. By surveying the entire class, the study aimed to eliminate biases associated with selective sampling, providing a holistic view of students’ responses and experiences.

3.3 Research instrument and validation

The UC reviewed the survey instrument to ensure it reflected the intended measures. Due to limitations in running an experimental design, a mixed-methods approach was adopted to gather both quantitative and qualitative data. The survey included 42 five-point Likert scale questions (strongly disagree to strongly agree) categorized into six SABLE activities, and four open-ended questions for students to express their views (Supplementary Appendix A, T5.1–T5.5). With ethics committee approval, the instrument was pre-tested on 15 randomly selected students, and feedback was used to improve the question structure. The final survey was deployed via Qualtrics, with invitations sent through the university’s Learning Management System (LMS) over 4 weeks, and two in-class announcements encouraging participation.

Survey data were exported from Qualtrics to SPSS V23 for analysis. Missing data were replaced with the item mean, and descriptive statistics were used to explore the data and identify outliers. Mahalanobis distances were used to identify and remove multivariate outliers. Qualitative responses were grouped by keywords and themes, with relevant responses synthesized and summarized in Supplementary Appendix A. Cronbach’s alpha values were calculated to ensure that each cluster of items measured the same construct and that no significant outliers skewed the results.
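As an illustration only, and not the authors’ SPSS workflow, the cleaning steps described above (mean imputation of missing responses and Mahalanobis-distance screening for multivariate outliers) can be sketched in Python as follows. The synthetic data, column names, and the p < 0.001 chi-square cut-off are assumptions made for this example.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2

rng = np.random.default_rng(1)

# Synthetic stand-in for the exported Likert responses (101 cases, 42 items, 1-5 coding).
items = pd.DataFrame(rng.integers(1, 6, size=(101, 42)).astype(float),
                     columns=[f"item_{i}" for i in range(1, 43)])
items.iloc[0, 0] = np.nan                      # one missing response for demonstration

# 1. Replace missing responses with the item mean, as described in the text.
items = items.fillna(items.mean())

# 2. Screen for multivariate outliers using the Mahalanobis distance.
x = items.to_numpy()
diff = x - x.mean(axis=0)
inv_cov = np.linalg.pinv(np.cov(x, rowvar=False))
d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)   # squared distance per respondent

# A common cut-off (an assumption here): chi-square critical value at p < 0.001.
cutoff = chi2.ppf(0.999, df=x.shape[1])
clean = items[d2 < cutoff]
print(f"retained {len(clean)} of {len(items)} cases")
```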

3.4 Data analysis

3.4.1 Analysis of the research instrument and data

The open-ended survey responses provided detailed insights into participants’ experiences and perceptions. We conducted a thematic analysis, starting with an initial coding process where responses were read multiple times to identify recurring themes and patterns. Codes were assigned to text segments representing specific ideas or concepts. Similar codes were grouped into broader themes, which were then contextualized by comparing them with the quantitative findings. To ensure reliability and validity, all three researchers independently coded the data and discussed their findings to reach a consensus. This triangulation process minimized bias and enhanced the credibility of the analysis. Integrating qualitative themes with quantitative findings provided a richer, more detailed interpretation of the data, explaining the underlying reasons behind the quantitative results and offering deeper insights into participants’ experiences.

3.4.2 Reliability and validity

We used Cronbach’s coefficient alpha to determine the consistency of the multiple-item Likert scale. This measure assesses the reliability of items in measuring the same variables or underlying constructs and the extent to which each measure is free from error. Reliability tests were conducted for each design activity and variables necessary to answer the research questions. Initially, the Cronbach’s alpha for the 41 items creating the Student Active and Blended Learning Experience (SABLE) score was 0.940. Deleting items Q1.4, Q7.2, Q7.3, Q7.4, and Q7.7 increased the alpha score. A factor analysis (Supplementary Appendix: Table 5) was conducted to determine variables that favorably load to measure the underlying construct of SABLE. Items with a loading less than 0.270 were excluded from the analysis.

For the Design Project (DP) items, the Cronbach’s alpha was 0.866, indicating reasonable internal consistency. Deleting item Q1.4 increased the alpha to 0.895, so it was excluded. The Groupwork (GW) items scale had an alpha of 0.928, indicating good internal consistency. The Peer Assessment tool (SPARK) items scale had an alpha of 0.889, and the Critical Reflection (CR) items scale had an alpha of 0.929, which increased to 0.935 when Q5.7 was deleted. However, Q5.7 was retained as it was used as a dependent variable in one of the research questions. The Access and Engagement with online resources scale had only minimally acceptable reliability (alpha = 0.534); deleting Q7.1, Q7.2, and Q7.7 increased the alpha score. The Feedback (FB) items scale had an alpha of 0.869, indicating good internal consistency.
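For readers who want to reproduce this style of reliability check, a minimal Cronbach’s alpha routine with an “alpha if item deleted” loop can be written directly with pandas. The item names in the commented usage are placeholders, not the actual survey columns.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a block of Likert items (rows = respondents, columns = items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def alpha_if_deleted(items: pd.DataFrame) -> pd.Series:
    """Alpha recomputed with each item dropped, to spot items that lower the scale's reliability."""
    return pd.Series({col: cronbach_alpha(items.drop(columns=col)) for col in items.columns})

# Hypothetical usage with the cleaned data from the previous sketch:
# dp_items = clean[["item_1", "item_2", "item_3", "item_4", "item_5", "item_6", "item_7"]]
# print(cronbach_alpha(dp_items))     # overall alpha for the scale
# print(alpha_if_deleted(dp_items))   # a weak item shows a higher alpha when dropped
```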

3.4.3 Statistical analysis

To test the study’s assumptions, we first performed a chi-square test for goodness of fit to establish students’ overall views and perceptions of the active and blended learning activities; the results are presented in Tables 1A,B. Next, we conducted a multiple linear regression analysis to determine the effect of SABLE on learning outcomes, as highlighted in the assumptions.

Table 1. Chi-square test for goodness of fit.

The non-parametric test for all the response items on the 1–5 Likert scale showed significant associations. However, when the responses were re-grouped into disagree and agree categories, three items under SPARK, one item under CR, and one under FEED showed no statistical significance, meaning there was no significant association between those who agreed and those who disagreed.
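A hedged sketch of this goodness-of-fit test using scipy is shown below. The observed counts are invented for illustration, and pooling responses into disagree (1–2) and agree (4–5) while dropping the neutral midpoint is an assumption about how the re-grouping was done.

```python
import numpy as np
from scipy.stats import chisquare

# Hypothetical response counts for one Likert item (SD, D, N, A, SA).
counts_5 = np.array([4, 9, 20, 41, 25])

# Goodness of fit against a uniform expectation across the five categories.
stat5, p5 = chisquare(counts_5)
print(f"5-category test: chi2 = {stat5:.2f}, p = {p5:.4f}")

# Re-grouped into disagree (SD + D) vs agree (A + SA), dropping the neutral midpoint.
counts_2 = np.array([counts_5[:2].sum(), counts_5[3:].sum()])
stat2, p2 = chisquare(counts_2)
print(f"2-category test: chi2 = {stat2:.2f}, p = {p2:.4f}")
```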

The researchers used multiple linear regression statistics to investigate the variables considered in this study and their effect on the student learning experience. The students’ active and blended learning experience (SABLE) activities, as shown in Table 1, were categorized and analyzed by design activity to establish their influence on the student learning experience.

Table 2 describes the variations within the statistical regression model and ANOVA results, showing the effect of active and blended learning activities on the student learning experience. Each designed activity had a statistically significant effect on the Student Learning Experience (SLE). Table 3 details the extent to which these design activities impact the SLE. The Unstandardized Coefficients (β) describe the extent of the predictor variables’ effect on the SLE, and the levels of significance are indicated by the sig values in Table 3. Only statistically significant variables predicting the extent of effect on the student active and blended learning experience (SABLE) are described in this table.
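The regression and ANOVA reporting that follows (adjusted R2, F statistics, and unstandardized coefficients) corresponds to a standard ordinary least squares fit. A minimal statsmodels sketch is given below; the synthetic data and column names are assumptions standing in for the actual survey items.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic stand-in for the cleaned survey data: three group-work predictor items
# and one outcome item ("defend and articulate a point of view"), all on a 1-5 scale.
clean = pd.DataFrame(rng.integers(1, 6, size=(99, 4)),
                     columns=["q2_1", "q2_3", "q2_4", "q2_2"])

X = sm.add_constant(clean[["q2_1", "q2_3", "q2_4"]])
y = clean["q2_2"]

model = sm.OLS(y, X).fit()
print(model.summary())                 # coefficients, p values, F statistic, adjusted R2

# Values of the kind reported in Tables 2 and 3 correspond to:
print(model.params)                    # unstandardized coefficients (B)
print(model.fvalue, model.f_pvalue)    # overall ANOVA F test of the model
print(model.rsquared_adj)              # adjusted R2
```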

Table 2. ANOVA results for the effect of active and blended learning activities on the SLE.

Table 3. Coefficients of the multiple linear regression results: effect of SABLE activities on student learning outcomes.

4 Results

Sections 4.1–4.4 directly address research questions 1–4. The first research question examines, “What design project and group work learning activities enhance students’ knowledge and understanding to achieve their learning outcomes?” The second question explores, “What online peer assessment activities encourage students to work effectively in groups?” The third question investigates, “What online resources and lecture activities encourage students’ online engagement?” Finally, the fourth question asks, “How does online critical reflection and written essay feedback help students improve their appreciation of learning outcomes?” Section 4.5 discusses the relationship between online engagement and the students’ learning experience, while Section 4.6 examines the effects of feedback on students’ writing and learning outcomes.

4.1 Effects of design project activities in achieving learning outcomes

The design project (DP) learning activities aimed to develop communication and media studies competencies. By completing the project, students should be able to identify and understand how interactive systems work, comprehend how rules in these systems interact to produce different behaviors, and recognize how these systems generate interactivity. Additionally, the project aimed to develop strong teamwork, communication, and project management skills, linked to the intended learning outcomes (Biggs and Tang, 2011).

To determine whether the DP enhances students’ understanding in achieving their learning outcomes, six predictor variables were tested in a multiple linear regression model, with DP-Q1, DP-Q8, DP-Q9, and DP-Q10 as the dependent variables (see Table 1). The Pearson Correlation revealed significant correlations between the dependent variables and Q1, Q3, Q5, Q6, and Q7. However, only Q1 and Q2 showed strong correlations (Corr. > 0.5) with the dependent variables, while Q1.5 and Q1.6 were moderately correlated, and Q1.7 was weakly correlated.

The regression model, including all five predictor variables, produced an adjusted R2 of 64%, explaining the variability within the model and significantly predicting students’ understanding in achieving their learning outcomes [F(6, 92) = 30.47; ρ = 0.00]. However, the model’s coefficients revealed that not all predictors significantly influenced students’ understanding. DP-Q4, DP-Q5, DP-Q6, and DP-Q7 were not statistically significant. When DP-Q4 and DP-Q5 were removed, the adjusted R2 increased to 65%, and DP-Q6 became statistically significant (ρ = 0.004). This indicates that DP-Q4, DP-Q5, and DP-Q7 confounded the significance of DP-Q6.
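The refinement step just described, dropping non-significant predictors and re-checking the adjusted R2, amounts to a simple backward elimination. A hedged sketch under the same assumed column names as the earlier examples:

```python
import statsmodels.api as sm

def prune_predictors(y, X, alpha=0.05):
    """Iteratively drop the least significant predictor until all p values fall below alpha."""
    X = sm.add_constant(X)
    while True:
        fit = sm.OLS(y, X).fit()
        pvals = fit.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < alpha or len(pvals) == 1:
            return fit
        X = X.drop(columns=worst)

# Hypothetical usage with the Design Project items (column names are assumptions):
# dp_fit = prune_predictors(clean["dp_outcome"],
#                           clean[["q1_1", "q1_3", "q1_4", "q1_5", "q1_6", "q1_7"]])
# print(dp_fit.rsquared_adj)   # compare with the full model's adjusted R2
```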

The effect of the design project on students’ understanding can be expressed as:

Y_DP = 1.91 + 1.66(Q1) + 1.37(Q3) + 0.61(Q6)

With an unstandardized constant Beta value of β = 1.91, a unit increase in each predictor variable results in a positive increase in students’ understanding. The researchers concluded that the Design Project (DP) activities enhanced students’ understanding of the subject. When students view the design project as a crucial element of the unit (Q1), requiring strong teamwork, communication, and project management skills (Q3), and consider the project weighting reasonable with a whole week devoted to it (Q6), their understanding and learning outcomes are enhanced.
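To illustrate how the fitted model reads, the following snippet plugs hypothetical response values (not observed data) into the equation above; the resulting value sits on the composite understanding measure used as the dependent variable.

```python
# Hypothetical illustration of the fitted DP model (not observed data):
# Q1 = 5 (strongly agree), Q3 = 5, Q6 = 4 (agree)
y_dp = 1.91 + 1.66 * 5 + 1.37 * 5 + 0.61 * 4
print(round(y_dp, 2))   # 19.5

# The same student rating all three items at the neutral midpoint (3):
y_dp_neutral = 1.91 + 1.66 * 3 + 1.37 * 3 + 0.61 * 3
print(round(y_dp_neutral, 2))   # 12.83
```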

4.2 Effects of groupwork on students’ knowledge of subject and teamwork skills

The group work activities were designed to help students develop the skill of organizing and working as part of a group. The Pearson correlation revealed that all the predictors correlated strongly and significantly (Corr. > 0.5) with the dependent variable (group work develops students’ ability to defend and clearly articulate their point of view). Given the significant relationships, the regression model was used to determine the extent to which each of the predictor variables contributes to students’ ability to defend and clearly articulate their point of view.

Using GW-Q2.2 (students’ ability to defend and articulate their point of view) as the dependent variable and GW-Q2.1 (organization skills), GW-Q2.3 (collaboration skills), and GW-Q2.4 (communication skills) as predictor variables, the model summary produced an adjusted R2 of 81%, indicating good variation within the students’ responses. The ANOVA results (Table 2) showed that all three variables (GW-Q2.1, GW-Q2.3, and GW-Q2.4) significantly explain how group work enhances students’ understanding of the subject [F(3, 95) = 139.7; ρ = 0.000]. This is confirmed by the coefficients of the GW model, which showed that all three predictors significantly predict the effect of group work in enhancing student knowledge of the subject (Table 3). Hence, with an unstandardized constant Beta coefficient value of β = 0.051, the effect of group work on the students’ learning can be written as:

Y_GW = 0.05 + 0.17(Q2.1) + 0.60(Q2.3) + 0.21(Q2.4)

Thus, a unit increase in developing effective group work activities would result in an increase in students’ ability to defend and articulate their points of view well. The researchers concluded that the design of group work activities helped the students to explain, defend, and clearly articulate their points of view to their peers. Such activities help them to develop the ability to work with others.

Given the positive relationships reported in Sections 4.1 and 4.2, we conclude that the design project and group work improve students’ comprehension and appreciation of their desired learning outcomes.

4.3 Effects of online peer assessment tools on students’ participation in groupwork activities

The Pearson correlation coefficient showed that the dependent variable (students’ participation in group work) strongly and significantly correlated (Corr. > 0.5) with the predictor variables Q3.1 (peer feedback), Q3.2 (ease of use), Q3.3 (fairness), and Q3.4 (workload management). Thus, each of the predictor variables has a relationship with the dependent variable.

The analysis investigating the relationship between the peer assessment tool (SPARK) introduced to the students and its effect on student encouragement to work in groups produced an adjusted R2 of 60.2% [F(4, 94) = 30.0; ρ = 0.000] with S-Q3.5 (students’ willingness to continue using the tool) as the dependent variable and S-Q3.1, Q3.2, Q3.3, and Q3.4 as the predictor variables (Table 2). Considering the complexities associated with students working in teams and groups, the four predictors were examined to review students’ interest and whether they would continue to use the tool/software for peer assessment. The results from the four variables showed a statistically significant predictor effect of students’ expression of continuous use of the tool/software for peer assessment in team and group work [F(4, 92) = 27.43; ρ = 0.000]. However, the coefficient of the tool/software effect model shows that S-Q3.2 does not significantly predict continuous use and was therefore deleted, while Q3.3 showed statistically marginal significance (ρ = 0.052), although the two contributed to the overall significance (Table 3). Hence, the predictors that can explain the tool/software’s (SPARK) continuous usage within the context can be written as:

Y_S = 0.85 + 0.39(Q3.1) + 1.5(Q3.3) + 2.8(Q3.4)

The researchers concluded that students would continue to use the tool/software in the future when it enables them to understand how their contribution to the group work was assessed by the team, helps them to effectively manage their workloads, and makes the group work process fairer. The researchers observed that teamwork and project management were encouraged within the groups when they understood how their contribution to the group had been assessed by the rest of the group. They were therefore happy using the tool/software because it helped them effectively manage the group workload.

4.4 Effects of critical reflection on students’ understanding of topics

To understand the relationship between Critical Reflection (CR) and its effect on students’ understanding of the topics in the subject, CR-Q5.2 (students’ understanding of topics) was used as the dependent variable and CR-Q5.1 (clarity of tasks), Q5.3 (relevance to learning), Q5.4 (engagement), Q5.5 (confidence), Q5.6 (feedback), and Q5.7 (reflection) were used as the predictor variables. The model summary showed an adjusted R2 of 80.1%, indicating good variability within the model. The ANOVA results indicated that the online-CR model with all six predictor variables significantly predicts students’ understanding of the topics treated in the subject (F (6, 92) = 13.59; ρ = 0.000) (Table 2). However, the coefficient of the online-CR model indicates that while CR predictors Q5.1, Q5.3, and Q5.5 contributed significantly to the students’ understanding of the topics treated (Table 3), predictors Q5.4, Q5.6, and Q5.7 did not. Hence, the relationship effect of the model for participating in the online-CR can be written as:

Y_CR = -0.077 + 0.31(Q5.1) + 0.28(Q5.3) + 0.45(Q5.5)

The researchers therefore concluded that the online CR enhances students’ understanding of the topic further when it is clear to them what they are to do (Q5.1), when the online CR is directly relevant to what they are learning in the unit (Q5.3), and they feel confident to participate in the online CR (Q5.5). The researchers argue that helping students to develop their confidence is important in enhancing CR and students’ understanding.

4.5 Online engagement and the student learning experience

In investigating the relationship between online resources and students’ preparedness to use those resources online compared with attending face-to-face sessions, Q7.3 (students’ preference for online resources) was considered the dependent variable, while Q7.1 (ease of access), Q7.2 (satisfaction with online content), Q7.4 (coverage of topics), Q7.5 (interaction quality), Q7.6 (technical issues), Q7.7 (engagement level), and Q7.8 (flexibility) were used as predictor variables. The model summary showed an adjusted R2 of 9.3%, indicating poor variability within the model, which suggests that no definite conclusions can be drawn. However, the ANOVA results [F(7, 91) = 2.88; ρ = 0.025] suggest that all seven predictor variables together significantly affect students’ preferences for accessing resources and lectures online rather than in face-to-face sessions (Table 2).

The researchers assume that with interactive activities both online and in class, students are likely to access online lecture resources in conjunction with in-class activities. After eliminating variables that did not contribute significantly to students’ preferences, Q7.2 (satisfaction with online content) and Q7.4 (coverage of topics) were found to contribute significantly to the weak model (Table 3). The model suggests that when students are satisfied with the online optional topics and associated online lectures (Q7.2), and are not disappointed by the lack of topics covered in lectures after week 7 (Q7.4), they would prefer to access all lectures online rather than in the lecture theater. Optional topics for essays were not considered to influence this preference. Hence, although all seven predictor variables appear to affect students’ preference for accessing online resources and lectures, only the two variables relating to satisfaction with the number of topics and the online content appear to influence students’ choice.

The weak model can therefore be written as:

Y_AOR = 1.84 + 0.19(Q7.2) + 0.25(Q7.4)

The researchers inferred that students may prefer online resources if the optional topics and associated online lectures are relevant for their learning, and they are not disappointed by the lack of topics and lectures after week 7. However, the researchers were unable to make any generalizations from this model given the poor R2 value of 9.3%.

4.6 Effects of feedback on students writing and learning outcomes

In investigating the relationship between feedback on students’ essay-writing and their learning outcomes, CR-Q8.5 (improvement in learning and writing) was used as the dependent variable, with CR-Q8.1 (clarity of feedback), Q8.2 (timeliness of feedback), Q8.3 (detail of feedback), Q8.4 (relevance of feedback), Q8.6 (constructiveness of feedback), and Q8.7 (feedback on strengths and weaknesses) as the predictor variables. Results of the analysis indicated that all six predictor variables together contributed significantly to improvement in students’ learning and writing, with an adjusted R2 of 80% and ANOVA results of [F(6, 92) = 66.25; ρ = 0.000] (Table 2). However, only Q8.7 (feedback on strengths and weaknesses) individually showed a statistically significant influence. After eliminating the less significant variables, Q8.4 (relevance of feedback), Q8.6 (constructiveness of feedback), and Q8.7 (feedback on strengths and weaknesses) showed strong statistical significance in predicting improvement in student learning and writing (Table 3). Hence, the model of feedback’s effect on the improvement in students’ learning and writing can be written as:

Y_FED_LW = -0.095 + 0.18(Q8.4) + 0.23(Q8.6) + 0.59(Q8.7)

The researchers concluded that feedback on active learning in relation to essays will help students improve their writing when the feedback they receive is clear, returned within a reasonable time, and is detailed enough to help students understand the strengths and weaknesses of their essay.

Further analysis was conducted to determine whether feedback on the online Critical Reflections (CR) helped students improve their learning. This showed that the clarity of the feedback students receive (Q8.1) and the timeliness of the feedback they receive (Q8.3) together significantly predicted students’ improvement in their learning, with an adjusted R2 value of 64%, showing variation in the model, and an ANOVA result of [F(2, 97) = 87.39; ρ = 0.000]. The coefficients of the model further confirmed the statistical significance of the two variables, with significance values of ρ = 0.000 (Q8.1) and ρ = 0.042 (Q8.3). Hence, the model for the relationship between feedback on the online CR and improvement in students’ learning can be written as:

Y_FCR = 0.095 + 0.71(Q8.1) + 0.15(Q8.3)

The researchers therefore argued that feedback on students’ critical reflections will improve their learning when the feedback they receive on their reflections is clear and presented to them within a reasonable time.

5 Discussion

This study was conducted to explore the extent to which a unit’s redesign along SABLE principles helped improve students’ engagement with the learning outcomes. Our survey sought to establish whether students perceived that the design project activities enhanced their understanding in achieving the learning outcomes of the unit, and whether the group work activities enhanced their knowledge of the subject. Additionally, the researchers aimed to understand whether conducting online reflections about group work would help students appreciate the outcomes associated with this work, and finally, to assess how feedback helps students improve their learning and engagement (see Supplementary Appendix A: Table 5.1).

Overall, the changes improved student satisfaction with the unit as a whole. At the conclusion of the semester, the university-administered student survey showed improved satisfaction scores for the unit across all six categories of the survey. Significant jumps were observed in responses to the following statements: “It was clear what I was supposed to learn in this unit” rose from 2.79 to 3.14 (on a scale of 4); “The unit was well organized” from 2.87 to 3.04; and “Overall this unit was a good educational experience” from 2.73 to 3.04. One piece of written feedback summed up the improvement nicely:

“I’ve heard horror stories about this unit from friends who’ve graduated with a comms major in the last few years. I understand there have been changes made to the unit, and given I found it was quite easy to manage and engage with, I assume the changes have been very positive in improving the unit” (Student survey feedback).

What follows is a discussion of the effectiveness of each of the components of learning design that we analyzed.

5.1 Impact of design project on learning outcomes

The Multiple Linear Regression (MLR) results indicate that the design project activities enhanced students’ understanding in achieving the learning outcomes. This success was based on three critical variables:

• Relevance of the Design Project: Students’ recognition that the design project was a necessary and important part of the unit was critical. This confirms findings by Drysdale and McBeath (2018) and Kember et al. (2008) that teaching basic theory without application can demotivate students. When students see the relevance of the content, they are more motivated to engage with their learning.

• Teamwork and Communication Skills: Strong teamwork, communication, and project management skills were key to students’ understanding of the subject. This aligns with Rooij (2009), who found that project management approaches facilitate intra-team communication and positive collaborative behavior. According to Jahnke et al. (2022), students are motivated by networking, career opportunities, and learning new things, but they are most inspired by what they can achieve at the end of the project while applying project management skills.

• Perception of Weighting: The perception that the design project weighting was reasonable encouraged students to engage in teamwork activities, helping them reach the intended outcomes.

5.2 Effectiveness of group work

Although several factors explain the effectiveness and benefits of teamwork to students’ learning (Beichner, 2008), we did not anticipate that students would recognize their improved capacity to clearly defend and articulate their point of view as a significant outcome. The results revealed that developing the ability to work with others through the design project and collaborative activities helped students explain their contributions within the teams. The interactive activities helped them clearly explain their points of view, which was part of the expected learning outcomes.

Another key factor was developing the ability to defend their point of view respectfully to their colleagues. This respectful approach helps all students contribute to the successful completion of the task. These interactive activities created opportunities for students to consider and accept different points of view within the teams, thereby enhancing their learning experience. For example, one student commented in the survey:

“Came into the unit with negative opinions of previous students in mind and was pleasantly surprised how enjoyable I have found it, despite the aspect of teamwork making up a large component. I am fortunate to have worked with a great group but would likely feel differently about the unit if my group for the design project was different, i.e., not willing to contribute.”

5.3 Impact of online assessment tool (SPARK) on group work

The challenges students face while working in groups can affect their understanding of the subject, depending on their position on the spectrum of learning. Some common challenges include a disparity between individual contributions to the group work and grades awarded for the team. Using an online resource to support group work assessment and learning may require careful management of students’ perceptions of how the system supports their concerns.

Our results suggest that the key benefit of using online peer assessment software was that it helped students manage group members’ workloads. However, the software did not noticeably encourage discussion about teamwork and project management. It was also noted that acceptance and continuous use of the resource was significantly linked to how effectively the resource helped students understand how their contribution to the group was assessed by the rest of the team. Hence, the fairer the system is perceived to be, the more students feel encouraged to use it in support of their learning.

Supplementary Appendix A: Table 5.2 provides some insight into the online resource. For example, a student noted in a recommendation that it would be much better if students were only required to do SPARK at the end of the project. The student noted: “I believe as adults and university students, we all have the ability to communicate to solve problems without having to do so by reading comments given to us by anonymous people.” This suggests that more capable students may have been encumbered by the use of the online assessment tool.

5.4 Impact of critical reflection on learning outcomes

The study revealed that critical reflection (CR) enhanced students’ understanding of the subject. Key elements supporting this outcome are supported by literature, primarily the alignment of learning activities with the learning outcomes (Biggs and Tang, 2011; Guerrero-Roldán and Noguera, 2018). The results showed that what the critical reflection exercises asked the students to do was directly relevant to what they were learning in the unit, and the 20% weighting for the online critical reflection was perceived as reasonable. These factors made the students feel more confident with their learning experience, indicating the significance of critical reflection in enhancing the students’ learning experience (Race, 2010; Thompson et al., 2003).

It was also established that online engagement in the unit was enhanced by the optional topics and associated lectures that the unit coordinator posted online. The students were satisfied to have completed their teamwork activities. As some students remarked:

“I prefer the CR as a substitute for tutorial participation marks because as someone who is quite introverted, class participation makes me nervous but by having the CR I am able to gain the marks that I need” (S1).

“It forced one to delve into that week’s topic, so if you didn’t understand the topic in the lecture or tutorial, you had another chance on your own to understand the topic. It did help my learning and for those who don’t go to lectures, it does force them to study what was talked about anyway” (S3).

The effects of the CR activity on the students’ experience are summarized in Figure 1.

Figure 1. Students’ perception of the effect of the CR experience on their learning.

Supplementary Appendix A: Tables 5.3, 5.4 provide some insight into the students’ perceptions of the online critical reflection and about their understanding of the subject, engagement, and motivation.

5.5 Impact of feedback on learning outcomes

The findings on the effect of feedback on students’ learning experiences are supported by literature (Biggs and Tang, 2011; Martin et al., 2018). Feedback on the online critical reflections and essays indicated that it enhanced students’ writing, which was one of the core intended learning outcomes of developing professional writing skills for use in manual writing and other professional contexts (Course Handout, 2016). Critical to these outcomes was that the feedback students received was clear and given within a reasonable time (see Supplementary Appendix A: Table 5.5). Students perceived the feedback as detailed enough to help them understand their strengths and weaknesses. These attributes helped the students improve their learning.

5.6 Overall student experience

The results of this study indicate that the design project activities in this communication and media studies unit enhanced students’ understanding and achievement of the intended learning outcomes. The group work activities also enhanced students’ knowledge about the subject as they participated and collaborated in the design project activities. Introducing the online peer assessment resource encouraged students to work effectively in groups. Online resources and lectures further encouraged students to engage more with their learning online. The feedback given to students helped them improve their learning and writing skills. Some students remarked:

“Critical reflection allowed me to analyse the lecture or week’s topic materials in a way that would translate into a practical game sense. I was able to use the critical reflection as an opportunity to reflect on the material and how it would translate specifically into our game or how we could use mechanics discussed in our design project” (S15).

“I am not a fan of games, but this unit has nevertheless opened my eyes to a new world of interactive systems. It has definitely made an impact on my actions and how I think about how games work, and I am grateful and proud to say that I have had the chance to experience what was taught to me in this unit” (S36).

Overall, most (63.8%) of the students thought it was useful and enjoyable to study games, and that the unit was far more interesting than some of the ‘dry theoretical units’ they had taken in their first and second years. One student noted, “I think it would be more useful to write another essay and really get into the study of games more.” Some students shared their experience of having students from other disciplines playtest their games and expressed satisfaction with participating in the unit. They felt the unit was built around games and game development, with the logic behind the process of making something rendering it more meaningful and related to everyday life.

However, a small number of students (two) still felt that designing a game did not achieve the learning outcomes they expected, as they focused solely on completing the game rather than understanding the broader systems. One student remarked, “Personally, while a couple of the topics will stick in my mind, I don’t see the unit contributing to my future in any relevant way.” Another suggested that the unit might be better served as a topic within a unit. Some students failed to understand why gaming should be used as a learning approach in a third-year unit, suggesting it could be done in earlier years. One student remarked, “I would rather do this unit in first or second year as opposed to third because getting into postgraduate is incredibly important and necessary for people, and if gaming is not interesting to them at all, it adversely affects their grades.” Another student felt that the design project failed to significantly change their perspective on life, stating that game design was not worth investigating for a whole unit if students are not taking it as a career. These perceptions highlight the need to further motivate students toward understanding the value and versatility of the unit’s outcomes.

In general, the study yielded promising results that could be replicated across different disciplines. The redesign will continue to be implemented in the course and evaluated longitudinally to assess its sustainability and consistency. To ensure the continuous improvement and long-term success of the redesigned unit, strategies for reimplementation, longitudinal evaluation, continuous feedback, adaptation, and scaling will be adopted, and these processes will be documented for publication.

6 Recommendations

In response to the research questions, we recommend that the following activities be intentionally incorporated when designing courses that students find challenging:

6.1 Research question 1: design projects and group work

a) Design Projects:

• Link to Learning Outcomes: Ensure that students can clearly see the connection between project activities and learning outcomes. This helps them understand the relevance of the project.

• Enhance Skills: Structure activities to enhance teamwork, communication, and project management skills. These skills are very important for students’ professional development.

• Fair Weighting: Ensure that the weighting of learning activities is perceived as fair relative to students’ learning input. This encourages engagement and effort.

b) Group Work Activities:

• Encourage Interaction: Structure group work to encourage students to interact in ways that develop their ability to work with others. This enhances collaboration and mutual support.

• Defend Points of View: Develop activities that help students defend their points of view respectfully. This promotes critical thinking and respectful discourse.

• Consider Different Perspectives: Foster the skill of considering different points of view. This broadens students’ understanding and appreciation of diverse perspectives.

6.2 Research question 2: online assessment tools

a) Online Assessment Tools:

• Understand Peer Assessment: Ensure that online assessment tools help students understand how their contributions to group work are assessed by peers. This promotes transparency and fairness.

• Manage Workload: Assist groups in effectively managing their workload. This helps distribute tasks evenly and reduces stress.

• Fair Assessment: Ensure that the assessment process is perceived as fair. This encourages continued use and trust in the tool.

6.3 Research questions 3 and 4: optional topics and critical reflection

a) Optional Topics and Lectures:

• Clear Labeling: Clearly label and link optional topics to learning outcomes. This helps students understand the relevance of each topic.

• Link Material and Lectures: Ensure that topic material and lectures are clearly linked and related to each other. This provides a cohesive learning experience.

b) Online Critical Reflection:

• Clear Requirements: Clearly communicate what is required in the critical reflection activity. This helps students understand expectations.

• Relevant Instructions: Ensure instructions are relevant to learning outcomes and aim to increase students’ confidence. This promotes engagement and learning.

• Timely Feedback: Provide clear, concise, and timely feedback that helps students understand their strengths and weaknesses. This supports continuous improvement.

In general, our findings confirm the relevance of Biggs and Tang’s (2015) constructive alignment. Properly linking online activities to learning outcomes, and associating those outcomes with specific skills or competencies in project and group work activities, enhances student engagement with learning. A significant lesson from the course redesign was that critical reflection activities at the end of each topic enhance students’ knowledge and understanding of the subject.

6.4 Future research

Our case study suggests that active and blended innovations, with increased student participation in lectures and group discussions, enhanced students’ overall engagement with learning and improved their satisfaction with the unit. However, we also found that while students appreciated having access to online tools and resources, online engagement alone did not invariably improve the learning experience; maximizing the benefits of online learning requires consistent and understandable feedback. We believe that this being the first iteration of the redesign contributed to some implementation problems. Improvement in student engagement with the outcomes was limited by isolated issues, such as some students not receiving prompt and clear feedback and instructors needing more time and feedback to adapt teaching materials to address student motivations appropriately. These teething issues clearly affected student engagement with both the SABLE techniques and the unit’s learning outcomes.

The absence of historical data for comparison, together with policies that prevent treating some students as an experimental group and others as a control group, also posed challenges. The instrument was designed to focus on the effect of the innovation on achieving the intended learning outcomes, which limited the questions to the specifics of the unit. However, comparing ongoing student survey results has allowed some reflection on the impact of the changes over time, and the discrete nature of our research instrument means that this study could serve as a baseline for future studies.

7 Conclusion

This research explored the effect of redesigning a media and communication unit around active and blended learning activities to make students more aware of, and appreciative of, the value of the unit’s learning outcomes. Specifically, the study examined the effects of a design project, teamwork, online resources, critical reflection, and feedback. The redesign enhanced students’ understanding and achievement of the intended learning outcomes, and the results identified factors that influenced the effectiveness of the innovations in improving students’ learning experiences. The active and blended learning activities enhanced the student learning experience by engaging students more actively with the unit’s learning outcomes and, despite the challenges, generated interactivity and engagement that supported their learning.

The study included feedback from the instructors in charge of the unit, which supported the students’ responses. Instructors provided insights into the effectiveness of the redesigned unit and highlighted areas where students showed improved engagement and understanding. Additionally, the unit evaluations conducted at the end of the semester indicated increased student satisfaction across various categories, aligning with the study’s findings. These evaluations included quantitative survey data and qualitative feedback from students, which corroborated the positive impact of the redesign on student learning experiences. Details are contained in the report to the Dean (Futures). While the manuscript presents the findings in a general manner, it is important to consider the context-specific nature of educational interventions. The principles and strategies used in this study, such as active and blended learning, design projects, and online assessment tools, can be generalized to other disciplines and academic contexts; however, successful implementation in different settings may require contextual adaptation to address the unique needs and challenges of those environments. Future research should explore the applicability of these strategies in diverse academic disciplines to validate their effectiveness and identify any necessary modifications. The findings indicate that the design innovation was successful and that students’ concerns can be reviewed to support the continued use and expansion of active and blended learning activities in other units.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics statement

The studies involving humans were approved by the University of Western Australia. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study. Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.

Author contributions

IA: Conceptualization, Data curation, Formal Analysis, Methodology, Writing – original draft, Writing – review and editing. TH: Conceptualization, Resources, Validation, Writing – review and editing. DS: Conceptualization, Investigation, Resources, Validation, Writing – original draft.

Funding

The author(s) declare that no financial support was received for the research and/or publication of this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The authors declare that no Generative AI was used in the creation of this manuscript.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2025.1555923/full#supplementary-material

References

Adams, V., Burger, S., Crawford, K., and Setter, R. (2018). Can you escape? creating an escape room to facilitate active learning. J. Nurses Prof. Dev. 34, E1–E5. doi: 10.1097/nnd.0000000000000433

Alumni Survey (2018). Communication and Media Studies Alumni Tracer Experience Survey. “How has the experience and participation in the communication and media studies impacted your professional experience?”

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., and Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. Hoboken, NJ: John Wiley & Sons.

Asarta, C. J., and Schmidt, J. R. (2020). The effects of online and blended experience on outcomes in a blended learning environment. Int. High. Educ. 44:100708. doi: 10.1016/j.iheduc.2019.100708

Awidi, I. T. (2024). Comparing expert tutor evaluation of reflective essays with marking by generative artificial intelligence (AI) tool. Comput. Educ. Artif. Intell. 6:100226. doi: 10.1016/j.caeai.2024.100226

Awidi, I. T., and Paynter, M. (2019). The impact of a flipped classroom approach on student learning experience. Comput. Educ. 128, 269–283. doi: 10.1016/j.compedu.2018.09.013

Awidi, I. T., and Paynter, M. (2024). An evaluation of the impact of digital technology innovations on students’ learning: Participatory research using a student-centred approach. Technol. Knowl. Learn. 29, 65–89. doi: 10.1007/s10758-022-09619-5

Baepler, P., Walker, J. D., and Driessen, M. (2014). It’s not about seat time: Blending, flipping, and efficiency in active learning classrooms. Comp. Educ. 78, 227–236. doi: 10.1016/j.compedu.2014.06.006

Baepler, P., Walker, J. D., Brooks, D. C., Saichaie, K., and Petersen, C. I. (2016). A guide to teaching in the active learning classroom: History, research, and practice. Sterling, VA: Stylus.

Bayley, T., and Hurst, A. (2018). Teaching line balancing through active and blended learning. Dec. Sci. J. Innov. Educ. 16, 82–103. doi: 10.1111/dsji.12148

Beichner, R. (2008). The SCALE-UP Project: A student-centered active learning environment for undergraduate programs. An invited white paper for the National Academy of Sciences.

Biggs, J., and Tang, C. (2011). Teaching for quality learning at University. Maidenhead: Open University Press.

Biggs, J., and Tang, C. (2015). “Constructive alignment: An outcomes-based approach to teaching anatomy,” in Teaching anatomy, eds L. Chan and W. Pawlina (Cham: Springer), 31–38.

Boettcher, J. V., and Conrad, R.-M. (2016). The online teaching survival guide: Simple and practical pedagogical tips. San Francisco, CA: Jossey-Bass.

Brown, A., Lawrence, J., Basson, M., and Redmond, P. (2022). A conceptual framework to enhance student online learning and engagement in higher education. High. Educ. Res. Dev. 41, 284–299. doi: 10.1080/07294360.2020.1860912

Cacciamani, S., Perrucci, V., and Fujita, N. (2021). Promoting students’ collective cognitive responsibility through concurrent, embedded and transformative assessment in blended higher education courses. Technol. Knowledge Learn. 26, 1169–1194. doi: 10.1007/s10758-021-09535-0

Capone, R., and Lepore, M. (2021). From distance learning to integrated digital learning: A fuzzy cognitive analysis focused on engagement, motivation, and participation during COVID-19 pandemic. Technol. Knowl. Learn. 27, 1259–1289. doi: 10.1007/s10758-021-09571-w

Dong, A., Jong, M. S.-Y., and King, R. B. (2020). How does prior knowledge influence learning engagement? The mediating roles of cognitive load and help-seeking. Front. Psychol. 11:591203. doi: 10.3389/fpsyg.2020.591203

Drinkwater, M. J., Gannaway, D., Sheppard, K., Davis, M. J., Wegener, M. J., Bowen, W. P., et al. (2014). Managing active learning processes in large first year physics classes: The advantages of an integrated approach. Teach. Learn. Inquiry: ISSOTL J. 2, 75–90. doi: 10.2979/teachlearninqu.2.2.75

Drysdale, M. T., and McBeath, M. (2018). Motivation, self-efficacy and learning strategies of university students participating in work-integrated learning. J. Educ. Work 31, 478–488. doi: 10.1080/13639080.2018.1533240

Francis, R., and Shannon, S. J. (2013). Engaging with blended learning to improve students’ learning outcomes. Eur. J. Eng. Educ. 38, 359–369. doi: 10.1080/03043797.2013.766679

Garcia, I., Grau, F., Valls, C., Piqué, N., and Ruiz-Martín, H. (2021). The long-term effects of introducing the 5E model of instruction on students’ conceptual learning. Int. J. Sci. Educ. 43, 1441–1458. doi: 10.1080/09500693.2021.1918354

Garrison, R. D., and Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. Int. High. Educ. 7, 95–105. doi: 10.1016/j.iheduc.2004.02.001

Godlewska, A., Beyer, W., Whetstone, S., Schaefli, L., Rose, J., Talan, B., et al. (2019). Converting a large lecture class to an active blended learning class: Why, how, and what we learned. J. Geography High. Educ. 43, 96–115. doi: 10.1080/03098265.2019.1570090

Grønlien, H. K., Christoffersen, T. E., Ringstad, Ø., Andreassen, M., and Lugo, R. G. (2021). A blended learning teaching strategy strengthens the nursing students’ performance and self-reported learning outcome achievement in an anatomy, physiology and biochemistry course – A quasi-experimental study. Nurse Educ. Pract. 52:103046. doi: 10.1016/j.nepr.2021.103046

Guerrero-Roldán, A.-E., and Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. Int. High. Educ. 38, 36–46. doi: 10.1016/j.iheduc.2018.04.005

Hwang, G.-J. (2020). “E-learning and innovative education: Strategies for adding innovation and value to educational research,” in Innovating education in technology-supported environments, eds K. C. Li, E. Y. M. Tsang, and B. T. M. Wong (Berlin: Springer), 109–115.

Iglesias Pérez, M., Vidal-Puga, J., and Pino Juste, M. (2022). The role of self and peer assessment in Higher Education. Stud. High. Educ. 47, 683–692. doi: 10.1080/03075079.2020.1783526

Ivankova, N., and Wingo, N. (2018). Applying mixed methods in action research: Methodological potentials and advantages. Am. Behav. Sci. 62, 978–997. doi: 10.1177/0002764218772673

Jahnke, I., Meinke-Kroll, M., Todd, M., and Nolte, A. (2022). Exploring artifact-generated learning with digital technologies: Advancing active learning with co-design in higher education across disciplines. Technol. Knowledge Learn. 27, 335–364. doi: 10.1007/s10758-020-09473-3

Jayalath, J., and Esichaikul, V. (2022). Gamification to enhance motivation and engagement in blended elearning for technical and vocational education and training. Technol. Knowledge Learn. 27, 91–118. doi: 10.1007/s10758-020-09466-2

Jensen, L. X., Bearman, M., and Boud, D. (2021). Understanding feedback in online learning–A critical review and metaphor analysis. Comp. Educ. 173:104271. doi: 10.1016/j.compedu.2021.104271

Kang, H. Y., and Kim, H. R. (2021). Impact of blended learning on learning outcomes in the public healthcare education course: A review of flipped classroom with team-based learning. BMC Med. Educ. 21:78. doi: 10.1186/s12909-021-02508-y

Kember, D., Ho, A., and Hong, C. (2008). The importance of establishing relevance in motivating student learning. Act. Learn. High. Educ. 9, 249–263. doi: 10.1177/1469787408095849

Martin, F., Wang, C., and Sadaf, A. (2018). Student perception of helpfulness of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Int. High. Educ. 37, 52–65. doi: 10.1016/j.iheduc.2018.01.003

Meltzer, D. E., and Thornton, R. K. (2012). Resource Letter ALIP–1: Active-Learning Instruction in Physics. Am. J. Phys. 80, 478–496. doi: 10.1119/1.3678299

Müller, C., and Mildenberger, T. (2021). Facilitating flexible learning by replacing classroom time with an online learning environment: A systematic review of blended learning in higher education. Educ. Res. Rev. 34:100394. doi: 10.1016/j.edurev.2021.100394

Oakley, B., and Sejnowski, T. J. (2021). Uncommon sense teaching: Practical insights in brain science to help students learn. New York, NY: TarcherPerigee.

Olsson, J., and Granberg, C. (2019). Dynamic software, task solving with or without guidelines, and learning outcomes. Technol. Knowledge Learn. 24, 419–436. doi: 10.1007/s10758-018-9352-5

Pappas, I. O., and Giannakos, M. N. (2021). Rethinking learning design in IT education during a pandemic. Front. Educ. 6:652856. doi: 10.3389/feduc.2021.652856

Poon, J. (2013). An examination of a blended learning approach in the teaching of economics to property and construction students. Prop. Manag. 31, 39–54. doi: 10.1108/02637471311295405

Pozas, M., Löffler, P., Schnotz, W., and Kauertz, A. (2020). The effects of context-based problem-solving tasks on students’ interest and metacognitive experiences. Open Educ. Stud. 2, 112–125. doi: 10.1515/edu-2020-0118

Race, P. (2010). Making learning happen. Thousand Oaks, CA: SAGE Publication Ltd.

Rahm, A.-K., Töllner, M., Hubert, M. O., Klein, K., Wehling, C., and Sauer, T. (2021). Effects of realistic e-learning cases on students’ learning motivation during COVID-19. PLoS One 16:e0249425. doi: 10.1371/journal.pone.0249425

Rooij, S. W. V. (2009). Scaffolding project-based learning with the project management body of knowledge (PMBOK®). Comp. Educ. 52, 210–219. doi: 10.1016/j.compedu.2008.07.012

Shea, P., Richardson, J., and Swan, K. (2022). Building bridges to advance the community of inquiry framework for online learning. Educ. Psychol. 57, 148–161. doi: 10.1080/00461520.2022.2089989

Singh, J., Steele, K., and Singh, L. (2021). Combining the best of online and face-to-face learning: Hybrid and blended learning approach for COVID-19, post vaccine, & post-pandemic world. J. Educ. Technol. Syst. 50, 140–171. doi: 10.1177/00472395211047865

Smith, K. A. (1996). Cooperative learning: Making “groupwork” work. New Direct. Teach. Learn. 1996, 71–82. doi: 10.1002/tl.37219966709

Soetanto, D., and MacDonald, M. (2017). Group work and the change of obstacles over time: The influence of learning style and group composition. Act. Learn. High. Educ. 18, 99–113. doi: 10.1177/1469787417707613

Stančić, M. (2021). Peer assessment as a learning and self-assessment tool: A look inside the black box. Assess. Eval. High. Educ. 46, 852–864. doi: 10.1080/02602938.2020.1828267

Sun, Z., and Xie, K. (2020). How do students prepare in the pre-class setting of a flipped undergraduate math course? A latent profile analysis of learning behavior and the impact of achievement goals. Int. High. Educ. 46:100731. doi: 10.1016/j.iheduc.2020.100731

Thompson, S. D., Martin, L., Richards, L., and Branson, D. (2003). Assessing critical thinking and problem solving using a Web-based curriculum for students. Int. High. Educ. 6, 185–191. doi: 10.1016/S1096-7516(03)00024-1

Vygotsky, L. S., and Cole, M. (1978). Mind in society: Development of higher psychological processes. Cambridge, MA: Harvard university press.

Willey, K., and Gardner, A. (2010). Investigating the capacity of self and peer assessment activities to engage students and promote learning. Eur. J. Eng. Educ. 35, 429–443. doi: 10.1080/03043797.2010.490577

Wright, M. C., Bergom, I., and Bartholomew, T. (2019). Decreased class size, increased active learning? Intended and enacted teaching strategies in smaller classes. Act. Learn. High. Educ. 20, 51–62. doi: 10.1177/1469787417735607

Wu, T.-T., and Wu, Y.-T. (2020). Applying project-based learning and SCAMPER teaching strategies in engineering education to explore the influence of creativity on cognition, personal motivation, and personality traits. Think. Skills Creat. 35:100631. doi: 10.1016/j.tsc.2020.100631

Keywords: blended learning, online assessment, learning outcome, interactive design, group work, critical reflection, feedback

Citation: Awidi IT, Harper T and Savat D (2025) Using blended and online learning to increase appreciation of learning outcomes: case of a problematic game design unit. Front. Educ. 10:1555923. doi: 10.3389/feduc.2025.1555923

Received: 06 January 2025; Accepted: 21 May 2025;
Published: 18 August 2025.

Edited by:

Alfonso Garcia De La Vega, Autonomous University of Madrid, Spain

Reviewed by:

Enrique H. Riquelme, Temuco Catholic University, Chile
Suzana Azizan, University of Malaya, Malaysia

Copyright © 2025 Awidi, Harper and Savat. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Isaiah T. Awidi, isaiah.awidi@usq.edu.au
