ORIGINAL RESEARCH article

Front. Educ., 24 February 2026

Sec. Assessment, Testing and Applied Measurement

Volume 11 - 2026 | https://doi.org/10.3389/feduc.2026.1763058

Mathematics and science teachers’ perspectives on utilizing the Canvas LMS to enhance assessment for learning

  • Oranim Academic College, Kiryat Tiv’on, Israel


Abstract

In today’s digital learning environments, Learning Management Systems (LMS) such as Canvas play a central role in organizing assessment data, facilitating feedback, and supporting effective teaching practices. This study explored how 18 science and mathematics teachers in a U.S. public school district perceive and use Canvas to support Assessment for Learning (AfL). Data were gathered through semi-structured interviews and teacher-provided Canvas assessment materials, and analyzed using a qualitative thematic approach. Findings revealed that teachers used assessments as “snapshots” to monitor student understanding and applied blended formative–summative strategies. Auto-graded quizzes with immediate feedback helped identify learning gaps, guide instruction, and promote student reflection. Teachers viewed Canvas as a practical, reliable, and user-friendly instructional tool. The study provides insights into how LMS platforms can enhance AfL practices and underscores the importance of incorporating mathematics and science teachers’ perspectives when shaping educational policies and professional development in digital assessment.

1 Introduction

Technological advancements have transformed education in the 21st century by enabling learning anywhere and anytime and fostering personalized, active learning (Erdoğan Coşkun, 2022).

A learning management system (LMS) is a web-based technology platform used to plan, deliver, and manage educational courses, typically integrating learning materials, communication, and assessment workflows in one environment (Kamath et al., 2025). Canvas is a cloud-based LMS adopted across K–12 school districts and higher education institutions, supporting course delivery and assessment-related workflows within a structured online course space (Marachi and Quill, 2020).

Given the essential role of assessment in shaping learning experiences and outcomes (Schellekens et al., 2021), it is crucial to explore how LMS platforms such as Canvas influence assessment practices. These platforms offer tools for automated assessment, real-time feedback, and ongoing interaction management (Burrack and Thompson, 2021). Although research has explored digital assessment tools for distance learning post-COVID-19, limited attention has been given to mathematics and science teachers’ practices and perceptions of using LMS platforms for assessment (Vlachopoulos and Makri, 2024).

In mathematics, assessing proofs digitally poses distinctive challenges, as automated systems often fail to capture logical coherence, depth of reasoning, or creative argumentation (Bickerton and Sangwin, 2022). In science education, parallel difficulties might emerge when evaluating inquiry-based work, which requires nuanced human judgment that existing digital tools only partially support (Chen and Chen, 2025). While Canvas LMS offers numerous assessment tools, their pedagogical value depends on teachers’ integration and enactment.

Teachers’ interpretations and decisions determine whether LMS assessment remains merely summative or becomes a formative process that builds reasoning and metacognition in mathematics and science classrooms. Thus, understanding teachers’ perceptions is essential. Accordingly, this study examines how teachers perceive and use Canvas LMS to implement assessment for learning (AfL). It contributes to the field by examining the conceptual foundations of AfL, its practical application through Canvas, and the perceived benefits and limitations of the platform in enhancing formative assessment practices.

AfL is a pedagogical approach that actively supports learning through ongoing feedback, reflection, and student engagement. In contrast, assessment of learning (AoL) typically refers to summative assessments used to evaluate achievement at the end of a learning period and is often implemented as one-size-fits-all (Miedijensky and Tal, 2016). Although sometimes positioned as opposites, AfL and AoL can be complementary; evidence generated through AoL can be used formatively to guide subsequent teaching and learning (Lam, 2013). Accordingly, this study adopts a broad AfL definition that includes formative uses of summative assessment evidence.

1.1 Theoretical background

Assessment is a systematic process of gathering, analyzing, and reporting information to improve learning and teaching (Griffin, 2017). In this study, the theoretical background is grounded in a social constructivist view of learning, which conceptualizes knowledge as constructed through interaction, dialogue, and guided participation (Vygotsky, 1978); accordingly, assessment is understood as an integral component of instruction that can mediate students’ meaning-making.

Historically, assessments aimed to categorize performance through summative tests (Shepard, 2000), but the 2002 No Child Left Behind policy contributed to a shift toward integrating assessment with instruction in iterative cycles (Lam, 2015; Shepard, 2024). From a social constructivist perspective, classroom assessment is theorized as a learning-mediating practice: information generated through assessment is interpreted and acted upon to guide subsequent instructional decisions and students’ ongoing meaning-making, rather than functioning solely as a summative judgment (Black and Wiliam, 1998; Shepard, 2000). This theoretical lens underpins AfL, a formative approach that prioritizes supporting learning over ranking performance (Miedijensky and Tal, 2016).

AfL emphasizes timely, constructive feedback that enables both students and teachers to adjust learning and teaching strategies (Schellekens et al., 2021; Van Orman et al., 2024). Consistent with this view, AfL treats feedback and classroom dialogue as key mediational tools through which learners are scaffolded to reflect on evidence of their thinking and take next steps in learning (Black and Wiliam, 1998; Vygotsky, 1978).

AfL is especially critical in mathematics and science, where knowledge develops cumulatively. Ongoing feedback enables students and teachers to identify misconceptions early, seek support, address learning gaps, and move beyond binary “right or wrong” judgments toward deeper conceptual understanding through collaborative teacher–student interactions (Van Orman et al., 2024). In science education, AfL enables teachers to diagnose Zone of Proximal Development (ZPD)-aligned gaps and recalibrate scaffolding in real time, a process that can support students’ science achievement and conceptual change (Ash and Levitt, 2003; Shavelson et al., 2008). In mathematics education, AfL also contributes to the development of students’ mathematical identities and supports equity-oriented classroom goals (Heritage and Wylie, 2018).

To adopt AfL, schools must integrate its concepts, processes, and tools into routine teaching (Andrade and Cizek, 2010). LMS platforms are pivotal, enabling content creation and management, continuous progress tracking, and communication (Attard and Holmes, 2020). They support STEM education by integrating visual aids, gamification, and self-assessment to clarify concepts and motivate learners (Desai et al., 2023; Le and Lo, 2022), and they strengthen deliberate practice and procedural fluency through immediate, step-by-step feedback that improves real-time self-regulation and problem decomposition (Barana et al., 2021; Qushem et al., 2022; Sofroniou et al., 2025). When integrated with learning analytics, LMS platforms further enhance STEM education by enabling instructors to monitor student engagement and tailor instructional strategies based on real-time data; this data-driven approach supports personalized learning, improves retention, and fosters evidence-based teaching practices in large-scale and digital STEM environments (Li et al., 2022).

This study focuses on Canvas, a widely used LMS recognized for its user-friendly interface and interactive features (Oudat and Othman, 2024). Canvas supports digital assessment components such as auto-graded quizzes, which provide immediate feedback and allow multiple resubmissions to promote mastery of topics (Lionelle et al., 2023); online project-based learning (OPBL), which enhances metacognitive and problem-solving strategies; and peer assessment, which fosters collaboration through the platform’s peer review feature (Topping, 2017). Additionally, Canvas provides analytic tools that uncover learning habits and assessment performance patterns, enabling personalized feedback and targeted interventions (Oudat and Othman, 2024).

Teachers hold varied perceptions of LMS use. While many cite challenges such as the increased workload of creating customized content, concerns about academic dishonesty, and the need for ongoing technical support (Meccawy et al., 2021), others see LMS as an essential tool. They emphasize its effectiveness in streamlining communication with students and parents, which enhances engagement and fosters a calm, focused learning environment. Additionally, educators report that LMS use supports differentiated instruction and personalized feedback, contributing to improved learning experiences and outcomes (Attard and Holmes, 2020). Frequent and effective use of an LMS fosters meaningful interactions among students, peers, and parents, reinforcing the importance of positive teacher perceptions for sustained LMS integration in education (Koh and Kan, 2020).

1.2 Purpose and research questions

The purpose of the present study is to examine how secondary school mathematics and science teachers perceive and enact the use of the Canvas LMS to support assessment practices, particularly AfL, and to identify the opportunities and constraints they experience when integrating Canvas-based assessment components into mathematics and science instruction. Accordingly, the study addresses the following research questions:

  • How do middle school and high school mathematics and science teachers perceive the role of Canvas LMS in supporting assessment practices, particularly AfL?

  • In what ways do teachers integrate Canvas LMS assessment components into their teaching of mathematics and science?

  • What challenges and opportunities do teachers identify when using Canvas LMS for assessing students’ reasoning, proof construction, and inquiry-based learning processes?

1.3 Research environment and participants

This study investigates the implementation of the Canvas LMS within a public school district in the United States, which includes two middle schools and one high school (referred to as School A, School B, and School C). The schools were selected based on district approval, ensuring an impartial selection process without prior affiliations with the researchers. In secondary education, AfL fosters students’ autonomy, responsibility, and self-regulation skills, which are essential for future success (Charalampous and Darra, 2025). According to the National Center for Education Statistics (National Center for Education Statistics [NCES] and MN Department of Education, 2023), School A enrolled 720 students and employed 34 teachers, School B had 651 students and 38 teachers, and School C had 3,500 students and 171 teachers.

A total of 18 teachers participated in the study, evenly divided between mathematics and science, and selected through purposive sampling (Palinkas et al., 2015). The sample included four science teachers from School A; eight teachers from School B (four mathematics and four science); and six teachers from School C (five mathematics and one science). Recruitment was facilitated by school principals, who shared the research invitation with their faculty. The only eligibility criterion was a minimum of 1 year of experience using the Canvas LMS. Participants had an average of 18 years of teaching experience (SD = 8.7) and an average of 6.3 years of experience using Canvas (SD = 2.3). To protect participants’ anonymity, a coding scheme was used (e.g., IS_XX_DD.M_A), where IS/IM indicates an interview with a science or mathematics teacher, XX represents participant initials, DD.M denotes the interview date, and A/B/C identifies the school.

2 Materials and methods

This study employed a qualitative case study design. The case is bounded by a single public school district and its district-wide implementation of Canvas, with Schools A–C serving as embedded units of analysis. This design was deemed appropriate because the research questions examine how teachers perceive and use the Canvas LMS to implement the AfL approach within a specific school district. Because teachers’ AfL enactment through an LMS is highly context-dependent and shaped by local routines, constraints, and tool configurations, a case study design enables analysis of these practices within their natural setting rather than in isolation.

Data were collected primarily through semi-structured interviews, supplemented by participant-provided documents from Canvas as supporting evidence. The inclusion of Canvas artifacts supports triangulation of interview accounts and provides concrete illustrations of how assessment components are configured and enacted in practice (Creswell and Poth, 2018; Yin, 2018).

The interview protocol (see Appendix 1) was structured to cover several domains: general background information (e.g., teaching seniority and experience with Canvas); teachers’ perceptions of assessment and school policy; specific practices related to the use of Canvas for assessment; the integration of instructional activities through Canvas; and user-experience aspects within mathematics and science (e.g., perceived ease of use and subject-specific advantages or disadvantages). The protocol was designed to elicit both descriptive and reflective information about teachers’ practices and attitudes toward LMS-based formative assessment.

Each interview was conducted with the Canvas LMS open, allowing teachers to show concrete examples of their assessments (e.g., a student’s learning journal, in-platform feedback, or a two-way message exchange). These documents were used for triangulation, thereby cross-validating claims from interviews with actual examples of Canvas use and strengthening the case study’s trustworthiness (Creswell and Poth, 2018; Morgan, 2024). Each document was coded using a designated scheme (e.g., VM_XX_EQ1_C). In this scheme, each element represents specific information: the source type and subject (DM: Document-Math, DS: Document-Science, VM: Video-Math, VS: Video-Science), the participant’s initials, the assessment component (e.g., EQ: electronic quiz, OPBL, SA: self-assessment, GAM: game), and the school code (A/B/C).
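To make the document coding scheme above concrete, the following minimal sketch parses an artifact code into its four parts. The function name and output structure are illustrative, not part of the authors’ instrument; only the scheme itself (source type and subject, participant initials, assessment component, school code) comes from the article.

```python
# Minimal parser for the document coding scheme described above.
# The scheme (from the article): source type and subject, participant
# initials, assessment component, and school code, joined by underscores.
# Function name and return structure are illustrative assumptions.

SOURCE_TYPES = {
    "DM": "Document-Math",
    "DS": "Document-Science",
    "VM": "Video-Math",
    "VS": "Video-Science",
}

def parse_artifact_code(code):
    """Split a document code such as 'VM_JG_EQ1_C' into its four parts."""
    source, initials, component, school = code.split("_")
    return {
        "source": SOURCE_TYPES.get(source, source),  # fall back to the raw tag
        "initials": initials,
        "component": component,  # e.g., EQ = electronic quiz, SA = self-assessment
        "school": school,        # A, B, or C
    }
```

For example, `parse_artifact_code("DS_DM_GAM1_A")` resolves to a Document-Science gamified-assessment artifact from School A.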

The data were analyzed using an inductive thematic analysis approach, which allows for the identification, analysis, and reporting of patterns (themes) within the data without the constraints of a pre-existing coding frame or the researcher’s prior theoretical commitments (Braun and Clarke, 2006, 2021). Initially, the researchers immersed themselves in the data to achieve familiarity, followed by the generation of initial codes. These codes were subsequently collated into potential themes, which were then reviewed and refined to ensure they accurately reflected the entire data set. This rigorous process gave rise to two primary themes: (1) mathematics and science teachers’ perceptions of Assessment for Learning (AfL), and (2) mathematics and science teachers’ perceptions of utilizing the Canvas platform for assessment. Additionally, to contextualize and interpret teachers’ accounts of using Canvas for assessment, we drew on Davis’s (1985) conceptualization of usefulness as the extent to which a system enhances job performance and ease of use as the extent to which a system reduces physical and mental effort.

The data analysis was conducted at two levels. At the macro level, each theme mentioned above was divided into categories. The percentage of statements related to each category was calculated based on the total number of statements within that theme (Theme 1: n = 157; Theme 2: n = 260). At the micro level, each category was further broken down into subcategories. To assess the level of agreement among participants, the frequency with which teachers mentioned each subcategory was calculated relative to the total number of participants. Triangulation was employed to ensure the reliability of the findings (Creswell and Poth, 2018; Noble and Heale, 2019). To establish credibility, two researchers independently coded the categories, compared findings, and discussed discrepancies until mutual agreement was reached.
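The macro-level calculation can be sketched as follows. The raw counts are hypothetical (the article reports theme totals and percentage rates, not per-category counts); they are chosen only so the computed rates reproduce the Theme 1 rates reported in the findings.

```python
# Sketch of the macro-level analysis: percentage of statements per
# category within a theme. Counts are HYPOTHETICAL illustrations
# (the article reports rates, not raw counts), chosen to sum to the
# reported Theme 1 total of 157 statements.

theme1_counts = {
    "Assessment and its usage": 40,
    "Assessment methods and routine of implementation": 58,
    "Advantages of AfL": 25,
    "Disadvantages of AfL": 2,
    "District guidelines on assessment": 32,
}

total = sum(theme1_counts.values())  # theme total: 157 statements
rates = {category: round(100 * n / total)  # percentage of theme total
         for category, n in theme1_counts.items()}
```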

3 Findings

The analysis yielded two main themes, each comprising five categories. Theme 1, Mathematics and Science Teachers’ Perception of AfL, includes: Assessment and its usage; Assessment Methods and Routine of Implementation; Advantages of AfL; Disadvantages of AfL; and District Guidelines on Assessment. Theme 2, Mathematics and Science Teachers’ Perception of Utilizing the Canvas Platform for Assessing Students, includes: Assessment Components that are in Use on Canvas; Facilitating Interaction through Canvas; Users’ experiences with Canvas for conducting assessments; Advantages of Using Canvas for Assessments; and Disadvantages of Using Canvas for Assessments. The sections that follow present these themes and categories in order, supported by illustrative excerpts from participants’ accounts.

3.1 Mathematics and science teachers’ perception of AfL

Five primary categories were delineated from 157 statements about teachers’ perceptions of AfL. Figure 1 illustrates these main categories along with their reference rates.

FIGURE 1

Horizontal bar chart showing reference rates for five assessment categories across 157 statements: Assessment and its usage at 25 percent, Assessment methods and routine of implementation at 37 percent, Advantages of AfL at 16 percent, Disadvantages of AfL at 1 percent, and District guidelines on assessment at 20 percent.

Categorization of teachers’ perceptions of AfL.

A notable observation from Figure 1 is the disparity between the percentage of references indicating the benefits of AfL and those highlighting its disadvantages. This suggests that most participants acknowledge this assessment method’s significance and advocate for its adoption.

3.1.1 Assessment and its usage

Many teachers emphasized that assessment serves to check students’ understanding at a particular moment or on a specific topic. Based on these recurring descriptions, we conceptualized this pattern as a “snapshot” approach to assessment—an analytic label developed by the researchers to capture the momentary, formative character that teachers attributed to assessment. This shared perception was accompanied by two complementary views of assessment: one as a tool for students to self-reflect on their understanding and another as a means for teachers to reflect on their instruction and on students’ comprehension (see Table 1).

TABLE 1

Sub-category: Snapshots—assessment means a continuous checkpoint of understanding
Typical quote: “Assessment, to me, means looking at where students are at in that particular moment. A way to check in on students’ understanding of a certain content area, in a snapshot in time, like right now, this is what they know.” (IS_SB_14.4.23_A)
Frequency of references: N = 17

Sub-category: Assessment is an opportunity for teachers to reflect on their instruction
Typical quote: “Formative assessment helps teachers gauge what needs to be covered more in class, what students know or do not know… If students struggle individually, provide them with extra help. If they already know all the information, provide them with an extension. Ultimately, it guides your teaching.” (IS_MS_8.5.23_A)
Frequency of references: N = 11

Sub-category: Assessment is an opportunity for students to reflect on their learning
Typical quote: “Students do very well with opportunities to measure their understanding; And so, I use quizzes as formative assessment, so they can learn and improve their understanding.” (IS_AL_30.5.23_C)
Frequency of references: N = 10

Sub-categories found under assessment and its usage.

The high agreement among participants across the three schools on “Assessment and its usage” implies a perception of AfL ingrained at the systemic level. More than half of the teachers used terms such as “exit tickets,” “exit slips,” and “warm-ups” to describe practices for conducting “snapshot” assessments.

3.1.2 Assessment methods and routine of implementation

Phrases, routines, and methods related to assessment were frequently repeated during the data analysis. Table 2 displays the findings that emerged.

TABLE 2

Sub-category: Assessment happens all the time
Typical quote: “I assess all the time. While they’re learning, I see who’s doing what… I think it’s constant. You could assess by looking at someone’s face while teaching… So, it’s ongoing, all the time.” (IS_MR_18.4.23_B)
Frequency of references: N = 7

Sub-category: There are many different ways to assess
Typical quote: “I use formative assessment in multiple ways. Usually, I include one formative Canvas quiz for each science unit. But we’re also constantly formative assessing through class discussions, through a piece of paper that has one question on it, through assignments.” (IS_MS_8.5.23_A)
Frequency of references: N = 12

Sub-category: A hybrid assessment routine of formative and summative
Typical quote: “I think it’s a combination of formative and summative, for sure. Students tend to focus more on summative because that counts toward their grades. However, teachers are constantly using formative assessments and checking in to see where they are at.” (IS_SB_14.4.23_A)
Frequency of references: N = 17

Sub-category: Students are entitled to multiple attempts to succeed
Typical quote: “Students get multiple attempts if they don’t like their score. So, they can go back and retake it and then keep their highest score. It just goes into the formative category.” (IM_AC_7.6.23_C)
Frequency of references: N = 15

Sub-category: Team assessment
Typical quote: “Group work is where they individually look at the problem, think about it, and write down their thoughts. Then, they pair up with their neighbor and then share with their team. So, they’re mainly collaborating as a team to work.” (IM_JG_9.6.23_C)
Frequency of references: N = 7

Sub-categories found under assessment methods and routine of implementation.

The frequent use of phrases such as “all the time” and “many different ways,” in addition to the common perception that students are entitled to multiple attempts to succeed after receiving feedback and addressing gaps in their knowledge, suggests a potential integration of AfL principles into organizational discourse and practical implementation.

3.1.3 Advantages of AfL

Over half of the participants discussed their AfL strategies and their positive effects on student motivation, engagement, and learning. Table 3 outlines two main advantages observed by the teachers.

TABLE 3

Sub-category: AfL assessments cultivate students’ motivation and engagement
Typical quote: “…It’s engaging for students as they receive immediate feedback, which they appreciate. It’s also more motivating since they can instantly see if they got it right or wrong. Even if they don’t see detailed results right away, simply getting their score makes them eager to understand their mistakes.” (IM_AC_7.6.23_C)
Frequency of references: N = 9

Sub-category: Personal feedback for learning enhancement
Typical quote: “One of the coolest things about teaching is having these individual conversations, right? Give student feedback such as, ‘I loved reading this thing,’ …so we don’t necessarily verbally communicate in the classroom, but we are exchanging these notes on Canvas, and that’s super fun.” (IS_AL_30.5.23_C)
Typical quote: “The formative assessment has always been around. But now, it’s used in another way, instead of just guiding the teacher’s instruction as a whole, it helps you single out students that you could pull in and have for an extra period of time.” (IS_MS_8.5.23_A)
Frequency of references: N = 10

Sub-categories found under the advantages of AfL.

These advantages suggest that teachers continually review and adjust their assessment and feedback methods to help students learn better. One teacher in an extended mathematics class designed a whole new assessment based on her observations of students’ struggles, as shown in the following quote:

  • I’ve found that some students have traditionally struggled with mathematics. Taking a 50-point test in 1 day can be overwhelming, especially with low self-esteem. So, I focused on the bare minimum essential skills they need to master independently. I reduced my summative assessment to about half the questions and turned the rest into classwork, adding a few extension questions. What’s been interesting is seeing students take the lead in their team, explaining concepts to others. Yet, in a testing situation, they struggle to think through problems silently and write them down. The test was still graded individually, but I allowed them to use each other as a resource for part of it. This approach helped build confidence, and some of those problems prepared them to do better the next day (IM_SL_9.6.23_C).

The findings highlight that AfL enhances student motivation and engagement and allows teachers to continuously refine their assessment strategies, ensuring a more personalized and effective learning experience.

3.1.4 Disadvantages of AfL

In contrast to the numerous advantages of AfL noted earlier, only two participants identified a disadvantage. They highlighted the difficulty of managing assessment data in a self-paced learning environment. One teacher explained:

  • What was hard about it [AfL] this year is … that kids do different tasks everywhere and turn in stuff daily. So, it was hard to keep track of everybody and what they’re doing. Sometimes, they get lost, and you do not see them. They can look busy, but they are doing nothing. (IS_KB_17.4.23_B)

This challenge underscores the complexity of implementing AfL principles, such as tracking progress and ensuring student engagement, particularly when students are given autonomy through self-paced learning, task choice, and independent work.

3.1.5 District guidelines on assessment

To gauge the commitment to AfL at the systemic level, teachers were queried about any guidelines or standards provided by the district. Table 4 presents the four subcategories identified by the teachers.

TABLE 4

Sub-category: Hard on training, easy on the battle
Typical quote: “They want to ensure we are giving kids lots of formative opportunities before we get to the summative, I would say that’s the biggest guideline.” (IM_KG_13.6.23_C)
Frequency of references: N = 7

Sub-category: A structured grading policy that emphasizes summative assessments
Typical quote: “We are encouraged to label things correctly, summative, formative, and homework… If stuff is formative, it really should be a 0%. If we are grading homework and other activities, our building is at most 20%.” (IM_AC_7.6.23_C)
Frequency of references: N = 9

Sub-category: No penalty for late submission
Typical quote: “They’re not really penalized for doing homework late. I got an email last night: ‘I’m struggling with yesterday’s homework, I don’t really get it.’ And even though the homework was due last night, I’d rather have them do it right than do it poorly on time.” (IM_KG_11.4.23_B)
Frequency of references: N = 5

Sub-category: Autonomy and flexibility in assessment practices
Typical quote: “They [The district] say what should be and shouldn’t be graded, but they don’t ask that we do all our assessments on Canvas. It’s pretty open to us in terms of how we want to assess our students.” (IS_DM_4.4.23_A)
Frequency of references: N = 10

Sub-categories found under district guidelines on assessment.

The findings indicate a broad understanding and adoption of AfL concepts, accompanied by a clear expectation that teachers will adhere to them. However, a tension within the assessment policy emerges, as the heavier weighting of summative assessments leads students to perceive them as more “important” than formative tasks. Three teachers noted that this perception, together with the absence of penalties for late submissions (as described in the third subcategory), contributes to an increase in late and missing assignments.

3.2 Mathematics and science teachers’ perception of utilizing the Canvas platform for assessing students

Five primary categories were delineated out of 260 statements dealing with teachers’ perceptions of utilizing the Canvas platform for AfL. Figure 2 illustrates these main categories along with their reference rates.

FIGURE 2

Horizontal bar chart showing five categories related to Canvas assessments across 260 statements, with reference rates as follows: Advantages in using Canvas for assessments 31 percent, user experiences 25 percent, assessment components 20 percent, facilitating interaction 12 percent, and disadvantages 12 percent.

Categorization of teachers’ perceptions on utilizing the Canvas platform for assessing students.

Figure 2 highlights a significant disparity in participants’ responses, with more than twice as many references to the advantages of using Canvas for assessment (31%) compared to its disadvantages (12%). This suggests that teachers view Canvas as a valuable tool for learning.

3.2.1 Assessment components that are in use on Canvas

Teachers described multiple Canvas-based components that support AfL. Some rely on native tools (e.g., quizzes, modules, Canvas Studio), while others integrate external tools and upload or embed the resulting artifacts. The components mentioned by participants are summarized in Figures 3–6, and Table 5 provides representative examples corresponding to these sub-categories.

FIGURE 3

Screenshot of an online math quiz displaying question two, which provides two algebraic equations with variables y, x and six corresponding answer boxes labeled a through f, each prompting users to type their answers.

Example of an auto-graded quiz (document: VM_JG_EQ1_C).

FIGURE 4

Screenshot of a Science Quest Information Page for Natural Selection, featuring instructions to download a packet and guide, with a stylized banner below depicting a dragon, birds, and the text “Natural Selection Quest.”

Example of gamified assessment (document: DS_DM_GAM1_A).

FIGURE 5

Homework check worksheet with columns labeled 1.1.1, 1.1.3, 1.1.5 plus 1.2.1, 1.2.2, 1.2.3 plus 1.2.4, 1.2.6, 1.3.1 plus 1.3.2, and Chapter 1 review. Each section features checkboxes for “All problems complete,” “Work shown for all problems,” and a score out of three, except the Chapter 1 review, which includes “Stars attached,” “Showed sufficient proof of study,” and a score box. Instructions specify scoring based on a 3-2-1-0 rubric for accuracy and completion.

Example of self-assessment (document: DM_JG_SA1_C).

FIGURE 6

Screenshot showing a prototype improvement activity. On the left, typed text explains changing from a plastic to a metal bottle, removing a balloon to reduce heat absorption, and using a manufactured bottle cap to improve efficiency. On the right, two labeled hand-drawn diagrams depict prototype designs. The first sketch shows a bottle with a balloon labeled “chemical reaction.” The second shows a bottle with a bottle cap and aluminum foil labeled “chemical compound.”

Example of students’ OPBL product (document: DS_DM_OPBL2_A).

TABLE 5

Sub-category: Auto-graded quizzes
Typical quote: “We keep it pretty simple. We give the exit slips, which are numeric response, multiple choice, true-false, matching.” (IM_AC_7.6.23_C) See Figure 3 for an example of an auto-graded quiz.
Frequency of references: N = 18

Sub-category: Innovative and gamified assessment methods
Typical quote: “We call it our science quest. He (one of the teachers) put it together like the role-playing game Dungeons and Dragons. The way he set this up, you always have to start at level one, and then as you make your way through everything, you mark it as done, you do the quiz that goes along with it, and then you move on.” (IS_EG_4.4.23_A) Figure 4 is an example of a gamified assessment.
Frequency of references: N = 8

Sub-category: Self-assessment
Typical quote: “I just want to see their self-reflection on how they feel about the test topics before they take it. And so, that’s worth one point.” (IM_JG_9.6.23_C) Figure 5 is an example of a self-assessment.
Frequency of references: N = 6

Sub-category: Facilitating and assessing students’ learning in science and mathematics
Typical quote (science): “What we really liked was the built-in rubric on Canvas. Students created photosynthesis models using materials or animations on their iPads. When they submitted their work, we could grade it directly with the rubric, which we found very useful.” (IS_EJ_2.5.23_B)
Frequency of references: N = 6
Typical quote (mathematics): “I had students use Canvas modules to start a math exploration through a quiz-like activity. They chose one of three topics, logarithms, matrices, or summations, allowing them to extend their learning. Each option included videos and simple practice exercises.” (IM_KG_11.4.23_B)
Frequency of references: N = 4

Sub-category: Diverse ways in which teachers leverage technology, particularly Canvas, to facilitate various forms of assessment
Typical quote: “I have several Canvas Studio videos, including a 7-min one with embedded questions. The quiz pauses to ask questions, ensuring engagement. It’s not meant to be challenging—if students watch, they can answer easily in real-time. It’s sitting alive while watching.” (IS_AL_30.5.23_C) Figure 6 is an example of a student’s OPBL product.
Frequency of references: N = 9

Sub-categories found under assessment components that are in use on Canvas.

3.2.1.1 Auto-graded quizzes

Across subjects, auto-graded quizzes were the most frequently used component for frequent, low-stakes formative checks. Teachers emphasized efficiency gains from features such as question banks, automatic scoring, and rapid item recycling, which facilitate timely feedback and data-informed adjustments to instruction. At the same time, participants noted that fixed-response formats can underrepresent students’ reasoning. To mitigate this limitation, teachers reported requiring students to upload photos of their working process, embedding prompts that surface misconceptions, and including “find-the-error” items within otherwise auto-graded quizzes. As one teacher explained: “Here is something else that we like. Since they cannot show their work, we often put a problem in the auto-graded quizzes with the work done and errors somewhere. Yeah, they need to find the error” (IM_AC_7.6.23_C).

Figure 3 exemplifies this category by showing the structure and affordances of an auto-graded quiz used for formative assessment.

3.2.1.2 Innovative and gamified assessment

Several teachers described narrative, level-based “quests” that interleave practice with mastery checks to sustain engagement and scaffold students’ progression. These designs operationalize AfL cycles (practice → feedback → retry) and make learning trajectories more visible to learners. Figure 4 illustrates a gamified sequence aligned with this approach, where students advance through levels, mark tasks as complete, and encounter embedded checks for understanding.

3.2.1.3 Self-assessment

Teachers implemented brief reflective check-ins to prime metacognitive monitoring. These lightweight prompts (e.g., a one-point pre-test reflection on topic readiness) help students articulate perceived strengths and gaps and enable teachers to tailor feedback and support. Figure 5 depicts a representative self-assessment artifact captured within Canvas.

3.2.1.4 Artifacts for assessing science and mathematics learning; diverse technology-supported assessments

Beyond auto-graded items, teachers gathered richer evidence of student understanding through products and performances submitted in Canvas. In science, teachers emphasized rubric-supported evaluation of models, investigations, and media-based submissions. In mathematics, they described module-based explorations supplemented by targeted practice and artifacts that capture students’ processes (e.g., uploads showing their work). Teachers also referenced Canvas Studio videos with embedded questions that provide just-in-time checks of comprehension. Figure 6 presents a student OPBL product exemplar, illustrating how such artifacts operate within formative cycles to document learning and guide feedback.

3.2.2 Facilitating interaction through Canvas

Providing feedback and fostering ongoing teacher–student interaction are central to AfL. Table 6 illustrates how teachers use Canvas to support these practices.

TABLE 6

Sub-category | Typical quote | Frequency of references

Two-way feedback
“Some students who are more afraid to raise their hand or throw out things in the classroom… are more likely to write me an email.” (IM_JR_18.4.23_B)
N = 13

Use of a rubric for assessment and feedback
“And then they upload that as a PDF, and I grade it using a rubric. So, they get feedback. And I write comments off on the side over here.” (IS_AL_30.5.23_C)
N = 7

Use of feedback comments
“So, I will use the comment… for the great part, where if they made up consistent errors, I would say: ‘Hey, come see me, let’s look at this.’” (IM_JR_18.4.23_B)
N = 12

Sub-categories found under facilitating interaction through Canvas.

The data indicates that students and teachers value two-way interaction, with Canvas and its email function serving as key tools for communication. In addition, many teachers emphasized the importance of personal interactions with students, particularly during lessons or during designated times referred to as “my time/our time.” This suggests that although digital tools are valuable, they do not replace in-person interaction; rather, they often prompt or facilitate it, as illustrated in the quote by IM_JR_18.4.23_B presented in Table 6.

3.2.3 The experiences of users with Canvas for conducting assessments

To support the interpretation of users’ experiences with Canvas, we drew on Davis’s (1985) conceptualization of ease of use and usefulness, as detailed in the Method section. These terms served as an interpretive lens for organizing the findings, together with teachers’ reflections on the district’s supporting resources, as presented in Table 7.

TABLE 7

Sub-category | Typical quote | Frequency of references

Usefulness and ease of use of Canvas
“I think it made teaching and specifically assessing a lot more convenient because there may have been opportunities (in the past) where we decided not to assess just because of… You know, the idea of grading 120 things in a short amount of time. And now we just put it on Canvas, they take it, they get their score right away, and we get that feedback as well, and then we can move on quite quickly.” (IS_DM_4.4.23_A)
“I think Canvas makes it easier for teachers. I created these quizzes 3 years ago and only tweak them occasionally. With summer teaching starting right after school ended, I feel less stressed knowing everything is ready.” (IM_JG_9.6.23_C)
“But there are a lot of things I’m doing now that I wasn’t doing 3 years ago, even… and it is intuitive. Once you get the first couple of things down, it’s pretty user-friendly.” (IS_SB_14.4.23_A)
N = 16

Difficulties in using Canvas
“I think it creates a lot more work. When it comes to assessment, anything that has multiple choices is easy enough once it’s built. But again, I don’t think multiple choices are always the best way to assess student learning.” (IS_MR_18.4.23_B)
“But sometimes Canvas gets to be a little more difficult because there’s so many, like pictures and stuff that we’d want on a quiz that sometimes we just move back to the paper.” (IM_JJ_13.4.23_B)
N = 15

Ongoing technical support and guidance on Canvas enable a positive experience
“We have a great tech person who will come and help whenever we need it. If you send her an email, she’ll say, ‘What do you need help with? I’ll come and help you.’ She also teaches classes and trains people on how to do extra stuff, so that makes it easy.” (IS_KB_17.4.23_B)
N = 7

Sub-categories found under the experiences of users with Canvas for conducting assessments.

To better understand the first subcategory presented in Table 7, a further analysis was conducted to identify the key factors underlying the perceptions of the 16 teachers who found Canvas effective and easy to use. The findings show that 13 teachers emphasized benefits related to facilitating grading and feedback, 12 teachers mentioned the ability to reuse and improve instructional materials, and eight teachers noted a reduction in workload and time demands. In addition, 11 teachers explicitly mentioned the platform’s convenience and ease of use. However, the second subcategory in Table 7 shows that a similar number of teachers also reported negative experiences. Specifically, eight teachers viewed Canvas as ineffective for assessment purposes, and 12 teachers described it as difficult to use. For eight teachers, these experiences were mixed: they described both positive and negative aspects of the platform, indicating that while certain features felt intuitive, others were perceived as complex or challenging. This pattern reflects the dynamic described by participant IS_MR_18.4.23_B in Table 7.

Several factors appear to influence teachers’ experiences with Canvas. The availability of support, mentioned by seven teachers in the third subcategory of Table 7, played an important role in shaping their perceptions. Teachers’ self-confidence with technology also contributed to differences in experience. Many described themselves as either “old-school” or “tech-savvy.” Those who identified as tech-savvy tended to find Canvas more user-friendly, while teachers who preferred traditional methods reported fewer positive experiences. As one teacher explained, “I really like kind of old school, I suppose. I like to see the justification on paper” (IM_JJ_13.4.23_B). Another factor shaping teachers’ experiences was the impact of the COVID-19 pandemic, which required a sudden shift to remote learning a year before the study. Ten teachers described the pandemic as a turning point in their use of educational technology. Among them, two teachers reported reducing their use of Canvas afterward due to concerns about student technology burnout, while eight teachers said that the experience improved their digital skills and increased their reliance on platforms such as Canvas. As one teacher shared, “I have increasingly turned to Canvas for quizzes, particularly because of COVID” (IM_KG_11.4.23_B).

Taken together, these findings illustrate how teachers’ perceptions of Canvas, as explained through Davis’s (1985) definitions of usefulness and ease of use, are shaped not only by the platform’s features but also by contextual factors including support, technological confidence, and the lasting influence of the pandemic.

3.2.4 Advantages of using Canvas for assessments

During the data analysis, repeated patterns emerged in how teachers described Canvas as a tool contributing to their teaching processes and students’ learning. Table 8 outlines the main benefits identified.

TABLE 8

Sub-category | Typical quote | Frequency of references

Enhancing mathematics and science learning processes while utilizing Canvas-based assessment
Science: “In anatomy, my colleague and I have found that these quizzes are so important to check student understanding that we have decided to make them worth points in the overall gradebook, and students can take them as many times as they want. So, they can end up with 100% in that category. In zoology, there are many memorizations of terms, so it works great.” (IS_AL_30.5.23_C)
N = 5
Mathematics: “A while ago, before we had question banks built (on Canvas), kids would be like, how do I solve this problem? And when we switched it to a question bank, it became, how do I solve this type of problem? And so instead of chasing that one answer, they were looking more for that.” (IM_KG_11.4.23_B)
N = 7

Immediate feedback
“I think Canvas quizzes is super helpful because you can see what areas students didn’t understand. And then you can go back and review those areas further, especially on the formative side of things, where you’re still teaching. It directs your teaching a lot because you still have time before that final test, and so, for those that score low overall, I will specifically invite those students personally into the My-time session.” (IS_MS_8.5.23_A)
N = 15

“One-stop”
“I find it useful because we moved everything to Canvas. And so, it’s like our central place. And it’s so helpful, especially when a student is absent. I can just tell that student everything that we do is on Canvas. And so, every homework assignment, every classwork, everything, there’s a link on Canvas. And so, they just go on Canvas, click the link, and that’s what we do.” (IM_TE_19.6.23_C)
N = 15

Facilitating teamwork and shared resources
“One nice thing is that it takes a lot of upfront work to create, but once you have them, it works pretty well. Me and my colleague who teaches anatomy felt very comfortable with being able to take on extra teaching assignment because we had so much already in place.” (IS_AL_30.5.23_C)
N = 11

Enabling frequent, tailored, and data-driven assessment
“And then I put the formula in there. And then, when they retake it, they retake the same thing. It just generates a new number for them. So, you don’t have to create a new quiz for them only. So, this is my favorite quiz because it’s just totally automatic.” (IM_JG_9.6.23_C)
“So, with a quiz, I can tie questions to learning targets, so when it auto-grades the quizzes, it also automatically pulls information so that I can kind of review the students’ progress in each learning target, so instead of tracking their grades, it’s tracking this topic.” (IM_JG_9.6.23_C)
N = 16

Sub-categories found under advantages in using Canvas for assessments.

The high frequency with which teachers emphasized these benefits suggests that Canvas is well-suited for implementing AfL, particularly in mathematics and science, where frequent practice and the internalization of abstract concepts are essential. Moreover, the final quote in the table, from IM_JG_9.6.23_C, illustrates how Canvas-based assessment enables quick and accurate identification of students’ progress toward their learning targets. This, in turn, allows both teachers and students to focus their efforts more strategically, increasing the likelihood of achieving learning goals.

3.2.5 Disadvantages of using Canvas for assessments

Alongside the advantages discussed earlier, using technology, especially Canvas, for assessment presents several challenges and difficulties, as shown in Table 9.

TABLE 9

Sub-category | Typical quote | Frequency of references

There is no substitute for pen and paper
Science: “There are a lot of things that you just need paper and be able to write and draw it out, or look at this data table and then write stuff down. So those are the times when it can be a disadvantage, when you sometimes need the hardcopy.” (IS_SB_14.4.23_A)
56% (N = 5)
Mathematics: “The problem with that was the notation. When you’re trying to write a summation sign or a statistical formula, it’s very complicated. Now, when you look, there are some little typo errors or things that are hard to read, so we really should improve on that. But again, all those things take time.” (IM_AC_7.6.23_C)
“The only downside of using Canvas is that I don’t get to see the students’ work. It’s good for quick, immediate feedback, but sometimes it emphasizes the misconception that the whole goal is to get to the answer, whereas math is not about the answer but the journey.” (IM_JG_9.6.23_C)
100% (N = 9)

Challenges in preventing cheating on Canvas assessments
“I get a little nervous about cheating. I mean, you’re looking at somebody else’s screen, and if it’s just a multiple-choice, that can be a thing. So, there is a little bit of a security thing that I don’t like as much.” (IM_JJ_13.4.23_B)
33% (N = 6)

Students’ focus is disrupted when utilizing digital devices
“iPads have been a positive and a negative all at the same time. Every student has a device…we never do any formal training on how you control your gaming, your desire to do things, or your off-task behaviors. Yeah, I watch a game whenever I want to, and I will do that because I’ve got the technology in my head.” (IS_MR_18.4.23_B)
22% (N = 4)

Sub-categories found under disadvantages in using Canvas for assessments.

According to the data in Table 9, assessing mathematics through Canvas is perceived as more challenging than assessing science. Science teachers expressed concern that digital assessments limit students’ ability to process ideas and demonstrate reasoning through physical sketching or writing, whereas mathematics teachers highlighted technical constraints, such as difficulties entering mathematical notation. Teachers from both subjects emphasized that auto-graded exams, which assess only the final “right or wrong” answer, may cause students to focus on the endpoint rather than on their reasoning processes. Some teachers also prefer to teach using pen and paper, reflecting their pedagogical beliefs, as illustrated in the following comment: “But then also there are some kids that probably would do better if it was a piece of paper and a folder, you know, just that physical movement of things” (IS_KB_17.4.23_B). Teachers further expressed concern about their ability to control cheating when digital devices are involved and described strategies to mitigate this, such as using a lockdown browser to restrict web access during assessments or generating unique question sets through random formulas. Together, these insights highlight both the constraints teachers face when assessing mathematics and science digitally and the adaptive measures they employ to maintain assessment quality and integrity.

4 Discussion

This study explored how science and mathematics teachers utilize the Canvas LMS to implement AfL strategies. By examining teachers’ perceptions and practices, the research sought to identify practical applications that can enhance educational technology assessment approaches, ultimately benefiting students’ learning experiences and outcomes.

4.1 Mathematics and science teachers’ perceptions of AfL

The findings reveal that teachers possess a deep conceptual and practical understanding of AfL and recognize its pedagogical value, particularly its ability to make student learning visible and guide instruction. Supported by district policies, organizational infrastructure, and tools, most of the teachers described assessment as a quick, frequent, and multidimensional checkpoint of understanding. This aligns with previous research (Baartman and Gulikers, 2025; Barana et al., 2021; Miedijensky and Tal, 2016) framing AfL as a process that facilitates student growth by helping students recognize their real-time learning position and next steps.

Moreover, several practices point to a genuine integration of AfL principles into organizational discourse and practice, as described by Black and Wiliam (1998), Black et al. (2004), and Arnold and Willis (2023): the consistent use of diverse assessment methods, the provision of timely and constructive feedback to students, the promotion of teamwork that fosters student involvement, the encouragement of self- and peer assessment, and the widely accepted view that students deserve multiple opportunities for success.

Furthermore, teachers perceive more advantages than disadvantages of AfL, demonstrating a robust belief in its effectiveness. This aligns with findings from Kyaruzi et al. (2018) and Wolterinck et al. (2022), who highlighted AfL as a process yielding actionable insights for teaching and learning.

Notably, while summative assessments remain primary determinants of final grades, teachers frequently prioritize formative assessments to enhance learning experiences. This preference supports Schellekens et al.’s (2021) call for integrating formative and summative assessments into a unified framework, benefiting students holistically and fostering organizational growth.

4.2 Mathematics and science teachers’ perception of utilizing the Canvas platform for AfL: advantages and challenges

Regarding utilizing the Canvas platform for student assessment, findings underscore the widespread use of Canvas’s assessment components. A significant finding is that all teachers extensively employ auto-graded quizzes to facilitate AfL. According to most teachers, one of the key advantages of these quizzes is the provision of immediate feedback at the micro level, enabling students to identify and rectify gaps in their understanding.

On a macro level, teachers noted gaining insights into overall class progress, allowing them to adjust teaching pace and methods and to tailor support to individual needs. Both micro and macro aspects align with the advantages of technology-based formative assessment outlined in Gago (2022). In addition, teachers mentioned using features such as question banks, formula-based randomization in math exercises, and collaborative digital workspaces to streamline the AfL process. This reduces the burden of manual grading and increases opportunities for meaningful teacher-student interactions. This finding is consistent with Yao and Xu’s (2023) study, which draws a clear line between teachers feeling overloaded and their motivation to care for and interact with students.

Aside from efficiency, the findings reveal that teachers are cognizant of the downsides of auto-graded quizzes, which tend to assess knowledge only superficially. Hence, teachers often incorporate a combination of question types. In mathematics, for example, students identify common mistakes through error-detection tasks. In science, comprehension questions accompanying prior text readings or interactive video tutorials help students monitor their understanding of the content. Additionally, teachers may require students to upload supporting materials such as photos, files, or recordings to showcase their thought processes. Both Brusilovsky et al. (2023) and Law et al. (2020), who focused on enhancing students’ engagement through online assessment in mathematics, science, and computer science, emphasized the importance of reducing cognitive load to prevent students from resorting to superficial knowledge strategies. Suggestions such as assessing comprehension in small sections of material and using modification or debugging questions within auto-graded quizzes align with the approaches advocated by those researchers.

The study found that Canvas was perceived as a useful and easy-to-use tool for implementing AfL practices, consistent with Davis (1985). Teachers reported that Canvas streamlined assessment routines, allowed for real-time monitoring of student progress, and facilitated the delivery of timely feedback. These features were seen as particularly valuable in mathematics and science education, where frequent practice and procedural fluency are essential (Desai et al., 2023; Le and Lo, 2022).

Moreover, teachers noted more than twice as many advantages as disadvantages. However, when using Canvas assessment for mathematics, all teachers reported challenges related to limitations in editing equations, a lack of graphing tools, and false scoring caused by answer-key sensitivity. These challenges are consistent with the issues of digital assessment in mathematics presented in Drijvers’s (2019) study. It is essential to acknowledge that several factors influence teachers’ viewpoints on their experiences and on the platform’s advantages and disadvantages. Firstly, the length of experience plays a crucial role (Singh and Chakraborty, 2024).

In this study, most teachers had extensive familiarity with Canvas, with 60% reporting 8 years of experience and the remaining 40% averaging 4 years. Secondly, the degree of support and training provided to teachers throughout the implementation is crucial. Šabić et al. (2021) emphasized that this factor is essential in integrating new technology into education. In this study, seven teachers attributed their self-efficacy and confidence in using Canvas to the support and training provided by the district, and another 11 teachers to their collaborative networks, which allowed knowledge and resource sharing. Thirdly, the COVID-19 pandemic forced a shift to distance learning. Some teachers noted the pandemic as a significant step that accelerated their technological skills and frequency of use, especially with Canvas. In contrast, others became disillusioned with digital learning, saw it as harmful to students, and therefore abandoned it in favor of traditional learning. These background factors, alongside teachers’ self-perceptions as either “old-school” or “tech-savvy,” may influence their experience and utilization of Canvas, as suggested by Gomez et al. (2021), who emphasized that teachers’ beliefs in their technology integration abilities significantly impact classroom technology use and teachers’ attitudes toward it.

4.3 Contribution, limitations and conclusions

This study contributes to the growing body of research on digital formative assessment by offering insights into how LMS platforms, specifically Canvas, can support the implementation of AfL practices in mathematics and science education. By exploring teachers’ perspectives, the study sheds light on the perceived benefits and challenges of using Canvas to enhance student learning, providing valuable input for educational policy, professional development, and the design of technology-supported assessment environments.

However, several limitations must be acknowledged. The study was conducted within a single school district and involved a relatively small sample of participants, which may limit the generalizability of the findings. Moreover, the sample included only teachers with experience using Canvas, potentially overlooking the perspectives of educators who are unfamiliar with the platform and the challenges associated with its initial adoption. These constraints highlight the need for broader and more diverse participant samples in future research. Despite these limitations, the findings demonstrate that Canvas effectively supports key AfL strategies, including formative feedback loops, self-paced learning, and gamified task progression. At the same time, the study identifies the platform’s limitations in addressing open-ended, inquiry-based assessment tasks that require nuanced professional judgment. Teachers emphasized the importance of blending digital tools with human oversight to ensure that assessment remains meaningful and pedagogically sound.

Overall, the study underscores the importance of equipping educators with the skills to design balanced, hybrid assessment approaches that leverage LMS capabilities while maintaining flexibility for deeper, qualitative assessment practices. As LMS platforms continue to evolve, future research should examine students’ experiences, cross-contextual implementation, and issues of digital equity. In addition, further studies could explore how artificial intelligence might be integrated into LMS platforms to enhance personalization, feedback, and support for students’ self-regulated learning.

Statements

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics statement

Research ethics approval was obtained from the Institutional Review Board of Oranim College of Education (approval number: 91). The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.

Author contributions

SM: Writing – original draft, Writing – review & editing. HH: Writing – original draft.

Funding

The author(s) declared that financial support was not received for this work and/or its publication.

Acknowledgments

We gratefully acknowledge the support of the participating Minnesota school district and the collaboration of teachers from the participating schools throughout all stages of the study.

Conflict of interest

The author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declared that generative AI was used in the creation of this manuscript, for language editing only.

Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

  • 1

    AndradeH.CizekG. J. (2010). Handbook of Formative Assessment.New York, NY: Routledge.

  • 2

    ArnoldJ.WillisJ. (2023). From fragmentation to coherence: Student experience of assessment for learning.Aust. Educ. Res.5118491875. 10.1007/s13384-023-00668-y

  • 3

    AshD.LevittK. (2003). Working within the zone of proximal development: Formative assessment as professional development.J. Sci. Teach. Educ.142348. 10.1023/A:1022999406564

  • 4

    AttardC.HolmesK. (2020). “It gives you that sense of hope”: An exploration of technology use to mediate student engagement with mathematics.Heliyon6:e02945. 10.1016/j.heliyon.2019.e02945

  • 5

    BaartmanL. K. J.GulikersJ. T. M. (2025). VET teacher professional development in formative assessment using learning progressions.J. Form. Des. Learn.9136148. 10.1007/s41686-025-00107-4

  • 6

    BaranaA.MarchisioM.SacchetM. (2021). Interactive feedback for learning mathematics in a digital learning environment.Educ. Sci.11:279. 10.3390/educsci11060279

  • 7

    BickertonR. T.SangwinJ. C. (2022). Practical online assessment of mathematical proof.Int. J. Math. Educ. Sci. Technol.5326372660. 10.1080/0020739X.2021.1896813

  • 8

    BlackP.WiliamD. (1998). Assessment and classroom learning.Assess. Educ.5774. 10.1080/0969595980050102

  • 9

    BlackP.HarrisonC.LeeC.MarshallB.WiliamD. (2004). Working inside the black box: Assessment for learning in the classroom.Phi Delta Kappan86821. 10.1177/003172170408600105

  • 10

    BraunV.ClarkeV. (2006). Using thematic analysis in psychology.Qual. Res. Psychol.377101. 10.1191/1478088706qp063oa

  • 11

    BraunV.ClarkeV. (2021). Thematic Analysis: A Practical Guide.New York, NY: SAGE Publications.

  • 12

    BrusilovskyP.EricsonB. J.HorstmannC. S.ServinC.VahidF.ZillesC. (2023). “The future of computing education materials,” in Proceedings of the CS2023: ACM/IEEE-CS/AAAI Computer Science Curricula, Curricula Practices (New York, NY: ACM), 18.

  • 13

    BurrackF.ThompsonD. (2021). Canvas (LMS) as a means for effective student learning assessment across an institution of higher education.J. Assess. High. Educ.2119. 10.32473/jahe.v2i1.125129

  • 14

    CharalampousA.DarraM. (2025). The contribution of teacher feedback in enhancing students’ cognitive skills in secondary education: A review of research, proposals, and future directions.Eur. J. Educ. Stud.1277135. 10.46827/ejes.v12i4.5899

  • 15

    ChenF.ChenG. (2025). Technology-enhanced collaborative inquiry in K–12 classrooms: A systematic review of empirical studies.Sci. Educ.3417311773. 10.1007/s11191-024-00538-8

  • 16

    CreswellJ. W.PothC. N. (2018). Qualitative Inquiry and Research Design: Choosing Among Five Approaches, 4th Edn. Los Angeles, CA: Sage.

  • 17

    DavisF. D. (1985). A Technology Acceptance Model for Empirically Testing New End-User Information Systems: Theory and Results.Ph. D. thesis. Cambridge, MA: Massachusetts Institute of Technology

  • 18

    Desai, R., Rai, N., and Karekar, J. (2023). Optimum use of LMS for dynamic mathematics classrooms in blended mode. J. Eng. Educ. Transform. 36, 492–499. doi: 10.16920/jeet/2023/v36is2/23075

  • 19

    Drijvers, P. (2019). Digital assessment of mathematics: Opportunities, issues, and criteria. Mes. Éval. Éduc. 41, 41–66. doi: 10.7202/1055896ar

  • 20

    Erdoğan Coşkun, A. (2022). “Conceptions of society and education paradigm in the twenty-first century,” in Educational Theory in the 21st Century, Maarif Global Education Series, eds Y. Alpaydın and C. Demirli (London: Palgrave Macmillan), 141–171. doi: 10.1007/978-981-16-9640-4_7

  • 21

    Gago, D. A. (2022). Examining the Relationship of School Leadership with Implementation of Technology-Based Formative Assessments and Professional Development (Publication No. 11b676f6106450fb0fd2dc61c5afb3c0). Ph.D. dissertation. Montreal: Concordia University.

  • 22

    Gomez, F. C., Trespalacios, J., Hsu, Y., and Yang, D. (2021). Exploring teachers’ technology integration self-efficacy through the 2017 ISTE standards. TechTrends 66, 159–171. doi: 10.1007/s11528-021-00639-z

  • 23

    Griffin, P. (2017). Assessment for Teaching. Cambridge: Cambridge University Press.

  • 24

    Heritage, M., and Wylie, C. (2018). Reaping the benefits of assessment for learning: Achievement, identity, and equity. ZDM Math. Educ. 50, 729–741. doi: 10.1007/s11858-018-0943-3

  • 25

    Kamath, S. G., Nayak, K. R., Nayak, V., and Verma, S. (2025). Leveraging learning management systems in medical education: A scoping review of use, outcomes, and improvement pathways. Med. Educ. Online 30:2603805. doi: 10.1080/10872981.2025.2603805

  • 26

    Koh, J. H. L., and Kan, R. Y. P. (2020). Perceptions of learning management system quality, satisfaction, and usage: Differences among students of the arts. Aust. J. Educ. Technol. 36, 26–40. doi: 10.14742/ajet.5187

  • 27

    Kyaruzi, F., Strijbos, J., Ufer, S., and Brown, G. (2018). Teacher AfL perceptions and feedback practices in mathematics education among secondary schools in Tanzania. Stud. Educ. Eval. 59, 1–9. doi: 10.1016/j.stueduc.2018.01.004

  • 28

    Lam, R. (2013). Formative use of summative tests: Using test preparation to promote performance and self-regulation. Asia-Pacific Edu. Res. 22, 69–78. doi: 10.1007/s40299-012-0026-0

  • 29

    Lam, R. (2015). Assessment as learning: Examining a cycle of teaching, learning, and assessment of writing in the portfolio-based classroom. Stud. High. Educ. 41, 1900–1917. doi: 10.1080/03075079.2014.999317

  • 30

    Law, Y. K., Tobin, R. W., Wilson, N. R., and Brandon, L. A. (2020). Improving student success by incorporating instant-feedback questions and increased proctoring in online science and mathematics courses. J. Teach. Learn. Technol. 9, 64–78. doi: 10.14434/jotlt.v9i1.29169

  • 31

    Le, K., and Lo, C. K. (2022). Flipped classroom and gamification approach: Its impact on performance and academic commitment. Sustainability 14:5428. doi: 10.3390/su14095428

  • 32

    Li, C., Herbert, N., Yeom, S., and Montgomery, J. (2022). Retention factors in STEM education identified using learning analytics: A systematic review. Educ. Sci. 12:781. doi: 10.3390/educsci12110781

  • 33

    Lionelle, A., Ghosh, S., Moraes, M., Winick, T., and Nielsen, L. (2023). “A flexible formative/summative grading system for large courses,” in Proceedings of the 2023 54th ACM Technical Symposium on Computer Science Education, Toronto, ON, 624–630. doi: 10.1145/3545945.3569810

  • 34

    Marachi, R., and Quill, L. (2020). The case of Canvas: Longitudinal datafication through learning management systems. Teach. High. Educ. 25, 418–434. doi: 10.1080/13562517.2020.1739641

  • 35

    Meccawy, Z., Meccawy, M., and Alsobhi, A. (2021). Assessment in ‘survival mode’: Student and faculty perceptions of online assessment practices in HE during the COVID-19 pandemic. Int. J. Educ. Integr. 17:16. doi: 10.1007/s40979-021-00083-9

  • 36

    Miedijensky, S., and Tal, T. (2016). Reflection and assessment for learning in enrichment science courses for the gifted. Stud. Educ. Eval. 50, 1–13. doi: 10.1016/j.stueduc.2016.05.001

  • 37

    Morgan, H. (2024). Using triangulation and crystallization to make qualitative studies trustworthy and rigorous. Qual. Rep. 29, 1844–1856. doi: 10.46743/2160-3715/2024.6071

  • 38

    National Center for Education Statistics [NCES] and MN Department of Education (2023). Quick Stats. Available online at: https://www.publicschoolreview.com/ (accessed April 26, 2023).

  • 39

    Noble, H., and Heale, R. (2019). Triangulation in research, with examples. Evid. Based Nurs. 22, 67–68. doi: 10.1136/ebnurs-2019-103145

  • 40

    Oudat, Q., and Othman, M. (2024). Embracing digital learning: Benefits and challenges of using Canvas in education. J. Nurs. Educ. Pract. 14:39. doi: 10.5430/jnep.v14n10p39

  • 41

    Palinkas, L. A., Horwitz, S. M., Green, C. A., Wisdom, J. P., Duan, N., and Hoagwood, K. (2015). Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Adm. Policy Ment. Health 42, 533–544. doi: 10.1007/s10488-013-0528-y

  • 42

    Qushem, U. B., Christopoulos, A., and Laakso, M.-J. (2022). Learning management system analytics on arithmetic fluency performance: A skill development case in K6 education. Multimodal Technol. Interact. 6:61. doi: 10.3390/mti6080061

  • 43

    Šabić, J., Baranović, B., and Rogošić, S. (2021). Teachers’ self-efficacy for using information and communication technology: The interaction effect of gender and age. Inform. Educ. 21, 353–373. doi: 10.15388/infedu.2022.11

  • 44

    Schellekens, L. H., Bok, H. G., de Jong, L. H., Van der Schaaf, M. F., Kremer, W. D., and Van der Vleuten, C. P. (2021). A scoping review on the notions of assessment as learning (AaL), assessment for learning (AfL), and assessment of learning (AoL). Stud. Educ. Eval. 71:101094. doi: 10.1016/j.stueduc.2021.101094

  • 45

    Shavelson, R. J., Young, D. B., Ayala, C. C., Brandon, P. R., Furtak, E. M., Ruiz-Primo, M. A., et al. (2008). On the impact of curriculum-embedded formative assessment on learning: A collaboration between curriculum and assessment developers. Appl. Meas. Educ. 21, 295–314. doi: 10.1080/08957340802347647

  • 46

    Shepard, L. A. (2000). The role of assessment in a learning culture. Educ. Res. 29, 4–14.

  • 47

    Shepard, L. A. (2024). What should psychometricians know about the history of testing and testing policy? Educ. Meas. 43, 46–61. doi: 10.1111/emip.12650

  • 48

    Singh, S., and Chakraborty, A. (2024). Educators’ learning experiences and intention to use online learning management systems’ platforms: A perceptual study. Serbian J. Manage. 19, 319–337. doi: 10.5937/sjm19-45512

  • 49

    Sofroniou, A., Patel, M. H., Premnath, B., and Wall, J. (2025). Advancing conceptual understanding: A meta-analysis on the impact of digital technologies in higher education mathematics. Educ. Sci. 15:1544. doi: 10.3390/educsci15111544

  • 50

    Topping, K. (2017). Peer assessment: Learning by judging and discussing the work of other learners. Interdiscip. Educ. Psychol. 1, 1–17. doi: 10.31532/InterdiscipEducPsychol.1.1.007

  • 51

    Van Orman, D. S. J., Gotch, C. M., and Carbonneau, K. J. (2024). Preparing teacher candidates to assess for learning: A systematic review. Rev. Educ. Res. 93, 1–37. doi: 10.3102/00346543241233015

  • 52

    Vlachopoulos, D., and Makri, A. (2024). A systematic literature review on authentic assessment in higher education: Best practices for the development of 21st century skills, and policy considerations. Stud. Educ. Eval. 83:101425. doi: 10.1016/j.stueduc.2024.101425

  • 53

    Vygotsky, L. S. (1978). Mind in Society: The Development of Higher Psychological Processes. Cambridge, MA: Harvard University Press.

  • 54

    Wolterinck, C., Poortman, C. L., Schildkamp, K., and Visscher, A. J. (2022). Assessment for learning: Developing the required teacher competencies. Eur. J. Teach. Educ. 47, 711–729. doi: 10.1080/02619768.2022.2124912

  • 55

    Yao, Y., and Xu, J. (2023). Occupational stress of elementary school teachers after eased COVID-19 restrictions: A qualitative study from China. Front. Psychol. 14:1183100. doi: 10.3389/fpsyg.2023.1183100

  • 56

    Yin, R. K. (2018). Case Study Research and Applications: Design and Methods, 4th Edn. Los Angeles, CA: Sage.

Appendix

Appendix 1 | Interview protocol.

  • General questions:

    • a. 

        What is your role in school?

    • b. 

        What is your seniority in the teaching profession? (under 1 year, 1–5 years, over 5 years)

      • i. 

          In the subject you’re teaching.

      • ii. 

          In the district.

    • c. 

        How many years of experience do you have in working with Canvas? In other digital assessments?

    • d. 

        In which framework did you receive training? (e.g., pre-service, professional development).

  • What does assessment mean to you? Is there a certain kind of assessment most teachers perform in your school? (e.g., formative/summative).

  • Are there school standards and guidelines for using assessments on Canvas LMS?

  • Who is involved in assessment inside and outside the school, and how does Canvas LMS affect this collaboration? (e.g., students, other teachers, parents, principals, advisors, district coordinators).

  • Which assessment components do you use in your teaching (while using Canvas)? Would you be able to provide examples? (e.g., E-portfolio, OPBL, feedback, peer assessment, self-assessment, observation, concept map, etc.)

    • a. 

        Does the Canvas LMS platform affect the way you perform assessments? If so, please describe how (e.g., type, frequency, timing).

  • How would you describe the teacher-student interaction through Canvas LMS?

    • a. 

        How do you provide feedback to your students on Canvas, and what is usually the content of the feedback?

    • b. 

        Following the given feedback, is there a two-way dialog? (e.g., Do you pose questions to students? Can they reply?) Would you be able to provide examples?

  • Do you find Canvas LMS useful? Is it user-friendly? (e.g., simplicity, convenience, reduced/increased effort) If not, why not? Can you give examples?

  • What are the advantages and disadvantages of using Canvas for assessment in mathematics/science? Would you please give examples of benefits or limitations/difficulties?

  • Would you agree to send me documented examples, without students’ names, of assessment applications (OPBL, e-portfolio, peer assessment, self-assessment, two-way feedback)?

Keywords

assessment for learning, Canvas LMS, digital learning environments, information and communication technologies, learning management systems

Citation

Miedijensky S and Haran H (2026) Mathematics and science teachers perspectives on utilizing the Canvas LMS to enhance assessment for learning. Front. Educ. 11:1763058. doi: 10.3389/feduc.2026.1763058

Received

08 December 2025

Revised

11 February 2026

Accepted

12 February 2026

Published

24 February 2026

Volume

11 - 2026

Edited by

Ana Remesal, University of Barcelona, Spain

Reviewed by

Laxman Luitel, Kathmandu University, Nepal

Celina Sarmiento, Philippine Normal University, Philippines

Copyright

*Correspondence: Shirley Miedijensky,

Disclaimer

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
