REVIEW article

Front. Educ., 24 April 2026

Sec. Digital Learning Innovations

Volume 11 - 2026 | https://doi.org/10.3389/feduc.2026.1744310

Digital technologies in university assessment: a scoping review

  • 1. Departamento de Didáctica, Currículum y Evaluación, Escuela de Educación, Universidad de Concepción, Los Ángeles, Chile

  • 2. Escuela de Pedagogía, Pontificia Universidad Católica de Valparaíso, Valparaíso, Chile

  • 3. Departamento de Psicología, Facultad Ciencias Sociales, Universidad de Concepción, Concepción, Chile

  • 4. Departamento de Teoría, Política y Fundamentos de la Educación, Escuela de Educación, Universidad de Concepción, Los Ángeles, Chile

Abstract

This study presents a scoping review of the use of digital technologies in higher education from 2019 to 2023, aiming to analyse their incorporation into formative processes and their impact on university teaching and assessment. Using the PRISMA-ScR methodology, 11 empirical studies were selected from the Web of Science and Scopus databases, following four eligibility criteria. The results reveal that digital technologies enhance autonomy and self-regulation in learning, as well as competency-based assessment. The use of tools such as gamification apps, virtual learning environments, and software for creating multimedia content is highlighted. The conclusions emphasize the importance of continuing research in critical areas such as digital inclusion and academic integrity, particularly in the university context.

Introduction

Digital technologies have become a central component of contemporary higher education, exerting a significant influence on teaching, learning, and assessment practices. Broadly defined, digital technologies in education include virtual learning platforms, online assessment applications, Learning Management Systems (LMS), multimedia production tools, and interactive digital resources that enable the storage, processing, and communication of information in digital formats (Mayer, 2024; Selwyn, 2023; Veletsianos, 2020). Their progressive incorporation has been associated with substantial transformations in educational processes, including increased flexibility, diversification of pedagogical strategies, and expanded opportunities for interaction between students and instructors (Bond et al., 2021).

In higher education, the use of these technologies expanded rapidly during the COVID-19 pandemic, when universities were compelled to adopt remote and hybrid teaching modalities and to redesign assessment practices within very short timeframes (Crawford et al., 2020; Dhawan, 2020; Means and Neisler, 2020). This context intensified scholarly debate regarding the role of these technological resources not merely as instrumental supports, but as mediators of key pedagogical processes—particularly those related to the assessment of learning. Within this shift, assessment increasingly moved away from a predominantly summative and certification-oriented function toward a more formative orientation, aimed at supporting learning processes and informing pedagogical decision-making (García-Peñalvo et al., 2020).

From a pedagogical perspective, this reorientation aligns with the assessment for learning (AfL) framework, which conceptualizes assessment as an ongoing process designed to support, regulate, and enhance student learning rather than merely to measure outcomes (Wiliam, 2011). In higher education, AfL has been associated with practices that promote learner self-regulation, systematic formative feedback, and the development of academic and professional competencies. Although these constructs are conceptually distinct, they converge in their emphasis on fostering active, reflective, and progressively autonomous learning.

To ensure conceptual clarity, this review adopts Assessment for Learning (AfL) as its primary analytical framework. Within this study, AfL is understood as a pedagogical approach in which assessment is embedded within the learning process and serves regulatory, feedback-oriented, and student-centred functions (Black and Wiliam, 2009; Wiliam, 2011). In this sense, digital technologies are not conceptualized as neutral tools, but as mediating artefacts that may enable, constrain, or reshape AfL practices depending on their pedagogical integration. This distinction is essential to avoid conflating technology use with pedagogical function, an issue that has been identified in prior research on digital education.

In line with this framework, three key constructs are operationalized in this review. First, digital technologies refer to software, platforms, or digital environments used to support teaching, learning, or assessment processes. Second, assessment for learning refers to formative assessment practices that involve feedback, self-regulation, and active student participation in the evaluation process. Third, assessment practices are understood as structured activities through which evidence of learning is generated, interpreted, and used to inform pedagogical decisions.

Within this conceptual framework, recent literature suggests that digital technologies can play a pedagogically significant but context-dependent role in operationalizing AfL practices in higher education by facilitating access to evaluative information, diversifying forms of learning evidence, enabling timely feedback, and supporting the monitoring of student progress (Merhi and Meisami, 2024; Montenegro-Rueda et al., 2021). Tools such as LMS platforms, interactive assessment applications, digital rubrics, and collaborative virtual environments have been widely used to support self-assessment, peer assessment, and performance-based assessment, particularly in online and hybrid learning contexts (Tenório et al., 2016).

Despite the growing body of research on digital technologies in higher education, the field remains conceptually fragmented. In particular, there is limited clarity regarding how digital technologies have been pedagogically integrated specifically into assessment for learning processes as opposed to being described in purely instrumental or technological terms. Furthermore, existing studies tend to prioritize perceptions and reported experiences over theoretically grounded analyses of assessment practices, resulting in a lack of coherence between pedagogical constructs and empirical evidence. Previous studies indicate that a substantial proportion of the available research relies on descriptive designs focused primarily on student and teacher perceptions, which limits the ability to establish causal relationships or to evaluate long-term effects on competency development (Escorcia-Guzmán et al., 2022; Saldaña and González, 2022). Moreover, the rapid expansion of digital technologies has raised concerns related to digital inclusion and academic integrity, particularly in online assessment contexts, where unequal access and monitoring challenges may exacerbate existing educational inequalities (Czerniewicz et al., 2020; Khan et al., 2021).

This gap is especially relevant in higher education, where assessment plays a central role in shaping learning trajectories and academic development. Therefore, a structured mapping of how digital technologies are conceptualized and operationalized within AfL-oriented practices is needed. Scoping reviews are particularly well suited to this purpose, as they enable the mapping of an emerging and heterogeneous research field, the identification of trends and knowledge gaps, and the clarification of conceptual and methodological patterns without seeking to establish causal effectiveness (Tricco et al., 2018).

Guided by this rationale, the present study addresses the following research question: how have digital technologies been integrated into assessment for learning processes in higher education, according to empirical evidence published between 2019 and 2023? Accordingly, the objective of this scoping review is to analyse how digital technologies have been incorporated into assessment for learning processes in higher education through a synthesis of empirical studies published between 2019 and 2023, with the aim of identifying predominant pedagogical approaches, assessment strategies, types of technologies employed, and the methodological scope and limitations of the available evidence.

Method

This study followed a scoping review design, conducted in accordance with the PRISMA Extension for Scoping Reviews (PRISMA-ScR) guidelines (Tricco et al., 2018). A scoping review approach was selected because the objective of the study was not to determine the effectiveness of specific interventions or to establish causal relationships, but rather to map existing empirical evidence, identify dominant pedagogical approaches, characterize the types of digital technologies used in assessment for learning processes in higher education, and highlight methodological trends and research gaps within the field.

The review protocol was not registered in PROSPERO, as this platform does not accept registrations for scoping reviews, literature mappings, or narrative reviews. The PRISMA-ScR checklist applied in this study is provided as supplementary material (Supplementary Appendix A).

Search strategy and information sources

The literature search was conducted between 10 August and 14 December 2023 using two international databases: Web of Science (WoS) and Scopus. A general search syntax was designed using the following terms:

(“Higher Education”) AND (“Assessment”) AND (“Digital Technology”) AND (“Methodology”).

This syntax was adapted to the specific characteristics and filters of each database. The search strategy included the following inclusion criteria:

a. Type of study: empirical research articles;

b. Publication period: January 2019 to July 2023;

c. Language: English or Spanish;

d. Field of knowledge: education and educational research or related disciplines.

The full search strategy for each database is detailed in Supplementary Appendix C.

The search strategy was developed iteratively and by consensus among the research team, with the aim of ensuring coherence with the study objectives, transparency, and reproducibility (Tricco et al., 2018). Relatively broad search terms were intentionally used to capture a diverse body of relevant literature, consistent with the exploratory nature of scoping reviews. This decision was also aligned with the research question, which seeks to understand how a phenomenon has been approached in the literature, rather than to measure effectiveness or compare interventions (Aromataris et al., 2024; Tricco et al., 2018).

Eligibility criteria and study selection

Eligibility criteria are summarized in Table 1. Studies were included if they reported empirical evaluations of the use of digital technologies in teaching, learning, and assessment processes in higher education contexts, involving undergraduate or postgraduate students and/or university instructors. Only studies reporting the use of digital technologies between January 2019 and July 2023 were considered.

Table 1

Eligibility criteria | Description
1) Use of digital technologies in assessment for learning | Studies analysing the integration of digital technologies into assessment for learning processes.
2) Population | Higher education students and instructors (undergraduate and postgraduate).
3) Context | Face-to-face, online, or hybrid learning modalities.
4) Study design | Empirical studies, including qualitative, quantitative, or mixed-methods designs.

Eligibility criteria.

Studies were excluded if their findings were based exclusively on perceptions of technology use without reference to assessment practices, if they did not involve the use of digital technologies, if they were conducted in school or workplace training contexts, or if they reported technology use outside the defined temporal range.

The selected timeframe (January 2019–July 2023) was an intentional methodological decision, grounded in the objective of capturing empirical evidence related to pre-pandemic, pandemic, and immediate post-pandemic phases of higher education. This period allows for the examination of how digital technologies were adopted, adapted, and re-signified in assessment for learning processes during a phase of accelerated digital transformation. Although extending the timeframe beyond July 2023 could have captured a growing body of research on artificial intelligence in higher education, doing so would have shifted the focus toward a more longitudinal or technology-specific review, which fell outside the scope of the present study (Aromataris et al., 2024; Tricco et al., 2018).

The disciplinary and educational level delimitations of this study were intentional methodological decisions that respond directly to the objective of this scoping review, which seeks to map the use of digital technologies in assessment for learning processes. Specifically, the review aims to identify predominant pedagogical approaches, types of technologies employed, methodological scope, and limitations of the available evidence within the context of higher education. Given the breadth and inherent complexity of education as a disciplinary field, a clearly defined scope was necessary to ensure analytical coherence and to avoid dispersion of results. In this sense, restricting the review to the educational domain allowed for a more focused and conceptually consistent analysis of digitally mediated assessment practices. As noted by Peters et al. (2020), scoping reviews may be legitimately restricted to a specific disciplinary domain, even when addressing phenomena of an interdisciplinary nature.

Similarly, the delimitation of this review to higher education was grounded in both conceptual and methodological considerations. Focusing on university contexts aligns with the objective of examining assessment for learning practices within a level where assessment plays a central role in shaping learning trajectories and academic development. Consequently, studies conducted in primary, secondary, preschool, or workplace training settings were excluded, as their inclusion could have introduced conceptual and analytical dispersion, limiting the depth and rigor of the analysis. Overall, these delimitations are consistent with the methodological guidance provided by Peters et al. (2020), who emphasize the importance of clearly defining the scope of a scoping review and justifying the exclusions applied in order to ensure coherence, transparency, and analytical depth.

Study selection process

After removing duplicate records using Mendeley reference management software, a total of 81 studies were retained for screening. These studies were distributed evenly among the six researchers comprising the review team. Each researcher independently screened titles, keywords, and abstracts according to the predefined inclusion and exclusion criteria. To support methodological rigor and consistency, the research team first developed a shared screening protocol and an initial data charting matrix. This matrix captured information related to population, concept, context, temporal scope, and research design (see Supplementary Appendix C).

Following the initial screening, the research team held a consensus meeting to review and discuss preliminary selections. In cases where uncertainty existed regarding eligibility, studies were retrieved and assessed in full text. Disagreements were resolved through team-based discussion until consensus was reached. All studies meeting the eligibility criteria were subsequently included in the final analysis (Peters et al., 2020).

Data charting and extraction

A second data charting matrix was developed to systematize information extracted from the full texts of the included studies. This matrix included variables such as author(s), year of publication, research design, study objectives, population, digital technologies used, and principal findings. Data charting was conducted independently by the researchers and subsequently reviewed collectively to ensure clarity, consistency, and methodological validity.

Consistent with the purpose of a scoping review, no formal critical appraisal of study quality was conducted, as the objective was not to synthesize evidence for causal inference or effectiveness, but to map the nature, scope, and characteristics of existing empirical research and to identify gaps in the literature (Aromataris et al., 2024; Tricco et al., 2018). The eligibility criteria applied are summarized in Table 1 above.

Data synthesis

A narrative synthesis was conducted to summarize and interpret the main findings of the included studies. The synthesis focused on the ways in which digital technologies were integrated into university teaching and assessment practices, particularly in relation to assessment for learning processes before, during, and after the COVID-19 pandemic (January 2019–July 2023).

The general characteristics of the included studies are presented in Table 2, while detailed information regarding study objectives, designs, procedures, and digital technologies employed is provided in Supplementary Appendix B.

Table 2

Article | Title | First author, year | Journal | Country, university | Language of publication
1 | Socrative, a powerful digital tool for enriching the teaching–learning process and promoting interactive learning in Chemistry and Chemical Engineering studies | Roman, 2021 | Computer Applications in Engineering Education | Spain, Universidad de Huelva | English
2 | Active learning in history teaching in higher education: The effect of inquiry-based learning and a student response system-based formative assessment in teacher training | Tirado-Olivares, 2021 | Australasian Journal of Educational Technology | Spain, university not specified | English
3 | Evaluación formativa, autorregulación, feedback y herramientas digitales: uso de Socrative en educación superior [Formative assessment, self-regulation, feedback and digital tools: using Socrative in higher education] | Fraile, 2021 | Retos | Spain, Universidad Francisco de Vitoria | Spanish
4 | Formation of Universal Competencies of Undergraduates during Development of the Plot of Web-Quest | Isupova, 2021 | European Journal of Contemporary Education | Russia, Vyatka State University | English
5 | Digital rubric-based assessment of oral presentation competence with technological resources for preservice teachers | Pérez-Torregrosa, 2022 | Estudios Sobre Educación | Spain, Universidad de Málaga and Universidad de Granada | English
6 | ICT and 360° evaluation: Improving professional skills in higher education in Spain | Martínez-Romera, 2022 | Tuning Journal for Higher Education | Spain, Universidad de Deusto | English
7 | Impact of an Unlimited E-Book Subscription Service and Digital Learning Solution in Management Education at a Minority Serving University | Buzzetto-Hollywood, 2022 | Journal of Information Technology Education: Research | USA, a Historically Black university (HBCU) | English
8 | Thai undergraduate science, technology, engineering, arts, and math (STEAM) creative thinking and innovation skill development: a conceptual model using a digital virtual classroom learning environment | Wannapiroon, 2022 | Education and Information Technologies | Thailand, King Mongkut's Institute of Technology Ladkrabang (KMITL) and Rajamangala University of Technology Suvarnabhumi | English
9 | Formar y evaluar competencias en educación superior: una experiencia sobre inclusión digital [Developing and assessing competencies in higher education: an experience in digital inclusion] | Sanz-Benito, 2023 | RIED. Revista Iberoamericana de Educación a Distancia | Spain, Universitat Rovira i Virgili | Spanish
10 | HyFlex: Enseñar y aprender de modo híbrido y flexible en la educación superior [HyFlex: Teaching and learning in a hybrid, flexible mode in higher education] | Area-Moreira, 2023 | RIED. Revista Iberoamericana de Educación a Distancia | Spain, Universidad de La Laguna | Spanish
11 | Percepción de docentes y estudiantes de educación superior de los exámenes a libro abierto y supervisados en la pandemia por COVID-19 [Higher education teachers' and students' perceptions of open-book and proctored exams during the COVID-19 pandemic] | Marcano, 2023 | Educación XXI | Spain, Universidad Internacional de La Rioja | Spanish

General characteristics of selected studies.

Results

Selection of studies

Following the PRISMA-ScR flow diagram (Figure 1), the study selection process comprised four stages: identification, screening, eligibility, and inclusion. Initially, 101 records were identified across the Web of Science and Scopus databases. After the removal of duplicate records, 81 studies remained for screening. Application of the eligibility criteria resulted in a preselection of 24 articles, of which 19 were retrieved for full-text assessment. Finally, 11 empirical studies met all inclusion criteria and were included in the scoping review.

Figure 1

PRISMA-ScR flow diagram of the study selection process.

Context of the included studies

The included studies predominantly report experiences conducted in the Spanish higher education context, particularly within public and private universities. Most studies focus on students and instructors from education-related disciplines, such as Primary Education, Early Childhood Education, Physical Education, History, and Teacher Education programs (Area-Moreira et al., 2023; Fraile et al., 2021; Martínez-Romera and Cortés-Durmont, 2022; Sanz-Benito et al., 2023; Tirado-Olivares et al., 2021). To a lesser extent, studies involve students from STEM fields (Pérez-Torregrosa et al., 2022; Roman et al., 2021; Wannapiroon and Pimdee, 2022), postgraduate programs (Isupova et al., 2021; Marcano et al., 2023), and business or management education (Buzzetto-Hollywood and Thomas-Banks, 2022).

Analytical purposes of the studies

The reviewed studies address a variety of pedagogical and evaluative concerns related to the use of digital technologies in higher education. Some studies focus on evaluating the role of specific applications or digital environments in student or teacher satisfaction, engagement, and perceived learning during remote or hybrid teaching contexts (Buzzetto-Hollywood and Thomas-Banks, 2022; Fraile et al., 2021; Isupova et al., 2021; Roman et al., 2021; Wannapiroon and Pimdee, 2022). Other studies examine the outcomes associated with the implementation of innovative teaching methodologies supported by digital technologies, such as inquiry-based learning, HyFlex models, or competency-based approaches (Area-Moreira et al., 2023; Isupova et al., 2021; Sanz-Benito et al., 2023; Tirado-Olivares et al., 2021).

Additionally, several studies highlight emerging challenges linked to the use of digital technologies in higher education, particularly issues related to digital inclusion and academic integrity (Buzzetto-Hollywood and Thomas-Banks, 2022; Sanz-Benito et al., 2023). Finally, a subset of studies explicitly addresses assessment practices, focusing either on the effectiveness of learning supported by digital technologies or on the role of digital tools as mediators of assessment processes, including feedback, competency-based assessment, and authentic evaluation practices (Marcano et al., 2023; Martínez-Romera and Cortés-Durmont, 2022; Pérez-Torregrosa et al., 2022; Roman et al., 2021).

Research designs

From a methodological perspective, the included studies predominantly adopt an empirical-analytical orientation, employing a range of research designs. Descriptive studies are the most common and typically rely on questionnaires, surveys, and statistical analyses to examine perceptions, experiences, or self-reported outcomes (Area-Moreira et al., 2023; Fraile et al., 2021; Sanz-Benito et al., 2023). Quasi-experimental designs are used in a smaller number of studies to compare instructional approaches supported by digital technologies (Roman et al., 2021; Tirado-Olivares et al., 2021), while one experimental study employs control and experimental groups to evaluate the impact of a WebQuest-based intervention on competency development (Isupova et al., 2021).

A limited number of studies adopt mixed or complementary methodological approaches, including case studies and evaluative designs that combine quantitative measures with qualitative data sources, such as discourse analysis or expert panels (Martínez-Romera and Cortés-Durmont, 2022; Wannapiroon and Pimdee, 2022). Across the corpus, only a small subset of studies reports performance-based outcomes, whereas most focus primarily on perceptions, evaluations, or reported experiences.

Teaching and learning strategies supported by digital technologies

The reviewed studies report the use of diverse pedagogical strategies supported by digital tools. These include approaches aimed at promoting learner autonomy and self-regulation (Area-Moreira et al., 2023; Fraile et al., 2021; Roman et al., 2021), collaborative learning (Isupova et al., 2021; Martínez-Romera and Cortés-Durmont, 2022; Wannapiroon and Pimdee, 2022), and inquiry-based learning (Isupova et al., 2021; Tirado-Olivares et al., 2021). Such strategies are implemented through methodologies such as flipped classrooms, project-based learning, WebQuest activities, asynchronous learning tasks, recorded lectures, and small-group work.

Assessment practices

While the previous section examined teaching and learning strategies supported by digital technologies, this section focuses specifically on assessment practices, distinguishing evaluative processes from broader instructional approaches and emphasizing their role within assessment for learning (AfL) processes.

Across the reviewed studies, digital technologies were predominantly integrated into formative and progressive assessment practices, many of which correspond to processes commonly associated with assessment for learning (AfL), such as student involvement in assessment, ongoing feedback, and competency-oriented evaluation. Specifically, several studies report the use of digital tools to support self-assessment and peer assessment, enabling students to actively engage in the evaluation of their own learning processes (Fraile et al., 2021; Isupova et al., 2021; Pérez-Torregrosa et al., 2022; Wannapiroon and Pimdee, 2022). Other studies emphasize competency-based assessment approaches, in which digital technologies are used to document learning evidence, monitor progress, and evaluate performance against predefined criteria (Sanz-Benito et al., 2023).

These digital tools are also employed to facilitate feedback processes, particularly through applications that provide immediate or structured feedback, such as student response systems and digital rubrics (Roman et al., 2021; Tirado-Olivares et al., 2021). In addition, some studies describe the use of comprehensive or multidimensional assessment models, including 360-degree evaluation frameworks, supported by digital platforms (Martínez-Romera and Cortés-Durmont, 2022). Overall, the assessment practices reported in the reviewed studies suggest a shift toward more participatory, feedback-oriented, and process-focused forms of assessment, supported by digital technologies in higher education contexts.

Types of digital technologies used

The studies reviewed report the use of a wide range of digital technologies. Gamification applications such as Socrative and Kahoot are frequently employed to support formative assessment (Fraile et al., 2021; Roman et al., 2021; Tirado-Olivares et al., 2021). Learning Management Systems, including Moodle and Cengage platforms, are widely used to organize instructional content, manage assessment activities, and facilitate communication (Area-Moreira et al., 2023; Buzzetto-Hollywood and Thomas-Banks, 2022; Marcano et al., 2023; Sanz-Benito et al., 2023).

Additional technologies include digital rubric tools (e.g., CoRubric), videoconferencing platforms, multimedia content creation applications, e-book systems, and online proctoring tools designed to support assessment integrity in remote contexts (Al-Azawei et al., 2016; Marcano et al., 2023; Martínez-Romera and Cortés-Durmont, 2022).

Discussion

This scoping review highlights a marked heterogeneity in how digital technologies are examined within higher education teaching, learning, and assessment contexts. The reviewed studies approach digital tools from multiple perspectives, ranging from the contribution of specific tools to their role in addressing emerging pedagogical and evaluative challenges. These approaches are typically analysed across three levels: perceptions, usage patterns, and empirical learning outcomes. Similar patterns have been identified in prior research examining the impact of technologically mediated environments in university education (Kumar, 2024; Negi, 2020; Zheng et al., 2018). At the same time, the findings raise ongoing conceptual and empirical questions regarding the extent to which digital technologies meaningfully contribute to formative learning processes (Watermeyer et al., 2021).

However, these findings should be interpreted with caution, as the majority of the reviewed studies rely on descriptive and perception-based designs, which do not allow for strong claims regarding causal impact or learning effectiveness.

A further relevant finding concerns the geographical concentration of the reviewed research. A substantial proportion of the studies originate from Spanish-speaking higher education contexts, particularly Spain and parts of Latin America. This trend is consistent with previous evidence highlighting the prominence of digital education research in Ibero-American settings (López-Nuñez et al., 2024; OECD, 2023). While this concentration reflects regional research priorities and policy agendas, it also points to the need for expanding empirical work into more diverse socio-cultural and institutional contexts, in order to develop comparative and multicultural perspectives that can strengthen the generalizability of findings.

From a methodological perspective, the reviewed studies show a clear predominance of empirical-analytical approaches, with most research relying on descriptive and quasi-experimental designs. These approaches provide valuable, though contextually bounded, insights into patterns of use and reported benefits of digital technologies, but they often offer limited information regarding research instruments, analytical procedures, or the pedagogical assumptions underlying assessment practices. This methodological trend aligns with previous reviews that have identified an overrepresentation of quantitative and perception-based approaches in digital assessment research, alongside a relative scarcity of qualitative and mixed-methods studies (Tahir et al., 2025). Such imbalances may constrain the development of deeper, context-sensitive understandings of how digital technologies interact with assessment practices over time.

From a pedagogical standpoint, the analysed studies consistently indicate that digital technologies are frequently associated with increased learner autonomy, collaboration, and active engagement. These themes recur across research examining the innovative potential of digital technologies in higher education (Hernández et al., 2019; Kumar, 2024; Negi, 2020; Upadhayay et al., 2022). The strategies implemented commonly involve digital platforms and applications designed to support interaction, collaboration, and continuous engagement, suggesting that the pedagogical value of digital technologies is closely tied to the instructional designs within which they are embedded.

In relation to assessment, the findings indicate that digital technologies are frequently examined in connection with practices aligned with assessment for learning (AfL), particularly through their role in supporting feedback processes, progressive assessment, and student involvement in evaluation activities. These practices reflect foundational AfL principles, including the use of assessment to support learning regulation and the active participation of learners in evaluative processes (Black and Wiliam, 2009; Wiliam, 2011). While many studies situate these practices within pedagogical changes prompted by the COVID-19 pandemic (Aguilera-Hermida, 2020; García-Peñalvo et al., 2020; Marinoni et al., 2020; Sun et al., 2020), the reviewed evidence suggests that the relevance of digitally mediated AfL practices extends beyond emergency remote teaching. Prior research has similarly emphasized the potential of digital assessment to enhance feedback quality, learner engagement, and formative regulation of learning (Jain and Alam, 2022; Montenegro-Rueda et al., 2021; Tuah and Naing, 2021).

From this perspective, the findings suggest that the pedagogical value of digital technologies in assessment is contingent upon three interrelated dimensions: (1) the instructional design in which they are embedded, (2) their alignment with AfL principles, and (3) the level of student engagement in evaluative processes. These dimensions highlight that the contribution of digital technologies is not inherent to the tools themselves but emerges from their integration within pedagogically grounded assessment practices.

However, several studies also point to persistent tensions associated with digitally mediated assessment, particularly regarding the balance between measuring outcomes and supporting learning processes. Some research continues to focus primarily on performance measurement, often prioritizing results over formative dimensions such as metacognition and student agency (Marín et al., 2025). This tension reflects longstanding debates within assessment research concerning the alignment between evaluative practices and pedagogical intentions, especially in technology-rich environments (Emmanuel Olawale, 2024).

In addition, the reviewed literature highlights emerging challenges related to digital inclusion and academic integrity. The rapid expansion of online and hybrid assessment during the pandemic intensified the use of digital platforms, gamification tools, and interactive environments to support more contextualized and student-centred learning processes (Cicha and Rutecka, 2023; Jain and Alam, 2022; Khajuria et al., 2023). At the same time, unequal access to digital resources and varying levels of digital literacy have raised concerns regarding equitable participation in assessment processes (OECD, 2023). These challenges underscore the need to consider assessment technologies as part of broader socio-technical systems shaped by institutional, cultural, and ethical conditions.

Despite the growing empirical literature on digital technologies in higher education assessment, the findings of this scoping review reveal a lack of consolidated theoretical frameworks capable of systematically guiding pedagogical decision-making in digitally mediated evaluation contexts. While assessment for learning provides a valuable conceptual reference point (Black and Wiliam, 2009; Wiliam, 2011), further theoretical integration is needed to better understand how digital technologies support formative assessment practices across diverse educational settings (Oldfield et al., 2012). Addressing this gap will require future research that combines robust theoretical grounding with diversified methodological approaches, including longitudinal, qualitative, and cross-cultural designs.

The reviewed evidence also points to a notable absence of consolidated pedagogical competence frameworks explicitly guiding teaching and assessment practices in digital higher education contexts. Frameworks such as Technological Pedagogical Content Knowledge (TPACK) or DigCompEdu, which are designed to orient informed and reflective integration of digital technologies into teaching practice, are largely absent from the empirical studies analysed (Díaz Vargas et al., 2025; Gunsha, 2022; López-Nuñez et al., 2024; Silva-Quiroz et al., 2022). The evidence suggests that, in the absence of such pedagogical references, the adoption and incorporation of digital technologies often remain superficial or predominantly instrumental, with limited capacity to generate transformative changes in assessment practices.

Overall, the findings of this scoping review highlight both the diversity and dynamism of research on assessment and digital technologies in higher education, as well as the need for stronger pedagogical reference frameworks and more robust methodological designs. Future research would benefit from more integrated and comprehensive approaches that move beyond descriptive accounts, placing greater emphasis on contextual, ethical, and formative dimensions of digitally mediated assessment. Such an orientation would contribute to a deeper and more coherent understanding of how digital technologies can meaningfully support assessment for learning practices in higher education.

Limitations and future directions

Despite the contributions of this scoping review, several limitations should be acknowledged. First, a major constraint of the reviewed literature is the predominant reliance on descriptive empirical studies, which limits the ability to draw robust conclusions regarding the long-term effectiveness of digital technologies in higher education assessment. Although the selected studies provide valuable insights into patterns of use, perceptions, and reported benefits, most rely heavily on questionnaires and surveys, offering limited evidence of measurable impacts on academic performance or competency development. This methodological tendency has been widely documented in prior research, which highlights persistent challenges related to the generalizability of findings (Escorcia-Guzmán et al., 2022), the limited examination of authentic classroom practices (Saldaña and González, 2022), and insufficient attention to contextual, cultural, and power-related dimensions shaping technology use in education (Castro-Guzmán, 2021; Nolasco and Ojeda, 2016).

A second limitation concerns the restricted methodological diversity of the reviewed studies. The relative scarcity of experimental, quasi-experimental, and mixed-methods designs constrains the capacity to establish causal relationships between digitally mediated assessment practices and learning outcomes. Moreover, the limited use of qualitative approaches restricts deeper understanding of how assessment practices are enacted, negotiated, and experienced by students and instructors in specific institutional and cultural contexts. As a result, many studies offer partial accounts of digitally mediated assessment, focusing on surface-level indicators rather than on the complex pedagogical processes underpinning assessment for learning.

In addition, the review reveals a strong geographical concentration of empirical research conducted in Spanish-speaking higher education contexts, particularly Spain and parts of Latin America. While this reflects important regional research agendas and educational reforms, it also limits the transferability of findings to other higher education systems with different institutional structures, policy environments, and technological infrastructures. This concentration underscores the need for broader cross-cultural and comparative research to enhance the external validity of future studies.

Finally, as this study adopts a scoping review design, it does not include a formal critical appraisal of study quality, nor does it aim to evaluate the effectiveness of specific technologies or assessment interventions. Consequently, the findings should be interpreted as a mapping of research trends and gaps, rather than as evidence of best practices or pedagogical effectiveness.

Future research should therefore prioritize longitudinal and methodologically robust designs that allow for the examination of sustained impacts of digital technologies on learning processes, assessment practices, and competency development over time. Greater integration of mixed-methods approaches would enable researchers to combine quantitative outcome measures with qualitative analyses of classroom practices, learner agency, and feedback dynamics. In addition, future studies would benefit from adopting clear pedagogical and competence-based frameworks to guide the design and evaluation of digitally mediated assessment practices, thereby avoiding purely instrumental uses of technology.

Further research should also expand into diverse institutional, cultural, and disciplinary contexts, incorporating comparative and cross-national perspectives that account for contextual, ethical, and equity-related dimensions of digital assessment. Such approaches would contribute to a more comprehensive and theoretically grounded understanding of how digital technologies can meaningfully support assessment for learning in higher education.

Conclusions

The contribution of digital technologies to teaching, learning, and assessment processes in higher education—particularly since the COVID-19 pandemic—requires careful conceptual and empirical examination that goes beyond assumptions of innovation or effectiveness. In this regard, analysing how previous research has approached the evaluation of digitally mediated assessment practices provides an opportunity to better understand what has been studied, how it has been studied, and under which pedagogical assumptions.

The scoping review conducted in this study contributes to this understanding across several dimensions. First, it offers a contextualized overview of recent empirical research, highlighting the prominence of studies conducted in Spanish-speaking higher education contexts. This finding provides a valuable point of reference for future comparative analyses across different educational systems and institutional settings.

Second, the review illustrates how the pedagogical contribution of digital technologies to assessment practices is closely linked to the teaching and learning strategies in which they are embedded. In particular, the analysed studies show that digital technologies are frequently associated with formative, participatory, and process-oriented assessment practices aligned with assessment for learning (AfL) principles, such as feedback provision, learner involvement, and competency-based assessment. At the same time, the findings suggest that these contributions are not inherent to the technologies themselves but rather depend on the pedagogical designs and conceptual frameworks guiding their use.

Third, the review highlights the methodological characteristics and limitations of the existing empirical evidence. The predominance of descriptive and perception-based studies constrains the possibility of drawing conclusions about long-term learning outcomes or causal relationships between technology use and assessment practices. This observation reinforces the need for more robust and diversified research designs to strengthen the evidence base in this field. These findings should be interpreted as indicative of emerging patterns rather than definitive evidence of effectiveness.

Finally, the findings underscore that, despite the growing volume of research on digital technologies and assessment in higher education, there remains a lack of consolidated pedagogical and competence-based frameworks to systematically guide digitally mediated assessment practices. Without such references, the integration of digital technologies risks remaining superficial or primarily instrumental, limiting their transformative potential in assessment for learning.

On the whole, this scoping review emphasizes the importance of advancing research that integrates stronger pedagogical frameworks, more rigorous methodological approaches, and greater attention to contextual, ethical, and formative dimensions. Such efforts are essential to deepen understanding of how digital technologies can meaningfully support assessment for learning practices and contribute to more equitable and educationally grounded assessment processes in higher education.

Statements

Author contributions

MN-S: Funding acquisition, Investigation, Project administration, Supervision, Writing – review & editing. JG-M: Conceptualization, Investigation, Methodology, Validation, Writing – review & editing. YC-C: Data curation, Formal analysis, Software, Visualization, Writing – original draft, Writing – review & editing. TL-J: Conceptualization, Methodology, Resources, Writing – review & editing. CE-N: Conceptualization, Software, Writing – review & editing. CS-C: Conceptualization, Methodology, Visualization, Writing – original draft, Writing – review & editing.

Funding

The author(s) declared that financial support was received for this work and/or its publication. This study was funded by the Vicerrectoría de Investigación y Desarrollo of Universidad de Concepción through the VRID Multidisciplinary Project, code 2021000321MUL. The authors express their gratitude to the VRID fund for its support in the development of this research.

Conflict of interest

The author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declared that generative AI was used in the creation of this manuscript. During the preparation of this work the authors used generative AI in order to proofread the text for readability. After using this tool/service, the authors reviewed and edited the content as needed and take full responsibility for the content of the publication.

Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2026.1744310/full#supplementary-material

References

1. Aguilera-Hermida, A. P. (2020). College students’ use and acceptance of emergency online learning due to COVID-19. Int. J. Educ. Res. Open 1:100011. doi: 10.1016/j.ijedro.2020.100011

2. Al-Azawei, A., Parslow, P., and Lundqvist, K. (2016). Barriers and opportunities of e-learning implementation in Iraq: a case of public universities. Int. Rev. Res. Open Distrib. Learn. 17(5), 126–146. doi: 10.19173/irrodl.v17i5.2501

3. Area-Moreira, M., Bethencourt-Aguilar, A., and Martín-Gómez, S. (2023). Hyflex: enseñar y aprender de modo híbrido y flexible en la educación superior [Hyflex: teaching and learning in a hybrid and flexible mode in higher education]. Rev. Iberoam. Educ. Dist. 26(1), 141–161. doi: 10.5944/ried.26.1.34023

4. Aromataris, E., Lockwood, C., Porritt, K., Pilla, B., and Jordan, Z. (eds.) (2024). JBI Manual for Evidence Synthesis. Adelaide: Joanna Briggs Institute. Available online at: https://synthesismanual.jbi.global. doi: 10.46658/JBIMES-24-01 (Accessed January 22, 2026).

5. Black, P., and Wiliam, D. (2009). Developing the theory of formative assessment. Educ. Assess. Eval. Account. 21(1), 5–31. doi: 10.1007/s11092-008-9068-5

6. Bond, M., Bedenlier, S., Marín, V. I., and Händel, M. (2021). Emergency remote teaching in higher education: mapping the first global online semester. Int. J. Educ. Technol. High. Educ. 18, 1–24. doi: 10.1186/s41239-021-00282-x

7. Buzzetto-Hollywood, N., and Thomas-Banks, L. (2022). Impact of an unlimited e-book subscription service and digital learning solution in management education at a minority serving university. J. Inf. Technol. Educ.: Res. 21, 597–622. doi: 10.28945/5036

8. Castro-Guzmán, W. (2021). Challenges of professional development for technology integration in higher education. Cuad. Investig. Educ. 12(2), 82–99. doi: 10.18861/cied.2021.12.2.3090

9. Cicha, K., and Rutecka, P. (2023). “Digital tools for innovative higher education teaching – a scoping review of empirical studies,” in Information Systems Development: Organizational Aspects and Societal Trends (ISD2023 Proceedings), eds A. R. da Silva, M. M. da Silva, J. Estima, C. Barry, M. Lang, H. Linger, and C. Schneider (Lisbon: Instituto Superior Técnico). ISBN: 978-989-33-5509-1. doi: 10.62036/ISD.2023.22

10. Crawford, J., Butler-Henderson, K., Rudolph, J., Malkawi, B., Glowatz, M., Burton, R., et al. (2020). COVID-19: 20 countries’ higher education intra-period digital pedagogy responses. J. Appl. Learn. Teach. 3(1), 9–28. doi: 10.37074/jalt.2020.3.1.7

11. Czerniewicz, L., Agherdien, N., Badenhorst, J., Belluigi, D., Chambers, T., Chili, M., et al. (2020). A wake-up call: equity, inequality and COVID-19 emergency remote teaching and learning. Postdigit. Sci. Educ. 2(3), 946–967. doi: 10.1007/s42438-020-00187-4

12. Dhawan, S. (2020). Online learning: a panacea in the time of COVID-19 crisis. J. Educ. Technol. Syst. 49(1), 5–22. doi: 10.1177/0047239520934018

13. Díaz Vargas, C., Sanhueza Campos, C., and Maluenda Albornoz, J. (2025). Digital tools and active methodologies for online EFL learning during emergency remote learning: students’ perceptions of an action research approach. Logos: Rev. Lingüíst. Filos. Lit. 35(1). doi: 10.15443/RL3522

14. Emmanuel Olawale, B. (2024). “Inclusive innovations: promoting digital equity and inclusion through technological solutions,” in Education and Human Development, ed. X. Liu (IntechOpen). doi: 10.5772/intechopen.1005532

15. Escorcia-Guzmán, J. H., Zuluaga-Ortiz, R. A., Barrios-Miranda, D. A., and Delahoz-Dominguez, E. J. (2022). Information and communication technologies (ICT) in the processes of distribution and use of knowledge in higher education institutions (HEIs). Procedia Comput. Sci. 198, 644–649. doi: 10.1016/j.procs.2021.12.300

16. Fraile, J., Ruiz-Bravo, P., Zamorano-Sande, D., and Orgaz-Rincón, D. (2021). Evaluación formativa, autorregulación, feedback y herramientas digitales: uso de Socrative en educación superior [Formative assessment, self-regulation, feedback and digital tools: use of Socrative in higher education]. Retos 42, 724–734. doi: 10.47197/retos.v42i0.87067

17. García-Peñalvo, F. J., Corell, A., Abella-García, V., and Grande-de-Prado, M. (2020). “Recommendations for mandatory online assessment in higher education during the COVID-19 pandemic,” in Radical Solutions for Education in a Crisis Context: COVID-19 as an Opportunity for Global Learning, eds D. Burgos, A. Tlili, and A. Tabacco (Singapore: Springer), 85–98.

18. Gunsha, J. (2022). Use of digital tools in teaching methods – higher education learning. Int. J. Hum. Sci. Res. 2(35), 1–10. doi: 10.22533/AT.ED.5582352201111

19. Hernández, R. M., Cáceres, I. S., Zarate Hermoza, J. R., Coronado, D. M., Loli Poma, T. P., and Arévalo Gómez, G. R. (2019). Information and communication technology (ICT) and its practice in educational evaluation. Propósitos Represent. 7(2), 610. doi: 10.20511/pyr2019.v7n2.328

20. Isupova, N. I., Mamaeva, E. A., Masharova, T. V., and Tsygankova, M. N. (2021). Formation of universal competencies of undergraduates during development of the plot of web-quest. Eur. J. Contemp. Educ. 10(4), 943–957. doi: 10.13187/ejced.2021.4.943

21. Jain, S., and Alam, M. A. (2022). “Review of forthcoming ICT-enabled applications promoting learning in higher education,” in ICT with Intelligent Applications: Proceedings of ICTIS 2021, Vol. 1 (Singapore: Springer), 613–621.

22. Khajuria, R., Sharma, A., and Sharma, A. (2023). A detailed survey regarding the usage of different ICT technology modes adopted by higher education institutions. Indones. J. Electr. Eng. Comput. Sci. 29(3), 1634–1641. doi: 10.11591/ijeecs.v29.i3.pp1634-1641

23. Khan, Z. R., Sivasubramaniam, S., Anand, P., and Hysaj, A. (2021). e’-thinking teaching and assessment to uphold academic integrity: lessons learned from emergency distance learning. Int. J. Educ. Integr. 17, 1–27. doi: 10.1007/s40979-021-00079-5

24. Kumar, S. (2024). The effects of information and communication technology (ICT) on pedagogy and student learning outcome in higher education. EAI Endorsed Trans. Scalable Inf. Syst. 11(2), 1–5. doi: 10.4108/eetsis.4629

25. López-Nuñez, J.-A., Alonso-García, S., Berral-Ortiz, B., and Victoria-Maldonado, J.-J. (2024). A systematic review of digital competence evaluation in higher education. Educ. Sci. 14(11), 1181. doi: 10.3390/educsci14111181

26. Marcano, B., Ortega-Ruipérez, B., and Castellanos-Sánchez, A. (2023). Percepción de docentes y estudiantes de educación superior de los exámenes a libro abierto y supervisados en la pandemia por COVID-19 [Higher education teachers’ and students’ perceptions of open-book and proctored exams during the COVID-19 pandemic]. Educación XX1 26(1), 207–228. doi: 10.5944/educxx1.33514

27. Marinoni, G., Van’t Land, H., and Jensen, T. (2020). The impact of COVID-19 on higher education around the world. IAU Global Survey Report 23(1), 1–17. Available online at: https://www.iau-aiu.net/Covid-19-Higher-Education-challenges-and-responses

28. Marín, V. I., Tur, G., Castañeda, L., Peguera-Carré, M. C., Orellana, M. L., Villagrá, S., et al. (2025). Agencia y aprendizaje en la educación superior: una revisión sistemática [Agency and learning in higher education: a systematic review]. UTE Teach. Technol. 1:e4035. doi: 10.17345/ute.2025.4035

29. Martínez-Romera, D. D., and Cortés-Dumont, S. (2022). ICT and 360° evaluation: improving professional skills in higher education in Spain. Tuning J. High. Educ. 10(1), 89–111. doi: 10.18543/tjhe.2361

30. Mayer, R. E. (2024). The past, present, and future of the cognitive theory of multimedia learning. Educ. Psychol. Rev. 36(1), 1–25. doi: 10.1007/s10648-023-09842-1

31. Means, B., and Neisler, J. (2020). Suddenly Online: A National Survey of Undergraduates During the COVID-19 Pandemic. Digital Promise. Available online at: https://digitalpromise.dspacedirect.org/server/api/core/bitstreams/8175a6f9-ed37-412f-a3d1-d1e37ffd2c9d/content

32. Merhi, M. I., and Meisami, A. (2024). Boosting students’ engagement with web-based assessment platforms: a self-determination theory perspective. AIS Trans. Hum. Comput. Interact. 16, 216–236. doi: 10.17705/1thci.00205

33. Montenegro-Rueda, M., Luque-de la Rosa, A., Sarasola Sánchez-Serrano, J. L., and Fernández-Cerero, J. (2021). Assessment in higher education during the COVID-19 pandemic: a systematic review. Sustainability 13(19), 10509. doi: 10.3390/su131910509

34. Negi, S. (2020). “Role of ICT in research and development in higher education,” in Role of ICT in Higher Education, eds G. S. Latwal, S. K. Sharma, P. Mahajan, and P. Kommers (Palm Bay, FL: Apple Academic Press), 107–111.

35. Nolasco, P., and Ojeda, M. (2016). La evaluación de la integración de las TIC en la educación superior: fundamento para una metodología [The evaluation of ICT integration in higher education: foundations for a methodology]. Rev. Educ. Distancia (RED) 48, 1–24. Available online at: https://revistas.um.es/red/article/view/253511

36. OECD (2023). OECD Digital Education Outlook 2023: Towards an Effective Digital Education Ecosystem. Paris: OECD Publishing. doi: 10.1787/c74f03de-en

37. Oldfield, A., Broadfoot, P., Sutherland, R., and Timmis, S. (2012). Assessment in a Digital Age: A Research Review. Bristol: University of Bristol.

38. Pérez-Torregrosa, A. B., Gallego-Arrufat, M. J., and Cebrián-de-la-Serna, M. (2022). Digital rubric-based assessment of oral presentation competence with technological resources for preservice teachers. Estud. Sobre Educ. 43, 177–198. doi: 10.15581/004.43.009

39. Peters, M. D. J., Marnie, C., Tricco, A. C., Pollock, D., Munn, Z., Alexander, L., et al. (2020). Updated methodological guidance for the conduct of scoping reviews. JBI Evid. Synth. 18(10), 2119–2126. doi: 10.11124/JBIES-20-00167

40. Roman, C., Delgado, M. A., and García-Morales, M. (2020). Using process simulators in chemical engineering education: is it possible to minimize the “black box” effect? Comput. Appl. Eng. Educ. 28, 1369–1385. doi: 10.1002/cae.22307

41. Roman, C., Delgado, M. A., and García-Morales, M. (2021). Socrative, a powerful digital tool for enriching the teaching–learning process and promoting interactive learning in chemistry and chemical engineering studies. Comput. Appl. Eng. Educ. 29(6), 1542–1553. doi: 10.1002/cae.22408

42. Saldaña, D., and González, L. (2022). La práctica pedagógica en educación superior: una mirada desde los actores de la carrera de educación inicial (UNAE-Ecuador) [Pedagogical practice in higher education: a view from the actors of the early childhood education programme (UNAE-Ecuador)]. Rev. Estud. Exp. Educ. 21(46), 312–327. doi: 10.21703/0718-5162.v21.n46.2022.017

43. Sanz-Benito, I., Lázaro-Cantabrana, J. L., Grimalt-Álvaro, C., and Usart-Rodríguez, M. (2023). Formar y evaluar competencias en educación superior: una experiencia sobre inclusión digital [Training and assessing competencies in higher education: an experience in digital inclusion]. Rev. Iberoam. Educ. Dist. 26(2). doi: 10.5944/ried.26.2.35791

44. Selwyn, N. (2023). “The critique of digital education: time for a (post)critical turn,” in Rethinking Sociological Critique in Contemporary Education, eds R. Gorur, P. Landri, and R. Normand (Abingdon: Routledge), 48–62. doi: 10.4324/9781003279457

45. Silva-Quiroz, J., Canales-Reyes, R., and Garrido-Miranda, J. (2022). “Assessment of digital competencies in initial teacher training in Chile: what does the research say?,” in Digital Literacy for Teachers, eds Ł. Tomczyk and L. Fedeli (Singapore: Springer Nature), 163–189.

46. Sun, L., Tang, Y., and Zuo, W. (2020). Coronavirus pushes education online. Nat. Mater. 19, 687. doi: 10.1038/s41563-020-0678-8

47. Tahir, M. H. M., Saputra, S., Othman, S., Shah, D. S. M., Sulaiman, S. H., Azhari, M. A., et al. (2025). Online assessment in higher education: a systematic literature review. Multidiscip. Rev. 9(1), 1–10. doi: 10.31893/multirev.2026024

48. Tenório, T., Bittencourt, I. I., Isotani, S., and Silva, A. P. (2016). Does peer assessment in on-line learning environments work? A systematic review of the literature. Comput. Hum. Behav. 64, 94–107. doi: 10.1016/j.chb.2016.06.020

49. Tirado-Olivares, S., Cózar-Gutiérrez, R., García-Olivares, R., and González-Calero, J. A. (2021). Active learning in history teaching in higher education: the effect of inquiry-based learning and a student response system-based formative assessment in teacher training. Australas. J. Educ. Technol. 37(5), 61–76. doi: 10.14742/ajet.7087

50. Tricco, A. C., Lillie, E., Zarin, W., O'Brien, K. K., Colquhoun, H., Levac, D., et al. (2018). PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann. Intern. Med. 169(7), 467–473. doi: 10.7326/M18-0850

51. Tuah, N. A. A., and Naing, L. (2021). Is online assessment in higher education institutions during COVID-19 pandemic reliable? Siriraj Med. J. 73(1), 61–68. doi: 10.33192/Smj.2021.09

52. Upadhayay, R., Kaushik, H., Verma, K., and Mishra, P. (2022). “ICT in Indian higher education institutions: a review of challenges and opportunities during pandemic,” in 2022 10th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO) (IEEE), 1–8.

53. Veletsianos, G. (2020). Learning Online: The Student Experience. Baltimore, MD: JHU Press.

54. Wannapiroon, N., and Pimdee, P. (2022). Thai undergraduate science, technology, engineering, arts, and math (STEAM) creative thinking and innovation skill development: a conceptual model using a digital virtual classroom learning environment. Educ. Inf. Technol. 27(4), 5689–5716. doi: 10.1007/s10639-021-10849-w

55. Watermeyer, R., Crick, T., Knight, C., and Goodall, J. (2021). COVID-19 and digital disruption in UK universities: afflictions and affordances of emergency online migration. High. Educ. 81, 623–641. doi: 10.1007/s10734-020-00561-y

56. Wiliam, D. (2011). What is assessment for learning? Stud. Educ. Eval. 37(1), 3–14. doi: 10.1016/j.stueduc.2011.03.001

57. Zheng, Y., Li, H., and Zheng, T. (2018). “Performance evaluation of ICT-based teaching and learning in higher education,” in Blended Learning: Enhancing Learning Success. 11th International Conference, ICBL 2018, Osaka, Japan, July 31–August 2, 2018, Proceedings (Cham: Springer International Publishing), 378–390.

Keywords

assessment, digital technologies, higher education, scoping review, teaching

Citation

Núñez-Solís M, Garrido-Miranda JM, Chávez-Castillo Y, López-Jiménez T, Espinoza-Navarrete C and Sanhueza-Campos C (2026) Digital technologies in university assessment: a scoping review. Front. Educ. 11:1744310. doi: 10.3389/feduc.2026.1744310

Received

11 November 2025

Revised

29 March 2026

Accepted

30 March 2026

Published

24 April 2026

Volume

11 - 2026

Edited by

Aiedah Khalek, Monash University Malaysia, Malaysia

Reviewed by

Janus Van As, University of Witwatersrand, South Africa

Dr. Maryam Al-Hail, Community College of Qatar (CCQ), Qatar

Copyright

*Correspondence: Cristian Sanhueza-Campos

ORCID: Tatiana López-Jiménez, orcid.org/0000-0002-5870-6853; Cristhian Espinoza-Navarrete, orcid.org/0000-0003-3451-7238
