- Faculty of Education and Liberal Arts (FELA), INTI International University, Nilai, Malaysia
Many schools are using digital tools more frequently in day-to-day lessons, yet important national exams continue to be conducted on paper. This creates a practical tension: Students become familiar with digital work in class but are still required to perform in a paper-based setting during high-stakes assessments. This article introduces a small-scale change management initiative for schools that wish to explore a more sustainable way of integrating digital assessments. The focus is on urban secondary schools with students aged approximately 15 to 17 years. In developing the framework, Kotter’s 8-Step Model is used to organise the school-level process, while the Technology Acceptance Model (TAM) helps explain how teachers, students, and other staff members might respond to the transition. The model could later be examined in other urban settings to assess how far its ideas can be applied. The proposed pilot encourages schools to digitalise internal assessments, classroom tasks, and learning materials while maintaining readiness for paper-based national examinations through the use of occasional hybrid practices. To understand the effects of the pilot, the evaluation will draw on indicators such as student achievement, digital skill growth, levels of engagement, and environmental impact. These indicators can be supported by existing datasets, including national device-ownership figures and OECD reports. Overall, the framework aims to prepare schools gradually for future online and distance learning while maintaining stability in current examination performance.
1 Introduction
The rapid integration of digital technologies into education has transformed the way teaching, learning, and assessment are designed and delivered. In recent years, governments and educational institutions worldwide have embraced digitalisation as a means of improving access, enhancing efficiency, and aligning with the United Nations Sustainable Development Goals (SDGs), particularly SDG 4 on quality education and SDG 12 on responsible consumption (Anwar, 2023). Urban schools, often equipped with stronger infrastructure and higher technology adoption rates, are at the forefront of these initiatives. Digital platforms for homework submission, e-textbooks, and online formative assessments have become increasingly common, offering significant potential for reducing resource consumption and promoting environmentally sustainable practices (Pu et al., 2022).
However, in many education systems, high-stakes national examinations remain paper-based. This creates a potential mismatch between the digital-first environment of daily school life and the traditional format of final assessments (Bo, 2024). Students accustomed to typing essays and completing interactive online tasks may struggle to adapt to extended handwriting under examination conditions, while teachers must reconcile divergent assessment practices within their instructional planning. Moreover, fully digitalising internal assessments without accommodating the final examination format risks negatively affecting student performance, raising concerns among parents, educators, and policymakers about readiness for critical academic milestones.
Although research on digitalisation in education is growing, there is still limited focus on how to manage the transition from paper-based to digital systems at the school level. Existing literature tends to examine either the technical infrastructure or the pedagogical implications, with less attention given to the change management processes and the psychological responses of internal stakeholders such as teachers, students, and administrative staff (Howard et al., 2021; Teo, 2018). This gap is especially evident in the relatively privileged context of urban secondary education, where schools have the greatest opportunity to pilot effective and sustainable models for digitalisation, yet they remain constrained by centrally mandated assessment policies.
To address this gap, this conceptual article proposes a potential pilot framework for sustainable digitalisation in urban secondary schools serving students aged 15–17 years. The framework incorporates Kotter’s (2014) 8-Step Change Model to guide organisational procedures, alongside the Technology Acceptance Model (TAM) (Venkatesh and Davis, 2000), which helps explain user adoption and psycho-social preparedness. It outlines how internal assessments, homework, and learning resources could be digitalised while maintaining readiness for conventional paper-based national examinations through a hybrid assessment approach. Assessment criteria include educational outcomes, digital readiness, student engagement, and carbon footprint, drawing on secondary data from national statistics, OECD DigCompEdu indicators, and institutional reports.
By combining change management theory with user psychology, the model allows urban schools to adapt and scale implementation before extending it to rural contexts. It offers valuable insights for school leaders, policymakers, and international education bodies navigating the intersection of technological innovation, academic readiness, and equitable access to examinations in a post-COVID-19 world, with a focus on long-term sustainability. It thereby aligns local digitalisation efforts with broader educational reform goals and global sustainability commitments.
2 Literature review
2.1 Digitalisation and sustainability in secondary education
Digitalisation in secondary education is widely regarded as a transformative lever for improving learning quality, resource efficiency, and environmental sustainability. Pu et al. (2022) found that replacing printed textbooks with e-textbooks and digitising homework submissions can save approximately one million sheets of paper per year, directly supporting SDG 12 on responsible consumption (Anwar, 2023). Urban schools with more advanced infrastructure, reliable electricity, and robust internet connectivity have benefited the most, as they face minimal restrictions on the use of digital learning platforms (Bo, 2024).
Recent empirical evidence suggests that digital assessment reform is likely to advance rapidly within secondary education systems. This section synthesises findings from 2022 to 2025 to provide updated insights into digitalisation trends. Timotheou et al. (2022) and Pedaste et al. (2023) validated digital competence measurement instruments for young learners by integrating indicators of 21st-century skills into standard educational practices. These findings collectively illustrate that effective digitalisation depends not only on access to devices but also on systemic capacity development and institutional coherence.
However, not all schools benefit equally from digitalisation. In general, the adoption of technological tools by schoolteachers remains limited. In specific areas, such as rural education, the situation tends to be worse due to low connectivity, poor hardware, and a lack of teachers trained for this new teaching and learning environment. These disparities underscore the need for gradual implementation strategies, with resource-rich urban schools serving as sites for pilot initiatives (Howard et al., 2021). Moreover, although media reports often highlight the environmental benefits of digitalisation, few studies have examined the longitudinal effect of school digitalisation on student performance in high-stakes examinations. Real-world transitions to e-examinations have also been assessed recently. Domínguez-Figaredo et al. (2025) found that students adapting from paper-based to online examinations can gain confidence after several trials. Similarly, Bang and Lee (2023) found that national qualification pilots were as reliable as, and more efficient than, traditional paper-based tests. These results show that well-designed digital examination tools can support assessment validity and student performance while reducing logistical demands. Collectively, these studies highlight the need for an integrated organisational–psychological framework to guide sustainable digitalisation.
2.2 Educational change management in school reform
Fullan states that “wise and successful implementation of digital transformation in school systems is not only a matter of technology, but also an organisational learning challenge” (Fullan, 2025, p. 2). Established change management models offer structured pathways for how an organisation might transition from old systems to new ones (e.g., the 8-Step Change Model; Kotter, 2014). Research indicates that schools adopting systematic change management approaches integrate technology more safely and effectively within their systems (Burnes, 2019).
Kotter’s model is based on establishing a sense of urgency, building coalitions, and envisioning and institutionalising change within the organisational culture. Regarding digitalisation in schools, this involves providing information and communication about the benefits of the change, encouraging participative decision-making, and offering an ongoing professional development program to build capacity within the teaching workforce (Fullan, 2025). Schools that do not adopt structured change processes may face resistance from staff, and the reform will suffer from poor implementation, which ultimately impacts its sustainability.
In line with global studies, research also emphasises the role of teacher practice and feedback culture in the reform of digital assessments. Miliou et al. (2023) reported that incorporating reflection through digital self-assessment tools enhances students’ 21st-century skills when teachers scaffold the tools’ use and provide feedback that is both formative and effective. Viberg et al. (2024) found that, although teachers recognised the pedagogical affordances of digital assessment, their confidence and institutional support were inconsistent. These results support Fullan’s (2025) claim that professional development and participative decision-making are essential for achieving sustainable change.
2.3 Technology adoption and user psychology
According to Chen and Popovich (2003) and Venkatesh and Davis (2000), the willingness of internal stakeholders to adopt new systems is a critical factor influencing the success or failure of organisational digitalisation and transformation initiatives, particularly in relation to leadership readiness and supporting infrastructure. The Technology Acceptance Model (TAM; Figure 1) proposes that adoption behaviour is essentially determined by two beliefs: perceived usefulness and perceived ease of use. Consistent with the TAM and related models such as the UTAUT, teachers and students are more likely to use digital systems if they believe that the system enhances learning outcomes and eases their daily practices (Teo, 2018).
Figure 1. Integrated change management and technology acceptance framework for school digitalisation (source: adapted from Kotter, 2014 and Venkatesh and Davis, 2000).
Recent studies report that concerns about digital adoption are often linked to increased work pressure, traditional teaching roles, limited digital literacy, and anxiety over technology replacing conventional pedagogies (Howard et al., 2021). However, training designed to address these psychological barriers, ongoing support, and early involvement in the development of pilot initiatives can improve adoption rates (Bui and Nguyen, 2023). For secondary schools with relatively high levels of digital literacy, as is often the case in urban areas, the TAM therefore prompts two key questions: how to make technology use more efficient, and how to make it more sustainable.
2.4 Balancing digital learning with traditional examination readiness
Hybrid assessment models, which combine digital assessments with paper-based testing, have been regarded as a practical compromise for schools transitioning to digitalisation, particularly under the constraints of a largely paper-based national examination system (Graham et al., 2015). They help ensure that students develop proficiency in using digital tools for learning without losing handwriting speed and examination stamina.
When secondary schools implemented hybrid assessment models, evidence strongly suggested that conducting regular paper-based mock exams helped mitigate potential declines in performance on final national assessments (Bui and Nguyen, 2023). This is especially important in contexts where high-stakes examinations are not yet designed to capture the achievements of students performing at Level C2, which makes it necessary to assess dual competencies in both digital and traditional formats. Internal assessment rubrics must also be aligned with the national examination marking criteria to prevent disadvantaging students during the transition.
Overall, recent peer-reviewed studies from 2022 to 2025 provide contemporary evidence on system readiness, classroom adoption, and instrument-design innovation, which collectively inform the conceptual framework proposed in this study.
3 Theoretical framework
3.1 Overview
The development of the proposed urban secondary school digitalisation pilot draws on two synergistic theoretical frameworks: Kotter’s 8-Step Change Model (Kotter, 2014) and the Technology Acceptance Model (TAM) (Venkatesh and Davis, 2000). Kotter’s model offers a structured, phased approach to managing organisational change, while the TAM explains how individual perceptions influence the acceptance and use of new technologies. This study combines the two frameworks so that organisational change processes and the practical use of digital technologies are addressed alongside the psychological acceptance of school digitalisation, promoting successful and sustainable outcomes.
3.2 Kotter’s 8-step change model in school digitalisation
Kotter’s 8-Step Change Model provides a structured approach for implementing change in complex organisations, such as schools. The model comprises eight sequential steps: (1) create urgency, (2) build a guiding coalition, (3) form a strategic vision, (4) communicate the vision, (5) empower broad-based action, (6) generate short-term wins, (7) consolidate gains, and (8) anchor new approaches in the organisational culture. Taken together, these steps are particularly relevant when transitioning from pen-and-paper methods to digital systems.
In the context of this pilot:
• Creating urgency involves presenting compelling evidence, such as statistics from the United Nations Educational, Scientific and Cultural Organization (UNESCO) on the environmental impact of paper use in schools, to raise stakeholder awareness.
• Building a guiding coalition requires assembling a leadership team that includes administrators, technology coordinators, and teacher representatives.
• Forming and communicating the vision ensures that all internal stakeholders understand the purpose of the transition, aligning with Sustainable Development Goal (SDG) 4 on quality education and SDG 12 on responsible consumption.
• Empowering action includes providing training sessions and ensuring access to the necessary technology.
• Generating short-term wins could involve celebrating the successful completion of the first digital mock exam.
• Consolidating gains ensures that momentum continues, while anchoring the culture change solidifies digital assessment as a standard practice.
3.3 Technology Acceptance Model (TAM) for user psychology
The Technology Acceptance Model, developed by Venkatesh and Davis (2000), explains technology adoption through two primary determinants: perceived usefulness and perceived ease of use. In educational contexts, these determinants strongly influence the attitudes and behavioural intentions of both teachers and students towards new digital systems (Teo, 2018).
In this pilot design:
• Perceived usefulness may be enhanced by demonstrating how digital assessments streamline marking, enable faster feedback, and support environmental sustainability.
• Perceived ease of use can be addressed through user-friendly platforms, technical support, and practice sessions before official implementation.
• Positive perceptions lead to stronger attitudes towards using the system, which, in turn, influence behavioural intentions and ultimately the actual adoption of the system.
Addressing these variables early in the change process reduces resistance and fosters sustained engagement with new digital practices (Howard et al., 2021).
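Expressed compactly, the causal chain above can be summarised as a simplified path specification. This is an illustrative sketch rather than the exact model estimated by Venkatesh and Davis (2000), with the β terms denoting path weights to be estimated from acceptance-survey data:
A = β₁·PU + β₂·PEOU + ε₁
BI = β₃·A + β₄·PU + ε₂
Actual use ≈ f(BI)
where PU is perceived usefulness, PEOU is perceived ease of use, A is attitude towards use, and BI is behavioural intention.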
3.4 Integrated framework
The integration of Kotter’s process-oriented model with the TAM’s user-centred perspective ensures that both systemic change management and individual adoption behaviours are addressed simultaneously. For example, Step 4 of Kotter’s model (communicate the vision) can be strategically aligned with enhancing perceived usefulness in the TAM by highlighting the concrete benefits of digitalisation for teaching and learning. Similarly, Step 5 (empower action) can align with increasing perceived ease of use through targeted training and resource provision.
This dual-framework approach is expected to maximise the success of the pilot by:
1. Guiding leadership actions through a proven change management process.
2. Supporting teachers, students, and administrative staff in developing positive attitudes towards digitalisation.
3. Embedding the change into school culture while maintaining readiness for traditional high-stakes examinations.
The integrated framework is summarised in Figure 1, which illustrates how Kotter’s 8-Step Change Model (blue) aligns with the Technology Acceptance Model (green) to connect organisational change processes with user adoption behaviours in the proposed pilot.
Recent methodological research by Van Laar et al. (2024) on performance-based digital skills assessments for adolescents further supports the construct validity principles embedded in the present framework, illustrating how assessment instrument design can complement organisational change models.
3.5 Diagram description and interpretation
3.5.1 Title
Integrated Change Management and Technology Acceptance Framework for School Digitalisation.
The figure shows the sequential alignment of Kotter’s 8-Step Change Model (blue) with the TAM’s user-centred constructs (green), highlighting how specific change actions influence perceptions, attitudes, and the actual use of digital technologies in school digitalisation.
3.5.2 Structure
• Top horizontal flow (Kotter’s 8 Steps): eight labelled boxes connected sequentially, from “Create Urgency” to “Anchor New Approaches in Culture”.
• Bottom horizontal flow (TAM): beneath each relevant Kotter step, smaller boxes show TAM variables: “Perceived Usefulness,” “Perceived Ease of Use,” “Attitude,” “Behavioural Intention,” and “Actual Use”.
• Vertical connectors: lines from the specific Kotter steps to the related TAM variables (e.g., Step 4 “Communicate the Vision” → “Perceived Usefulness”; Step 5 “Empower Action” → “Perceived Ease of Use”).
• Context labels: on the left, “Organisational Process”; on the right, “User Psychology”.
This visual clearly illustrates the interaction between the two models, reinforcing the idea that organisational change actions can influence individual acceptance and adoption behaviours.
4 Proposed pilot design
This section presents the proposed pilot design and the conceptual framework for implementing digital transformation in schools. This study adopted a conceptual, literature-based design rather than an empirical methodology. No primary data were collected, and statistical analyses were therefore not applicable. Instead, the framework was derived through synthesis and theory integration across peer-reviewed literature.
4.1 Target context and participants
This pilot targets a well-resourced urban secondary school with stable broadband connectivity and device access. The student population comprises upper-secondary students aged 15–17 years across mixed-ability classes. Participating staff include subject teachers (English, Science, and Math), a digital lead, and exam office personnel. An ethics and safeguarding protocol is assumed for any data that may be collected.
4.2 Scope of digitalisation
• Learning resources: e-textbooks, LMS-hosted notes, and OER repositories.
• Assignments/homework: LMS submissions with plagiarism + AI-use declarations.
• Internal assessments: term quizzes, mid-term exams, and mock assessments delivered on secure, on-site devices; accommodations provided as needed (e.g., font size, screen readers).
• Records and reporting: digital gradebook, analytics dashboards, and automated feedback windows.
4.3 Hybrid assessment to preserve national examination readiness
• Termly paper-based full mock examinations: administered in the format of the national examination, including timings, booklet structure, and marking rubrics.
• Weekly digital formative checks: activities such as typing essays, completing item banks, and responding to auto-marked items.
• Bridging mini-tasks: exercises that follow the sequence plan on paper → type → print → annotate by hand (supporting skill transfer in both directions).
4.4 Implementation phases (mapped to Kotter + TAM)
• Phase 0: unfreeze/readiness (Weeks 1–4) — urgency communications; baseline surveys for perceived usefulness/ease of use; device audit; small demonstrations.
• Phase 1: limited roll-out (Weeks 5–12) — two subjects go digital for homework + one secure digital quiz; intensive helpdesk; celebrate first “quick wins”.
• Phase 2: core roll-out (Term 2) — expand to five to six subjects; first digital mid-term; conduct one paper mock; targeted PD for sceptical staff.
• Phase 3: consolidation (Term 3) — majority of internal exams digital; second paper mock; policy updates; codify workflows; student study-skills sessions for paper stamina.
• Phase 4: anchor and scale (Term 4) — refine SOPs; cost/impact report; rural adaptation plan; staff succession plan for continuity.
4.5 Safeguards and security
• Locked-down browsers; local server caching; daily backups; offline contingency packs; invigilation rules; AI-use disclosure forms; proctoring limited in line with in-school privacy standards.
5 Managing internal user psychology (TAM-informed)
5.1 Teachers
• Barriers: workload anxiety, loss of control, and fear of technical failure.
• Supports: co-planning time, templated rubrics, one-click LMS workflows, peer coaching, and visible administrative backing.
• Messaging (perceived usefulness): faster feedback, data to differentiate, and reduced marking load for objective items.
• Messaging (perceived ease of use): step-by-step checklists, sandbox practice, and on-call tech buddies.
5.2 Students
• Barriers: typing speed limitation, screen fatigue, and concern about glitches affecting grades.
• Supports: touch-typing warm-ups, timed digital → paper drills, device loaners, and a clear appeals process for disruptions.
• Motivation hooks: instant feedback, goal trackers, and micro-badges for revision streaks.
5.3 Administrative staff
• Barriers: role redefinition and data-protection burden.
• Supports: SOPs for exam creation/export, incident logs, basic cybersecurity training, and a rotating schedule for helpdesk coverage.
6 Evaluation framework
To further enhance the application of the model, future research could adopt a mixed-methods approach that combines qualitative and quantitative evaluations. Qualitative methods such as semi-structured interviews, focus group discussions, and classroom observation could capture teachers’ and students’ experiences of readiness, their engagement with digital assessment practices, and perceived barriers to implementation. Quantitative methods, such as survey-based measures and statistical comparisons of digital versus paper-based outcomes, offer the advantage of objectively assessing learning effectiveness and observed behaviour. Integrating the two types of analysis will broaden understanding of whether, and how, the model can be applied across various educational settings.
6.1 Outcomes and indicators
• Academic: internal examination means/SD versus a matched control school; performance trends in paper-based mock examinations.
• Engagement: LMS login frequency, on-time submissions, and survey scale scores (motivation, self-efficacy).
• Digital skills: typing speed (WPM), file management tasks, and plagiarism-free submission rates.
• Sustainability: number of paper sheets saved, printing costs avoided, and estimated CO₂e reduction (an illustrative calculation follows this list).
• Acceptance (TAM): perceived usefulness/ease of use (Likert), intention to use, and open-ended sentiments.
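To make the sustainability indicator concrete, the sketch below shows one way a pilot team might derive it from facilities printing logs. It is a minimal illustration only: the per-sheet CO₂e factor, the printing totals, and the ream cost are assumed placeholders rather than measured or published values, and should be replaced with locally sourced figures.

```python
# Minimal sketch of the sustainability indicator (illustrative assumptions only).
SHEETS_PER_REAM = 500
ASSUMED_KG_CO2E_PER_SHEET = 0.005  # placeholder factor; substitute a locally sourced estimate

def sustainability_indicators(baseline_sheets: int, pilot_sheets: int, cost_per_ream: float) -> dict:
    """Compare a pilot term's printing volume against a baseline term."""
    sheets_saved = baseline_sheets - pilot_sheets
    return {
        "sheets_saved": sheets_saved,
        "reduction_pct": round(100 * sheets_saved / baseline_sheets, 1),
        "printing_cost_avoided": round(cost_per_ream * sheets_saved / SHEETS_PER_REAM, 2),
        "estimated_co2e_kg": round(sheets_saved * ASSUMED_KG_CO2E_PER_SHEET, 1),
    }

# Hypothetical termly totals for illustration only.
print(sustainability_indicators(baseline_sheets=120_000, pilot_sheets=90_000, cost_per_ream=12.0))
```

Under these hypothetical figures, the school would record a 25% reduction in paper use, within the ≥ 15–30% target range set out in Section 6.3.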
6.2 Study design (conceptual)
• Comparison: a pilot school versus a similar urban non-pilot school (or a pre–post comparison within the pilot school).
• Data sources: LMS analytics, examination office records, brief student/teacher surveys, facilities printing logs, and ICT helpdesk tickets.
• Analysis ideas:
• Descriptives + effect sizes (Cohen’s d) for achievement deltas (see the illustrative sketch after this list).
• Trend lines for paper usage and late submissions.
• Cross-tabs: acceptance levels × submission punctuality.
• Thematic coding of open responses (barriers, enablers).
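As an illustration of the effect-size calculation flagged above, the following sketch computes Cohen’s d for the achievement delta between a pilot cohort and a comparison cohort. The scores are invented for demonstration, and the pooled standard deviation uses a simple equal-group-size approximation; a real analysis would apply the sample-size-weighted formula to the school’s actual examination records.

```python
import statistics

def cohens_d(pilot_scores: list[float], control_scores: list[float]) -> float:
    """Standardised mean difference using an equal-n approximation of the pooled SD."""
    mean_diff = statistics.mean(pilot_scores) - statistics.mean(control_scores)
    pooled_sd = ((statistics.stdev(pilot_scores) ** 2 +
                  statistics.stdev(control_scores) ** 2) / 2) ** 0.5
    return mean_diff / pooled_sd

# Hypothetical mock-examination percentages for illustration only.
pilot = [62, 58, 71, 66, 74, 69, 60, 65]
control = [61, 55, 70, 63, 72, 66, 59, 64]
print(f"Cohen's d = {cohens_d(pilot, control):.2f}")
```

A d near zero or above would suggest that the pilot cohort is not disadvantaged relative to the comparison school, consistent with the “no decline” success criterion in Section 6.3.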
6.3 Success criteria
• No decline in paper-based mock examination scores compared to baseline performance.
• A ≥ 15–30% reduction in paper usage by Term 3.
• Meaningful improvements in perceived usefulness/ease of use.
• Reduced teacher marking time for objective components.
7 Policy and practice implications
• Urban-first, rural-aware: publish an adaptation kit, including offline caching strategies, device pools, and community Wi-Fi hubs.
• Procurement and training: emphasise that the total cost of ownership for digital solutions can outperform ad-hoc printing; implement micro-credential PD tracks.
• Assessment alignment: engage with the ministry on long-term e-assessment plans; in the interim, standardise hybrid rubrics.
• Equity: implement device-loan policies, ensure accessibility defaults, and provide transparent accommodations.
8 Conclusion
This pilot offers a scalable, evidence-informed pathway for digitalising internal assessments while maintaining readiness for paper-based national examinations. By combining Kotter’s process with the TAM’s psychology, schools can implement digital practices efficiently while maintaining staff and student buy-in. The evaluation plan allows leaders to track learning, sustainability, and acceptance outcomes, informing policy decisions and supporting future ODL readiness. Key pilot milestones, associated evidence indicators, potential risks, and mitigation strategies are presented in Table 1, providing a concise operational overview of the proposed framework. In addition, Table 2 summarises the evaluation matrix, linking each outcome domain to measurable indicators, data sources, and performance targets and providing an operational guide for assessing implementation effectiveness.
Data availability statement
The original contributions presented in the study are included in the article/supplementary material; further inquiries can be directed to the corresponding author.
Author contributions
WC: Conceptualization, Writing – original draft, Writing – review & editing.
Funding
The author(s) declared that financial support was not received for this work and/or its publication.
Conflict of interest
The author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The author(s) declared that Generative AI was used in the creation of this manuscript. The author acknowledges the use of Grammarly for language refinement and Mermaid for generating schematic visualisations (Figure 1). These tools were used solely for linguistic editing and diagram rendering. All intellectual content, conceptual design, and interpretations are the author’s own.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Anwar, N. (2023). Report review: reimagining our futures together: a new social contract for education, edited by Mary de Sousa, Paris, United Nations Educational, Scientific and Cultural Organization (UNESCO), 2021, 186 pp., ISBN 978-92-3-100478-0. Global. Soc. Educ. 23, 1–3. doi: 10.1080/14767724.2023.2269103
Bang, M., and Lee, Y. (2023). Pilot study on the digitalization of the national qualification exam for Korean engineers. Educ. Inf. Technol. 29, 21–50. doi: 10.1007/s10639-023-12279-2
Bo, N. S. W. (2024). OECD digital education outlook 2023: towards an effective education ecosystem. Hung. Educ. Res. J. doi: 10.1556/063.2024.00340
Bui, T. T., and Nguyen, T. S. (2023). The survey of digital transformation in education: a systematic review. Int. J. TESOL Educ. 3, 32–51. doi: 10.54855/ijte.23343
Burnes, B. (2019). The origins of Lewin’s three-step model of change. J. Appl. Behav. Sci. 56, 32–59. doi: 10.1177/0021886319892685
Chen, M., and Popovich, K. (2003). Understanding customer relationship management (CRM): people, process and technology. Bus. Process Manag. J. 9, 672–688. doi: 10.1108/14637150310496758
Graham, S., Harris, K. R., and Santangelo, T. (2015). Research-based writing practices and the common core. Elem. Sch. J. 115, 498–522. doi: 10.1086/681964
Howard, S. K., Tondeur, J., Ma, J., and Yang, J. (2021). What to teach? Strategies for developing digital competency in preservice teacher training. Comput. Educ. 165:104149. doi: 10.1016/j.compedu.2021.104149
Kotter, J. P. (2014). Accelerate: Building strategic agility for a faster-moving world. Boston, MA: Harvard Business Review Press.
Miliou, O., Adamou, M., Mavri, A., and Ioannou, A. (2023). An exploratory case study of the use of a digital self-assessment tool of 21st-century skills in makerspace contexts. Educ. Technol. Res. Dev. 72, 239–260. doi: 10.1007/s11423-023-10314-0
Pedaste, M., Kallas, K., and Baucal, A. (2023). Digital competence test for learning in schools: development of items and scales. Comput. Educ. 203:104830. doi: 10.1016/j.compedu.2023.104830
Pu, R., Tanamee, D., and Jiang, S. (2022). Digitalization and higher education for sustainable development in the context of the COVID-19 pandemic: a content analysis approach. Probl. Perspect. Manage. 20, 27–40. doi: 10.21511/ppm.20(1).2022.03
Teo, T. (2018). Students and teachers’ intention to use technology: assessing their measurement equivalence and structural invariance. J. Educ. Comput. Res. 57, 201–225. doi: 10.1177/0735633117749430
Timotheou, S., Miliou, O., Dimitriadis, Y., Sobrino, S. V., Giannoutsou, N., Cachia, R., et al. (2022). Impacts of digital technologies on education and factors influencing schools’ digital capacity and transformation: a literature review. Educ. Inf. Technol. 28, 6695–6726. doi: 10.1007/s10639-022-11431-8
Van Laar, E., Van Deursen, A. J. A. M., Helsper, E. J., and Schneider, L. S. (2024). Developing performance tests to measure digital skills: lessons learned from a cross-national perspective. Media Commun. 13. doi: 10.17645/mac.8988
Venkatesh, V., and Davis, F. D. (2000). A theoretical extension of the technology acceptance model: four longitudinal field studies. Manag. Sci. 46, 186–204. doi: 10.1287/mnsc.46.2.186.11926
Keywords: educational change management, hybrid assessment, open and distance learning readiness, secondary education digitalisation, sustainable development goals, technology acceptance model
Citation: Chi WJ (2026) Designing a pilot model for fully digitalised internal examinations in urban secondary schools: a sustainable transition framework. Front. Educ. 11:1684055. doi: 10.3389/feduc.2026.1684055
Edited by:
Mevlut Aydogmus, Necmettin Erbakan University, Türkiye
Reviewed by:
Luis Arturo Avila-Meléndez, Instituto Politecnico Nacional Centro Interdisciplinario de Investigacion para el Desarrollo Integral Regional Unidad Michoacan, Mexico
Alejandro Higuera Zimbron, Universidad Autónoma del Estado de México, Mexico
Copyright © 2026 Chi. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Wen Jye Chi, i24029944@student.newinti.edu.my