- 1 Department of Medicine, College of Medicine, Umm Al-Qura University, Makkah, Saudi Arabia
- 2 Department of Community Medicine and Pilgrims Healthcare, Umm Al-Qura University (UQU), Makkah, Saudi Arabia
Background and objective: Entrustable professional activities (EPAs) are tasks that medical professionals can be entrusted to perform in unsupervised settings once they have achieved sufficient specific competencies. Despite their importance, nationally validated neurology EPAs are lacking in Saudi Arabia. This study aimed to develop and validate neurology EPAs in Saudi Arabia.
Methods: A list of neurology EPAs was developed after an extensive review of existing neurology training program competencies and EPAs. Neurology experts were invited to participate in two rounds of a modified Delphi technique to review the list of EPAs and assess their relevance and representativeness using a 5-point Likert scale. In total, 21 neurologists participated in the study. Descriptive statistics were used to describe participants’ demographic characteristics and group responses to each EPA. Cronbach’s alpha was used to assess the internal consistency of the responses across both Delphi rounds.
Results: In the first round, 26 EPAs were retained, 10 were excluded owing to a lack of relevance, and one of the retained EPAs was a modification of an existing item. In the second round, one more EPA was excluded because of a lack of relevance, resulting in a final set of 25 neurology EPAs.
Conclusion: This study developed and content-validated a set of EPAs for neurology residency training in Saudi Arabia. It represents an initial step toward implementing an EPA-based curriculum. Further steps are necessary to ensure adequate integration into training programs.
1 Introduction
Entrustable professional activities (EPAs) are tasks that medical professionals can be entrusted to perform in unsupervised settings once they have achieved sufficient specific competencies (1). EPAs integrate multiple competencies or milestones in real clinical settings and reflect tasks essential for safe and effective practice (2). These activities result from the accumulation of knowledge and skills related to specific specialties. Since their introduction in 2005 (3), EPAs have increasingly been used to define healthcare curricula (4), and EPA-based curricula have been implemented across different stages of medical education (5–12).
In recent years, several specialties, such as internal medicine, anesthesiology, psychiatry, pediatrics, orthopedics, and physical medicine and rehabilitation, have established EPA frameworks to better align training outcomes with real-world clinical practice (6, 8, 13–22). In 2020, the Royal College of Physicians and Surgeons of Canada adopted neurology EPAs within a competency-based framework (22). Furthermore, a Singaporean group developed five neurology EPAs for workplace assessment (23), although the results of their study have not yet been published. Despite these advances, structured EPA development in neurology has received relatively limited attention, particularly in Saudi Arabia.
The neurology residency program in Saudi Arabia was established in 1999 under the Saudi Commission for Health Specialties (SCFHS). Initially designed as a 4-year program, it was extended to 5 years in 2016 to better accommodate evolving training needs (24). The program aims to graduate competent clinical neurologists, with core aspects of training including interpersonal and communication skills, professional development, ethical conduct, integration of evidence-based medicine into clinical practice, and quality assurance. The program is competency-based, with specific competencies defined according to the level of training and clinical rotations (24).
Competency-based frameworks provide valuable guidance for learners, supervisors, and institutions regarding teaching and assessment (25). Despite the emphasis on competencies, their integration into the clinical practice of graduating physicians remains limited (25). Competencies describe an individual's ability to perform specific tasks, whereas EPAs specify the professional activities that can be entrusted to a trainee (25).
Given the variation in healthcare systems and educational structures across countries, adopting EPA models developed elsewhere may not align with local practices. Therefore, developing EPAs specific to national or regional contexts is essential. A large project aimed at developing and validating EPAs across several internal medicine fellowship programs in Saudi Arabia is underway, with a few publications available to date (17–19). As part of this project, the present study aimed to develop and validate end-of-training EPAs for neurology in Saudi Arabia.
2 Materials and methods
This study aimed to develop and validate a consensus-based list of EPAs for neurology residency programs in Saudi Arabia, using a modified Delphi technique. The study was conducted in two phases: a preparatory phase involving expert review and EPA generation, followed by two rounds of a modified Delphi technique.
2.1 Standard protocol approvals, registrations, and patient consents
Ethical approval was obtained from the Institutional Review Board of Umm Al-Qura University, Makkah, Saudi Arabia (IRB approval number HAPO-02-K-012-2024-04-2098). The study complies with the Declaration of Helsinki. Informed consent was obtained from all participants. The study design and flow are illustrated in Figure 1.
The target participants comprised board-certified neurologists practicing in Saudi Arabia with a minimum of 3 years of clinical experience and active involvement in postgraduate neurology training programs; those with exactly 3 years of experience were also eligible if they had at least 2 years of active engagement as a trainer in postgraduate training. Participants were recruited using purposive sampling to ensure diversity across institutional affiliations and geographic regions. In total, 45 expert neurologists were invited, each receiving an orientation package outlining the study’s objectives, methodology, and expectations.
2.2 Phase 1: development of preliminary EPA list
In the preliminary phase, a team of two senior neurologists and two medical education experts conducted a thorough review of national and international neurology training competencies, including the SCFHS neurology training objectives, the Royal College of Physicians and Surgeons of Canada neurology residency competencies and related EPAs, and the Neurology Core Competencies Outline of the American Board of Psychiatry and Neurology (22, 24, 26, 27). Each senior neurologist had more than 10 years of experience and active participation in postgraduate training. Based on this review, a list of 36 preliminary neurology EPAs was generated and categorized into four domains: clinical assessment, clinical management, procedures, and transferable skills. This list served as the foundation for the modified Delphi rounds.
2.3 Phase 2: Delphi rounds
A two-round modified Delphi technique was employed to refine and validate the proposed EPAs. The Delphi panel included experts who met the eligibility criteria and agreed to participate. Demographic data, including years of experience in neurology and training, were collected prior to the first round.
In the first round, participants rated each of the 36 preliminary EPAs for relevance and representativeness on a 5-point Likert scale (1 = not important/relevant, 5 = very important/relevant). In addition to quantitative ratings, participants were invited to provide written qualitative feedback on each EPA. These comments were systematically reviewed by the research team to identify ambiguities, contextual limitations, or suggestions for modification. EPAs that did not reach consensus but received constructive feedback were revised and rephrased for reconsideration in the second round rather than being excluded outright. This combined analysis of quantitative ratings and qualitative feedback ensured that the final EPA list reflected both statistical consensus and expert insights.
Based on the results of the first round, 10 EPAs with a mean score <4.0 or <80% agreement (defined as the proportion of panelists rating the EPA as 4 or 5) were excluded, and one EPA was modified based on expert comments. The revised list was then carried forward to the second round.
In the second round, the same panel of experts reviewed the revised list of EPAs and reassessed their relevance and representativeness using the same Likert scale. The aim of this round was to confirm consensus on retained EPAs and to assess agreement with the modifications made following the first round. Panelists were given the opportunity to provide further comments on the pre-final list and were asked to indicate their agreement with the introduced modifications.
2.4 Data analysis
Quantitative data from each Delphi round were analyzed using descriptive statistics, including the mean, standard deviation, and percentage agreement. Consensus was defined as a mean rating ≥4.0, with at least 80% of panelists rating the EPA as 4 or 5. Although Likert data are ordinal, using mean scores and percentage agreement thresholds is widely accepted in Delphi studies, particularly in EPA development research, to operationalize consensus and enable direct comparison across rounds (17–20). This approach was therefore adopted to maintain methodological consistency with previous national Delphi studies and to facilitate interpretation of results.
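To make the consensus rule concrete, the following minimal Python sketch (illustrative only; the ratings and EPA labels are hypothetical and do not reproduce the authors' analysis) computes the mean, standard deviation, and percentage agreement for a set of 21 panelist ratings and applies the ≥4.0 mean and ≥80% agreement thresholds.

import statistics

# Hypothetical 5-point Likert ratings from 21 panelists for two illustrative EPAs.
ratings = {
    "EPA_history_and_examination": [5, 5, 4, 5, 4, 4, 5, 5, 4, 5, 4, 4, 5, 5, 4, 5, 4, 5, 4, 5, 4],
    "EPA_genetic_testing":         [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 5, 3, 4, 2, 3, 4, 3, 3, 4, 2, 3],
}

def summarize(scores, mean_cutoff=4.0, agreement_cutoff=0.80):
    """Return mean, SD, proportion of 4-5 ratings, and the consensus decision."""
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    agreement = sum(s >= 4 for s in scores) / len(scores)
    return mean, sd, agreement, (mean >= mean_cutoff and agreement >= agreement_cutoff)

for epa, scores in ratings.items():
    mean, sd, agreement, consensus = summarize(scores)
    print(f"{epa}: mean = {mean:.2f}, SD = {sd:.2f}, agreement = {agreement:.0%}, "
          f"decision = {'retain' if consensus else 'exclude or revise'}")

In this sketch, the first hypothetical EPA would be retained (mean approximately 4.5, 100% agreement), whereas the second would be excluded or revised (mean approximately 3.3, roughly 38% agreement), mirroring the decision rule applied between rounds.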
EPAs that did not meet the consensus threshold were either excluded or reconsidered for modification if supported by constructive qualitative feedback. Qualitative feedback provided by panelists in free-text form was analyzed thematically by four authors, who reviewed and coded comments to identify ambiguities, contextual limitations, or suggestions for modification. These insights were then integrated with the quantitative results to guide decisions about revising or excluding EPAs between rounds.
Internal consistency of panel ratings was assessed using Cronbach’s alpha with 95% confidence intervals, a widely used reliability index in Delphi studies. Although interrater reliability indices, such as the intraclass correlation coefficient, can provide additional insights, they were not applied here, as the focus was on consistency of ratings across items rather than agreement between raters.
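As a point of reference, for k items (here, the EPAs) each rated by the panel, Cronbach's alpha is conventionally defined as

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_{i}^{2}}{\sigma_{X}^{2}}\right),

where \sigma_{i}^{2} is the variance of the ratings for item i and \sigma_{X}^{2} is the variance of panelists' total scores across all items. Values above 0.9 are generally interpreted as excellent internal consistency.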
3 Results
3.1 Preliminary phase: development of EPA list
The draft EPA list was reviewed internally and finalized for evaluation in the Delphi rounds.
3.2 Second phase
Of the 45 invited experts, 21 (46.7%) completed both rounds of the Delphi survey. Most were from the Western region of Saudi Arabia (62%), with over 71% having more than 10 years of experience in neurology. However, 61.9% had fewer than 2 years of experience as residency program trainers, reflecting the recent expansion of the neurology program and the recruitment of new faculty members. The demographic characteristics of the participants are summarized in Table 1.
Table 1. Demographic characteristics of participants in the first round of the modified Delphi technique.
EPAs that met the predefined consensus threshold (mean score of ≥4.0, ≥80% agreement) were retained, while those that did not were either excluded or revised based on qualitative feedback. Thematic analysis of panelists’ comments identified three main themes guiding EPA revisions between rounds: (1) clarifying the scope of each EPA to align with general neurologist competencies (e.g., excluding subspecialty tasks such as muscle biopsy interpretation), (2) considering variability in resources and practice settings (e.g., access to genetic testing), and (3) distinguishing neurologist roles from allied health tasks (e.g., preventive counseling).
Consequently, 25 EPAs (69%) achieved consensus and were retained, 10 were excluded owing to insufficient relevance or agreement, and one was revised based on expert comments. Detailed ratings, standard deviations, and levels of disagreement are presented in Table 2.
The updated list of 26 EPAs was carried forward to the second round, which all 21 participants completed. Twenty-five EPAs achieved strong agreement (mean rating of ≥4 and ≥80% of panelists in agreement); one EPA was excluded owing to a lack of relevance, resulting in a final set of 25 EPAs (Table 3).
Decisions were guided by both predefined quantitative criteria and qualitative input, ensuring a transparent audit trail. For example, one EPA initially combined electroencephalography (EEG), nerve conduction studies (NCS), electromyography (EMG), and muscle biopsy interpretation; removal of muscle biopsy interpretation, a subspecialty skill, improved consensus in round two.
3.3 Reliability assessment
The reliability of responses across both modified Delphi rounds was evaluated using Cronbach's alpha, which was 0.95 in Round 1 and 0.91 in Round 2, indicating excellent internal consistency of expert ratings. These findings support the reliability and consistency of consensus decisions across both rounds.
4 Discussion
This study successfully developed and validated a set of 25 EPAs tailored to neurology residency programs in Saudi Arabia. The process followed a structured methodology involving expert consensus, using a modified Delphi technique to ensure content validity, contextual relevance, and alignment with international standards.
The initial development of 36 preliminary EPAs was grounded in a comprehensive review of existing neurology curricula from the SCFHS, Royal College of Physicians and Surgeons of Canada, and American Board of Psychiatry and Neurology. The categorization of EPAs into clinical assessment, clinical management, procedures, and transferable skills mirrors the core competencies expected of neurology residents globally, while adapting them to the Saudi training context.
A modified Delphi method involving two survey rounds was employed. Although most participating experts had extensive clinical experience, many had limited experience in supervising residency training: while 71% of respondents had over 10 years of experience in neurology, 61.9% had fewer than 2 years of experience in residency training supervision. This likely reflects the recent expansion of neurology programs and the recruitment of new faculty members across Saudi Arabia. Nevertheless, the panel’s demographic diversity enriched the consensus process by ensuring a wide range of clinical and educational perspectives.
Several EPAs were excluded during the Delphi rounds because of perceived irrelevance or variability in practice across institutions, highlighting differences in neurology practice patterns within Saudi Arabia. One such example is EPA A9 (Table 2), which concerned genetic testing for neurological disorders. Although recognition of the genetic background of many neurological disorders is increasing (28–30), genetic testing is limited to highly specialized centers. Our preliminary EPAs included the ability to perform appropriate genetic testing for neurological disorders. However, most adult neurology graduates are expected to practice in primary or secondary care centers, where such testing is typically unavailable. In routine clinical practice, patients with suspected genetic disorders are typically referred to specialized centers. Consequently, this EPA did not achieve a mean score of 4 owing to high variability in responses and was excluded from the pre-final list.
One modified EPA, originally covering the interpretation of EEG, NCS, EMG, and muscle biopsy (EPA A10), received mixed consensus. While experts agreed on the importance of EEG, NCS, and EMG for general neurologists, they considered muscle biopsy interpretation as a subspecialty task. Accordingly, the EPA was revised to exclude muscle biopsies. This modification improved consensus in the second round, achieving a mean score of 4.143 (SD = 0.655) and 86% agreement.
The distinction between reading and interpreting diagnostic tests has emerged as a key point. Although full interpretation may require subspecialty training (such as in epilepsy or neuromuscular disorders), the ability to understand and integrate test results into clinical decision-making remains essential for general neurology practice. EEG and neuromuscular rotations are mandatory components of various neurology programs (24, 26, 27), and specific competencies have been established for these rotations. Although EEG is an important diagnostic test in neurology, reading EEGs without supervision can be challenging owing to limited exposure and training (29). A prior study revealed that approximately 43% of neurology residents are unable to read EEGs independently (30). Educational activities related to EEG instruction vary widely across different programs (31), and the number of EEGs performed each month differs by training center (32). The availability of an epileptologist strongly influences the quality of EEG training. Therefore, a general neurologist is not expected to read an EEG independently but must be able to interpret its results. Similar principles apply to NCS and EMG tests, which require adequate exposure, usually obtained through focused subspecialty training. In our end-of-training EPAs, we emphasized the interpretation of EEG, NCS, and EMG (EPA A8) (Table 2) rather than their technical reading, as interpretation represents an essential competency for general neurologists.
Another debated domain was preventive care and counseling. While acknowledged as important, several respondents perceived this area as overlapping with the responsibilities of allied health professionals, such as nutritionists and health educators. Consequently, some EPAs related to preventive care did not reach consensus. Although preventive therapy and counseling are integral to managing neurological disorders, most respondents felt that these were not exclusive EPAs for neurologists, as they can be provided by other health specialists.
The final list of EPAs in adult neurology comprised 25 core activities representing essential skills that must be acquired by graduates of neurology programs for independent practice. The development of an educational program is a dynamic process designed to accommodate rapid changes in clinical practice. Competency-based curricula emphasize objective outcomes rather than fixed training durations (31), which can be challenging because training programs allocate fixed periods to each rotation. Moreover, traditional curricula may not directly assess whether trainees can perform tasks unsupervised. The Saudi Board of Neurology has identified and mapped specific competencies for each training level (24). Applying an EPA-based curriculum helps integrate different competencies into clinical practice, ensuring that neurology graduates can be trusted in unsupervised settings. This concept may bridge the gap between theoretical knowledge and clinical practice (32), ultimately enhancing patient outcomes and safety.
EPAs offer several advantages over traditional competency-based frameworks. Although competencies define individual skills and knowledge, EPAs integrate them into coherent units of professional practice, thereby enabling entrustment decisions. This approach reflects the complexity of real-world clinical performance and enhances the validity of workplace-based assessments (33).
Implementing EPAs into neurology residency training programs requires a multistep process encompassing curricular mapping, faculty development, and assessment redesign. Each EPA should be aligned with existing competencies and mapped to the appropriate training year and supervision level, ranging from direct observation to independent practice and, ultimately, supervision of others. For example, early EPAs (such as taking a comprehensive patient history) may be expected of postgraduate year (PGY)-1 residents, whereas complex EPAs (such as interdisciplinary coordination) may apply to PGY-5 residents, who should be capable of supervising junior residents. A previously published national initiative (23) described an approach for implementing neurology EPAs using the Consolidated Framework for Implementation Research as an organizing framework.
Standardizing the assessment of EPAs is a critical prerequisite for successful implementation. Transitioning to an EPA-based curriculum requires a paradigm shift from traditional numerical grading to entrustment decisions based on the level of supervision a trainee requires (31). This approach focuses on the readiness of the trainee to perform specific tasks independently rather than solely on knowledge acquisition. The assessment of EPAs can be effectively integrated into daily clinical practice through various strategies, including direct observation, structured knowledge testing, simulation-based evaluations, and both short- and long-term workplace assessments (31).
Faculty development is an essential step before implementing an EPA-based curriculum (34, 35) to ensure accurate entrustment decisions. Trainers must be equipped to observe EPA performance effectively and to provide constructive, actionable, and timely feedback in clinical settings (34, 35). Without adequate faculty preparation, the educational potential of EPAs may not be fully realized. Furthermore, following implementation, regular reviews and revisions of EPA-based programs are essential to accommodate ongoing changes in clinical practice, technology, and patient expectations.
Neurology training programs vary globally (36–38), and the generalizability of our results may be affected by the availability of different tests and facilities. In some countries lacking formal subspecialty training programs (39, 40), general neurologists are expected to manage complex cases and perform procedures not included in our EPA set. Moreover, although the validated EPAs in this study reflect the Saudi context, they may also inform EPA development in similar regions, particularly where subspecialty resources are limited.
Despite the methodological rigor of this study, it has some limitations. First, although efforts were made to recruit a nationally representative panel, most participants were from the Western region of Saudi Arabia, which may have limited the generalizability of the results across other regions with different training structures or resources. Second, although the panel included experienced neurologists, a substantial proportion had limited experience as formal trainers, which may have influenced their perspectives on supervision and trust. Third, a subset of the panel included early-career neurologists (3–5 years of experience), whose limited exposure to training supervision may have affected their perspective on the applicability of EPAs. Fourth, the study relied on expert consensus rather than empirical validation in clinical settings. The actual implementation and evaluation of EPAs in clinical settings, specifically, their impact on trainee performance, supervision dynamics, and patient care outcomes, were not assessed. Future studies are needed to evaluate how these EPAs function in practice and explore the inter-rater reliability of entrustment decisions. Finally, the exclusion of certain EPAs, particularly those related to pediatric neurology or advanced diagnostic testing, was based on contextual relevance and resource availability. Although appropriate for the Saudi setting, this may limit the applicability of the EPAs to other countries or institutions with a broader scope of practice.
In conclusion, this study successfully developed and validated a set of 25 EPAs tailored to neurology residency training in Saudi Arabia. These EPAs represent essential tasks that graduating neurologists should be able to perform independently and reliably in clinical settings. The finalized EPA framework provides a foundational step toward implementing a workplace-based, outcome-driven curriculum in neurology postgraduate education. By integrating EPAs into training programs, educators can better align assessments with real-world clinical responsibilities and support structured entrustment decisions.
This study contributes to the limited body of literature on region-specific EPA development in adult neurology training. Its regional relevance enhances its potential impact on national curriculum reform and serves as a model for similar initiatives across other specialties and countries with comparable training environments. However, the effective adoption of EPAs requires additional steps, including curricular mapping, faculty development, and the standardization of assessment tools aligned with supervision levels. Ongoing evaluation and refinement of EPAs are also crucial to ensure their continued relevance and applicability in evolving healthcare environments.
Data availability statement
The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.
Ethics statement
The studies involving humans were approved by the Institutional Review Board of Umm Al-Qura University, Makkah, Saudi Arabia (Approval number: HAPO-02-K-012-2024-04-2098). The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.
Author contributions
AA: Conceptualization, Data curation, Investigation, Project administration, Visualization, Writing – original draft. AT: Data curation, Investigation, Methodology, Project administration, Visualization, Writing – original draft. LA: Investigation, Project administration, Visualization, Writing – original draft. MA: Investigation, Project administration, Writing – review & editing. MS: Investigation, Visualization, Writing – review & editing. HAli: Investigation, Visualization, Writing – review & editing. HK: Investigation, Visualization, Writing – review & editing. FA: Investigation, Visualization, Writing – review & editing. AI: Investigation, Visualization, Writing – review & editing. MD: Investigation, Visualization, Writing – review & editing. YT: Investigation, Visualization, Writing – review & editing. AB: Investigation, Visualization, Writing – review & editing. RZ: Conceptualization, Formal Analysis, Validation, Visualization, Writing – review & editing. HAlm: Conceptualization, Supervision, Validation, Visualization, Writing – review & editing.
Funding
The author(s) declare that no financial support was received for the research and/or publication of this article.
Acknowledgments
The authors would like to express their appreciation to the following neurology experts for their participation in the study: Dr. Waleed Khoja, Dr. Omar Ayoub, Dr. Hanadi Abualela, Dr. Khaled Albazli, Dr. Hind Alnajash, Dr. Hisham Aldhukair, Dr. Hanaa Kedah, Dr. Salah Baz, Dr. Hessa Alotaib, Dr. Majid Bakheet, Dr. Yassir Almatrafi, Dr. Rasha Alsubaie, Dr. Amal Moleem, Dr. Shireen Qureshi, Dr. Salman Aljarallah, Dr. Seraj Makkawi, Dr. Ahmad Abuzinadah, Dr. Waleed Alzahrani, Dr. Ahmad Abulaban, Dr. Rakan Almuhanna, Dr. Abdulaziz Alghamdi, Dr. Jihad Muglan, Dr. Jameel Rasheedi, and Dr. Ali Al-Otaibi.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The author(s) declare that no Gen AI was used in the creation of this manuscript.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
1. Ten Cate, O, and Taylor, DR. The recommended description of an entrustable professional activity: AMEE guide no. 140. Med Teach. (2021) 43:1106–14. doi: 10.1080/0142159X.2020.1838465
2. Horak, H, Englander, R, Barratt, D, Kraakevik, J, Soni, M, and Tiryaki, E. Entrustable professional activities: a useful concept for neurology education. Neurology. (2018) 90:326–32. doi: 10.1212/WNL.0000000000004947
3. ten Cate, O. Entrustability of professional activities and competency-based training. Med Educ. (2005) 39:1176–7. doi: 10.1111/j.1365-2929.2005.02341.x
4. Hennus, MP, Jarrett, JB, Taylor, DR, and Ten Cate, O. Twelve tips to develop entrustable professional activities. Med Teach. (2023) 45:701–7. doi: 10.1080/0142159X.2023.2197137
5. Liu, L, Jiang, Z, Qi, X, Xie, A, Wu, H, Cheng, H, et al. An update on current EPAs in graduate medical education: a scoping review. Med Educ Online. (2021) 26:1981198. doi: 10.1080/10872981.2021.1981198
6. Valentine, N, Wignes, J, Benson, J, Clota, S, and Schuwirth, LW. Entrustable professional activities for workplace assessment of general practice trainees. Med J Aust. (2019) 210:354–9. doi: 10.5694/mja2.50130
7. Chang, A, Bowen, JL, Buranosky, RA, Frankel, RM, Ghosh, N, Rosenblum, MJ, et al. Transforming primary care training--patient-centered medical home entrustable professional activities for internal medicine residents. J Gen Intern Med. (2013) 28:801–9. doi: 10.1007/s11606-012-2193-3
8. Kerth, JL, van Treel, L, and Bosse, HM. The use of Entrustable professional activities in pediatric postgraduate medical education: a systematic review. Acad Pediatr. (2022) 22:21–8. doi: 10.1016/j.acap.2021.07.007
9. Watson, A, Leroux, T, Ogilvie-Harris, D, Nousiainen, M, Ferguson, PC, Murnahan, L, et al. Entrustable professional activities in orthopaedics. JB JS Open Access. (2021) 6:e20.00010. doi: 10.2106/JBJS.OA.20.00010
10. Montgomery, KB, Mellinger, JD, and Lindeman, B. Entrustable professional activities in surgery: a review. JAMA Surg. (2024) 159:571–7. doi: 10.1001/jamasurg.2023.8107
11. Pinilla, S, Lenouvel, E, Cantisani, A, Klöppel, S, Strik, W, Huwendiek, S, et al. Working with entrustable professional activities in clinical education in undergraduate medical education: a scoping review. BMC Med Educ. (2021) 21:172. doi: 10.1186/s12909-021-02608-9
12. de Graaf, J, Bolk, M, Dijkstra, A, van der Horst, M, Hoff, RG, and Ten Cate, O. The implementation of entrustable professional activities in postgraduate medical education in the Netherlands: rationale, process, and current status. Acad Med. (2021) 96:S29–s35. doi: 10.1097/ACM.0000000000004110
13. O'Dowd, E, Lydon, S, O'Connor, P, Madden, C, and Byrne, D. A systematic review of 7 years of research on entrustable professional activities in graduate medical education, 2011-2018. Med Educ. (2019) 53:234–49. doi: 10.1111/medu.13792
14. Weissenbacher, A, Bolz, R, Stehr, SN, and Hempel, G. Development and consensus of entrustable professional activities for final-year medical students in anaesthesiology. BMC Anesthesiol. (2022) 22:128. doi: 10.1186/s12871-022-01668-8
15. Dehghani Poudeh, M, Mohammadi, A, Mojtahedzadeh, R, and Yamani, N. Entrustability levels of general internal medicine residents. BMC Med Educ. (2021) 21:185. doi: 10.1186/s12909-021-02624-9
16. Pinilla, S, Lenouvel, E, Strik, W, Klöppel, S, Nissen, C, and Huwendiek, S. Entrustable professional activities in psychiatry: a systematic review. Acad Psychiatry. (2020) 44:37–45. doi: 10.1007/s40596-019-01142-7
17. Aldahlawi, SA, Zaini, RG, and Almoallim, HM. Entrustable professional activities (EPAs) in postgraduate periodontics programs in Saudi Arabia: a modified Delphi study. J Dent Educ. (2024) 89:1042–50. doi: 10.1002/jdd.13791
18. Alharbi, LA, Cheikh, M, Alotaibi, ME, Alkhotani, AA, Alim, HM, Almalki, F, et al. Developing and validating Entrustable professional activities (EPAs) for rheumatology fellowship training programs in Saudi Arabia: a Delphi study. Adv Med Educ Pract. (2024) 15:845–56. doi: 10.2147/AMEP.S481977
19. Alotaibi, ME, Basonbul, F, Al Dalbhi, S, Alharbi, LA, Alkhotani, AM, Alim, HM, et al. Consensus development and validation of entrustable professional activities for nephrology fellowship training in Saudi. Int J Med Educ. (2025) 16:92–9. doi: 10.5116/ijme.6819.d7ce
20. Hmoud AlSheikh, M, Zaini, RG, and Iqbal, MZ. Developing and mapping Entrustable professional activities with Saudi meds competency framework: a consensus study. Adv Med Educ Pract. (2022) 13:1367–74. doi: 10.2147/AMEP.S379184
21. Hentzen, C, Remy-Neris, O, Pradeau, C, Bensoussan, L, Boyer, FC, Daviet, JC, et al. Developing entrustable professional activities for residents in physical and rehabilitation medicine: a Delphi study. Ann Phys Rehabil Med. (2025) 68:101978. doi: 10.1016/j.rehab.2025.101978
22. Canada TRCoPaSo. (2020) Entrustable professional activities for adult neurology. Canada: The Royal College of Physicians and Surgeons of Canada.
23. Tan, N, Chan, YC, and Tan, K. Implementing neurology EPAs in Singapore using the consolidated framework for implementation research (1186). Neurology. (2020) 94:1186. doi: 10.1212/WNL.94.15_supplement.1186
24. Specialties SCfH. (2016) Saudi board neurology curriculum Saudi commission for health specialties. Available online at: https://www.scfhs.org.sa/sites/default/files/2022-01/Neurology.pdf. (Accessed February 06, 2025).
25. Ten Cate, O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. (2013) 5:157–8. doi: 10.4300/JGME-D-12-00380.1
26. Neurology ABoP. (2011) Neurology core competencies outline: American board of psychiatry & neurology. Available online at: https://www.abpn.org/wp-content/uploads/2014/12/2011_core_N_MREE.pdf. (Accessed February 14, 2025).
27. Canada TRCoPaSo. (2020) Neurology competencies. Canada: The Royal College of Physicians and Surgeons of Canada.
28. Papadopoulou, E, Pepe, G, Konitsiotis, S, Chondrogiorgi, M, Grigoriadis, N, Kimiskidis, VK, et al. The evolution of comprehensive genetic analysis in neurology: implications for precision medicine. J Neurol Sci. (2023) 447:120609. doi: 10.1016/j.jns.2023.120609
29. Sadat, R, and Emrick, L. Genetic testing and counseling and child neurology. Neurol Clin. (2021) 39:705–17. doi: 10.1016/j.ncl.2021.05.003
30. Silveira-Moriyama, L, and Paciorkowski, AR. Genetic diagnostics for neurologists. Continuum. (2018) 24:18–36. doi: 10.1212/CON.0000000000000556
31. Ten Cate, O, Chen, HC, Hoff, RG, Peters, H, Bok, H, and van der Schaaf, M. Curriculum development for the workplace using Entrustable professional activities (EPAs): AMEE guide no. 99. Med Teach. (2015) 37:983–1002. doi: 10.3109/0142159X.2015.1060308
32. ten Cate, O, and Scheele, F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. (2007) 82:542–7. doi: 10.1097/ACM.0b013e31805559c7
33. Ten Cate, O. Competency-based education, entrustable professional activities, and the power of language. J Grad Med Educ. (2013) 5:6–7. doi: 10.4300/JGME-D-12-00381.1
34. Shorey, S, Lau, TC, Lau, ST, and Ang, E. Entrustable professional activities in health care education: a scoping review. Med Educ. (2019) 53:766–77. doi: 10.1111/medu.13879
35. Bray, MJ, Bradley, EB, Martindale, JR, and Gusic, ME. Implementing systematic faculty development to support an EPA-based program of assessment: strategies, outcomes, and lessons learned. Teach Learn Med. (2021) 33:434–44. doi: 10.1080/10401334.2020.1857256
36. Belay, HD, Gebrewold, MA, Ayele, BA, Oda, DM, Kelemu, FT, Zewde, YZ, et al. Neurology training and medical education in resource-limited settings: building and growing the first neurology residency program in East Africa. Semin Neurol. (2024) 44:147–58. doi: 10.1055/s-0044-1785539
37. Kleineberg, NN, van der Meulen, M, Franke, C, Klingelhoefer, L, Sauerbier, A, Di Liberto, G, et al. Differences in neurology residency training programmes across Europe - a survey among the residents and research fellow section of the European academy of neurology national representatives. Eur J Neurol. (2020) 27:1356–63. doi: 10.1111/ene.14242
38. Nascimento, FA, Gavvala, JR, Tankisi, H, and Beniczky, S. Neurology resident EEG training in Europe. Clin Neurophysiol Pract. (2022) 7:252–9. doi: 10.1016/j.cnp.2022.08.001
39. Tamás, G, Fabbri, M, Falup-Pecurariu, C, Teodoro, T, Kurtis, MM, Aliyev, R, et al. Lack of accredited clinical training in movement disorders in Europe, Egypt, and Tunisia. J Parkinsons Dis. (2020) 10:1833–43. doi: 10.3233/JPD-202000
Keywords: entrustable professional activities, EPAs, neurology, Saudi Arabia, curriculum, residency education, professional tasks, expert consensus
Citation: Alkhotani AM, Tawakul A, Alharbi LA, Alotaibi ME, Samannodi MS, Alim HM, Khadawardi HA, Almalki F, Imam AA, Dairi MS, Turkistani YA, Bulkhi AA, Zaini R and Almoallim HM (2025) Developing and validating entrustable professional activities for neurology residency training programs in Saudi Arabia: a modified Delphi study. Front. Neurol. 16:1688620. doi: 10.3389/fneur.2025.1688620
Edited by:
Paolo Ragonese, University of Palermo, Italy
Reviewed by:
Konstantinos Dimitriadis, LMU Munich University Hospital, Germany
Shane Stone, University of California, Davis, United States
Copyright © 2025 Alkhotani, Tawakul, Alharbi, Alotaibi, Samannodi, Alim, Khadawardi, Almalki, Imam, Dairi, Turkistani, Bulkhi, Zaini and Almoallim. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Amal M. Alkhotani, amkhotani@uqu.edu.sa