- Department of Neurosurgery, Barrow Neurological Institute, St. Joseph’s Hospital and Medical Center, Phoenix, AZ, United States
Introduction: Extended reality (XR) technology may play an important role in advancing the field of skull base surgery. Its potential use in neurosurgical training, case preparation, and the operating room could make XR a powerful addition to the surgical toolbox. This study evaluated the application of XR in skull base surgery.
Methods: A systematic literature search from inception to March 28, 2024, was performed using 4 databases: PubMed, Scopus, Web of Science Advance, and Embase (Ovid). Original studies involving the use of XR in skull base surgery for surgical planning or training purposes were included. Conference abstracts, reviews, and case reports were excluded.
Results: Of 357 articles identified across all 4 databases, 250 remained after removal of duplicates. After careful evaluation of titles and abstracts for eligibility, 29 articles were deemed suitable for full-text examination. A subsequent detailed assessment excluded 8 articles, resulting in a final 21 studies that met the criteria for inclusion in the systematic review. Of the 21 studies included, 13 (62%) focused on augmented reality, 4 (19%) focused on virtual reality, and 4 (19%) focused on mixed reality. Augmented reality has demonstrated varying degrees of effectiveness, with mean registration accuracy reported between 2.5 and 10.75 mm. The mean (SD) registration error reported in mixed reality was 5.76 (0.54) mm. Virtual reality has been used for preoperative planning and intraoperative guidance, with average computation times ranging from 15 s to 2 min.
Discussion: The role of XR in skull base surgery is anticipated to grow, given its potential for streamlining surgical planning, neuronavigation, and teaching. Although the use of XR in skull base surgery shows promise, the technologies associated with these modalities require substantial improvement before XR is a stable component of the neurosurgical toolbox.
1 Introduction
In 1935, science fiction writer Stanley Weinbaum presented readers with a fascinating new premise in his short story Pygmalion's Spectacles: What is it like to explore a new world using a pair of goggles? (1) Although appearing in a fictional story, this idea planted the seed for the conception of extended reality (XR), or any form of technology that uses digital elements to alter perception of a real-world environment (2). There are 3 primary forms of XR. Augmented reality (AR) involves digital elements that are layered over a physical environment to create a composite image (3). Mixed reality (MR) is an extension of AR in which users can interact with digital elements that are overlaid on a physical environment (3). Lastly, virtual reality (VR) enables individuals to immerse themselves in a fully digital environment and interact with it (4). In recent years, XR has been introduced to a variety of industries, including entertainment, manufacturing, and healthcare, primarily due to breakthroughs in software and graphic design, as well as significant hardware improvements (5–7). Given the current applications and unique potential of XR in these industries and sectors, it is unsurprising that XR has also begun to impact the field of skull base surgery.
Skull base surgery is a highly specialized field of neurosurgery and otolaryngology (8). Due to the presence of vital and eloquent structures in the head, face, and neck, the skull base has traditionally been one of the most difficult areas to access surgically (9, 10). Skull base approaches are divided into 2 categories: open and endoscopic (11). Traditional open skull base approaches involve a more extensive craniotomy, whereas endoscopic surgery offers alternative approaches that are typically associated with lower morbidity and complication rates (12–14).
Because of their complexity, skull base procedures require an immense degree of precision and technical expertise. Any advancements in XR technologies that can provide surgeons with the most up-to-date training and guidance are welcomed. The use of XR in skull base surgery can be segmented into 2 roles: training, and perioperative and intraoperative aid. This study reviewed the use of XR technology in each of these roles and evaluated the future implications of XR in skull base surgery.
2 Methods
2.1 Search strategy
The PubMed, Scopus, Web of Science Advance, and Embase (Ovid) databases were retrospectively queried to identify all relevant studies from inception to March 28, 2024, using a search string with the following keywords: “extended reality,” “augmented reality,” “virtual reality,” “mixed reality,” and “skull base surgery” (Table 1). Individuals of all ages were considered for inclusion. Studies that were either unrelated to XR or did not include patients undergoing skull base procedures were excluded. Animal studies, reviews, and nonoriginal research articles were also excluded from analysis.
2.2 Screening of studies
The initial title and abstract screening was led by 2 authors (V.S. and E.N.). Any discrepancies were addressed through consultation with a third co-author (A.B.) and further discussion with all authors. The screening of studies adhered to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines (Figure 1).
Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses flow diagram summarizing the article screening process. Used with permission from Barrow Neurological Institute, Phoenix, Arizona.
2.3 Data extraction
Three independent authors (V.S., E.N., and A.B.) extracted relevant data from the selected studies. The data collected included study design, participant demographics, and the number of participants with respective outcomes and complications. Discrepancies in data extraction were resolved through consensus, and any unresolved disagreements were addressed through consultation with a fourth reviewer (A.T.E.). The results are summarized and discussed on the basis of the salient features extracted from the studies in the AR, MR, and VR categories. We included ventriculostomy and external ventricular drain simulations under “skull base surgery” due to their anatomical proximity to the skull base and their relevance in training to manage complications associated with skull base interventions. However, due to the limited data and insufficient published research, we were unable to statistically compare the performance outcomes of the various XR technologies in skull base neurosurgery.
3 Results
3.1 Study selection
A total of 357 articles underwent initial screening, after which 107 duplicate articles were identified and subsequently removed. Careful evaluation of titles and abstracts yielded 29 articles that were deemed suitable for full-text examination. A detailed assessment excluded 8 articles; thus, 21 studies met the inclusion criteria and were incorporated into our study (Tables 2–4) (15–35).
3.2 Study characteristics
Of the 21 studies included, 13 focused on AR, and 4 each focused on VR and MR. The detailed characteristics of each study are presented in Tables 2–4 (15–35).
3.3 AR in skull base surgery
Blending AR with real-world intraoperative imaging and navigation systems allows visualization of critical neuroanatomical structures in real time and can aid in the resection of skull base pathologies (36). For example, in a report from 2018, Umebayashi et al. described employing AR-enhanced navigation maps to guide electric drills during procedures (37). Similarly, in 2020, Jean demonstrated the use of 3-dimensional (3D) VR renderings for preoperative rehearsal and intraoperative guidance in a minipterional craniotomy approach, highlighting AR's capacity to improve surgical accuracy and outcomes by displaying the anticipated surgical opening as an overlay over the patient in real time (38).
In 2023, Goto et al. evaluated a novel AR navigation system for endoscopic transsphenoidal surgery, and its accuracy was measured in 15 consecutive patients using a 5-point scale, on which a 4 represented “as useful as conventional neuronavigation,” and a 5 represented “more useful than conventional neuronavigation.” In their study, the AR navigation received a mean score of 4.7 (95% CI: 4.58–4.82), indicating that AR navigation was more useful than conventional navigation in most cases. However, depth perception of the lesion was more difficult with AR navigation in 2 cases (a skull base chondrosarcoma and craniopharyngioma) (18). In 2023, El Chemaly et al. assessed stereoscopic calibration for AR in microsurgery, in which the intrinsic (e.g., principal point, focal length, lens distortion) and extrinsic (location and orientation) parameters of the surgical stereo microscope were calculated using a calibration board that was 3D printed and electromagnetically tracked. The electromagnetic tracker coordinate system and the stereo microscope image space were transformed using these parameters such that any tracked 3D point could be projected onto the left and right visuals of the microscope video stream. Calibration was done once the system was set up, and an average calibration workflow time of 55.5 s and an average computational time of 10.5 s were reported, both being critical components of operational efficiency (16).
The microscope can be efficiently precalibrated at multiple focal lengths, and the corresponding intrinsic parameters can be obtained. Zooming or changing the focal length of the microscope necessitates recalibration; thus, algorithms that provide automatic camera parameter calibration from a single image taken in any position increase efficiency. In clinical settings, this alternative technique allows for rapid recalibration if the focal length is changed, decreasing the total time required to adjust the surgical equipment.
In 2023, Ansari et al. introduced the VentroAR platform for ventriculostomy, reporting a mean (SD) targeting accuracy of 10.64 (5.0) mm, measured as the distance from the point reached by the user to the target (i.e., the total error) (15). By comparison, the mean (SD) targeting accuracy of the freehand group after training was 16.7 (5.4) mm in a 2021 study by Van Gestel et al. (39). In a 2022 study by Steiert et al., AR-assisted cranioplasty was employed for single-step resection and cranioplasty, although detailed quantitative results were sparse (21). In a 2022 study, Pojskic et al. combined AR with intraoperative computed tomography navigation for meningioma resections, showing promising outcomes in terms of neuronavigation accuracy and resection success (20). Gross total resection was achieved in 26 of 39 (66.7%) patients, a rate the authors described as consistent with the published data for similar skull base meningiomas. The accuracy of AR systems is often quantified using target registration error (TRE), which measures the alignment between projected and actual anatomical structures. Automatic registration using intraoperative computed tomography resulted in high accuracy in registering a target point to a model, with a mean (SD) TRE of 0.82 (0.37) mm, substantially lower than that of landmark-based neuronavigation, which had a mean (SD) TRE of 1.85 (1.02) mm, as shown in the study by Bopp et al. (40). Additional AR studies described various applications and outcomes, including a study on retrosigmoid craniotomy planning and AR integration with endoscopic views, underscoring AR's diverse applicability in enhancing surgical precision and workflow (19, 22).
AR technologies have demonstrated varying degrees of accuracy depending on the specific applications, with studies reporting TRE as low as 0.82 mm with the use of intraoperative computed tomography. Reported calibration times in AR studies range from 0.11 to 0.94 s. Although to date few studies have reported improved outcomes associated with AR-assisted surgery, it is important to note that surgical outcomes are not consistently reported across all studies. Among the studies in which outcomes were detailed, there were 60 gross total resections (18, 20, 21), 3 near-total resections (18), 15 subtotal resections (18, 20, 21), 10 partial resections (20, 21), and 5 biopsies [including 2 open biopsies (20, 21) and 3 intraoperative biopsies (21)]. Although specific complication rates are reported far less frequently, the general efficacy of AR in improving surgical navigation and outcomes is well-documented. For example, AR-assisted cranioplasty and endoscopic transsphenoidal surgery show promising results in enhancing accuracy and reducing procedural errors (18).
However, this advancement in AR-guided neuronavigation comes with an added financial burden to the healthcare facility. A study by Davis et al. showed that the total cost of setting up Virtual Interactive Presence in Augmented Reality for a year would be $14,930.39 (41). The Vuzix smart spectacles and the Proximie system retail for $17,000 and $1,300–$2,500 annually, respectively (42). AR software licenses cost $7,000, and hardware costs $2,200, based on the study by Vyas et al. (43). These figures vary by region, reflecting local availability and supply of goods and services.
3.4 Role of AR as an educational tool in skull base surgery
3.4.1 Ventriculostomy
Several studies have investigated the utility of AR in the training of ventriculostomy (39, 44, 45). In one study, 3D-reconstructed ventricles were superimposed on a phantom head using a smartphone display, whereas in 2 other experiments, a head-mounted display was used. One study directly contrasted AR training with conventional training techniques (39). The AR group's external ventricular drain placement was noticeably more accurate [mean (SD) TRE for the untrained group: 11.9 (4.5) mm] than that of the freehand group [19.9 (4.2) mm, p = 0.003] (39). Additionally, those who underwent AR training had considerably better insertion performances (59.4% modified Kakarla scale grade 1) than those who received freehand training (25.0% modified Kakarla scale grade 1, p = 0.005) (39). In a different study, Schneider et al. detailed a comparable AR-based external ventricular drain placement training simulator (44). Three-dimensional ventricles were registered using a head-mounted display and QR codes affixed to a model skull. Although freehand insertion was not used as a control group in this investigation, the authors observed a mean (SD) TRE of 2.71 (1.18) mm and a ventricle hit rate of 68.2% when using AR (44).
3.4.2 Neuroanatomy teaching
A small number of studies examined how well AR interventions worked in comparison with more conventional approaches for teaching neuroanatomy, but the majority were not designed with skull base surgical training in mind (46–50). Written tests were used to evaluate knowledge outcomes in 4 of the 5 studies. Only 1 study found that the AR intervention increased understanding of neuroanatomy (46). One study revealed that traditional 2-dimensional learning approaches yielded more learning gain than AR-based methods, whereas 2 studies found no effect of AR training on examination scores (47, 49). Post-intervention surveys and satisfaction ratings showed that participants preferred learning neuroanatomy with AR-based approaches, despite the equivocal effect of AR on neuroanatomy examination performance.
3.5 MR in skull base surgery
A 2024 study by Marrone et al. compared standard magnetic neuronavigation (Medtronic Stealth Station S8) with MR neuronavigation under different lighting conditions (30). Although detailed quantitative results were not provided, MR navigation was noted to enhance visualization and potentially improve surgical accuracy. In another study, Eom et al. investigated MR-guided external ventricular drain placement, in which user trials consisted of 1 blind and 1 MR-assisted drilling of a burr hole followed by a routine, unguided external ventricular drain catheter placement for each of 2 different drill bit sizes. The mean (SD) distance from the catheter target improved from 18.6 (12.5) mm to 12.7 (11.3) mm (p < 0.001) using MR guidance for trials with a large drill bit and from 19.3 (12.7) mm to 10.1 (8.4) mm with a small drill bit (p < 0.001) (25). This study found that real-time quantitative and visual feedback of an MR-guided burr hole procedure can independently improve procedural accuracy. Zeiger et al. reported the utility of MR in various endoscopic skull base procedures, including pituitary tumors and anterior skull base meningiomas, in a cohort of 134 patients (35). Surgeons in this series reported that the technology was particularly useful in less commonly used surgical approaches, such as transorbital surgery. The series reported no intraoperative complications, although some patients did report postoperative complications following surgery of the anterior skull base (35). A 2018 study by McJunkin et al. focused on the use of MR for lateral skull base anatomy, emphasizing the potential of MR to enhance anatomical understanding and precision in temporal bone surgery (31).
The mean (SD) registration error reported in the included MR studies was 5.76 (0.54) mm (35). Registration error is usually measured as the average distance between the virtual object placed in the real world and its intended position, often using specific markers or reference points. In a series of 118 tumor surgeries using MR, reported outcomes included a postoperative death due to respiratory failure (1.5%), hospital readmission (6.7%), and other issues such as cerebrospinal fluid leaks and visual defects, none necessarily attributable to the use of MR itself. In these cases, resection outcomes varied, with gross total resection achieved in approximately 56.3% of cases and subtotal resection achieved in 43.7% of cases (35). Despite some variability in procedural details and accuracy metrics, MR demonstrates promising potential for enhancing surgical precision and improving patient outcomes.
3.6 Role of MR as an educational tool in skull base surgery
3.6.1 Brain tumor surgery
The viability of using an MR device for neurosurgery education was investigated by Jain et al. using 3 case scenarios, including that of a sphenoid wing meningioma (51). The study, which recruited 8 neurosurgical trainees, also assessed the trainees' experiences with the MR platform. For most trainees, the learning curve was shallow even though they had never used an MR platform before. The trainees were split on whether MR could replace the conventional approaches currently used in teaching neuroanatomy. On the user experience questionnaire, the trainees rated the device as appealing, dependable, innovative, and easy to use.
3.6.2 Neurosurgery teaching
In a quasi-experimental study of 223 medical students (120 in the conventional group and 103 in the MR group), Silvero Isidre et al. sought to determine whether an MR-guided neurosurgical simulation module in the context of an undergraduate neurosurgical hands-on course could increase medical student satisfaction (52). The mean (SD) satisfaction scores for the conventional group [89.3 (13.3)] and the MR group [94.2 (7.5)] indicated that MR simulation was linked to higher levels of satisfaction.
3.7 VR in skull base surgery
VR technology has been used for preoperative planning and intraoperative guidance in skull base surgery. A study by Filimonov et al. explored VR applications for endoscopic transnasal approaches to the craniovertebral junction, noting that VR systems provided valuable preoperative insights, including safe surgical planning and visualization of critical structures, despite the small sample size (n = 5) (26). A 2020 study by Liu and Yi used VR to identify a new bony landmark (point O, the junction of the temporosphenoid suture and the infratemporal ridge) for lateral skull base surgery, focusing on anatomical precision in cadaveric studies; the distances measured on VR images agreed with the actual distances obtained by cadaveric dissection. The mean (SD) distances on dissection vs. on the VR 3D stereoscopic images were 22.52 (2.47) vs. 22.42 (2.85) mm for point O to the foramen rotundum (p = 0.81), 22.62 (2.60) vs. 23.51 (2.09) mm for point O to the foramen spinosum (p = 0.67), 23.69 (2.34) vs. 22.77 (2.90) mm for point O to the foramen ovale (p = 0.70), and 24.42 (2.38) vs. 24.19 (2.65) mm for point O to the superior orbital fissure (p = 0.59) (29). The early feasibility study by Won et al. highlighted VR's potential in rehearsing complex endoscopic skull base procedures, with a reported simulation time between 1 and 2 h depending on the number and complexity of the segmented structures, while allowing the surgeon to begin a virtual dissection of the sinuses and ventral skull base (34). In a 2006 study, Rosahl et al. demonstrated the application of VR in enhancing the surgical field through image guidance, although detailed statistics on the impact of VR on surgical outcomes were not extensively documented (33).
The average computation times for VR simulations ranged from 15 s to 2 min, reflecting variability based on the surgeon's experience in this emerging modality. Complications reported in these studies were not due to the VR systems themselves but included intraoperative cerebrospinal fluid leaks and prolonged intubation, although detailed rates were not uniformly provided. Although VR studies do not consistently report specific accuracy metrics, the role of VR in enhancing surgical planning and visualization is evident.
3.8 Role of VR as an educational tool in skull base surgery
In training environments, high-fidelity VR systems enable surgeons to practice procedures on virtual patients, providing scalable and cost-effective alternatives to traditional cadaveric models (53). These fully digitized surgeries also introduce new methods for assessing surgical skills by capturing performance metrics that are challenging to measure in real-world scenarios (54).
3.8.1 Neuroanatomy education
A 2014 study by Arora et al. examined the viability of performing case-specific surgical rehearsal using a VR temporal bone simulator (55). Participants performed 3 dissection tasks on the case simulation and cadaver models after completing a 90-min temporal bone dissection on the general simulation model. The usefulness of VR as an educational tool was demonstrated by the high ratings (Likert score >4) given to case rehearsal for confidence (75%), training (94%), and planning facilitation (75%).
Another study of 16 otolaryngology and head-and-neck surgical residents reported a significant increase in overall confidence in 87.5% of the participants after an anatomy-specific VR rehearsal (56). Similar studies have demonstrated the role of VR simulation as an effective training aid for temporal bone surgery (57–60). In a pilot study, Munawar et al. reported on the use of the Fully Immersive Virtual Reality System for skull-base surgery, which combines high-fidelity hardware with surgical simulation software, to perform virtual cortical mastoidectomy (54). The 7 participants included 3 attending surgeons, 3 residents, and a medical student. The preliminary data collected by this system distinguished between participants with varying levels of ability, showing great promise for automatic skill assessment.
3.8.2 Skull base tumor surgery
In a study by Shao et al., 30 undergraduate students were randomly divided into a VR teaching group and a traditional teaching group for a set of 10 cases of skull base tumors (61). The VR teaching group had a better response effect, a knowledge assessment used to compare the 2 groups, than the traditional teaching group. The VR group outperformed the traditional instruction group in terms of basic theory, location, adjacent structure, clinical manifestation, diagnosis and analysis, surgical procedures, and overall scores.
3.9 XR in skull base surgery
The comparative analysis of XR, based on the limited literature available, found that the pooled mean registration error was 5.76 mm for MR, 6.0 mm for VR, and 6.5 mm for AR (31). Integrating MR, VR, and AR technologies into skull base surgery aims to significantly enhance surgical precision and planning, with no evidence suggestive of increased complications. Overall, MR technology was found to offer improved registration accuracy and resection outcomes, VR technology was found to enhance preoperative planning and simulation, and AR technology was found to provide real-time neuronavigation improvements. The role of XR as an educational tool in skull base surgery is summarized in Table 5 (39, 44, 46–52, 54–56, 61).
4 Discussion
This systematic review of 21 articles aims to provide an overview of the current research on XR utilization in skull base surgery. To facilitate improved digital representations for training and surgical planning, neurosurgery and otolaryngology practices have begun employing AR, MR, and VR technologies. These XR technologies have been safely used to explore the operative field from various angles and to visualize neuroanatomy that is difficult to appreciate in the surgical field. This ability has improved the overall sensory experience, particularly when using keyhole approaches to deeply situated targets (36, 37). AR not only facilitates accurate planning but also improves depth perception and reduces risks by superimposing segmented anatomical structures onto the operative field (29, 53, 62–70).
Given the dependence of AR on tracking strategies, imaging modalities, and the availability of anatomical registrations (16, 27, 71–73), Citardi et al. (74) recommended that next-generation AR systems aim for a TRE of 0.6–1.5 mm for optimal performance. Birlo et al. (75) reviewed the use of optical see-through head-mounted displays in AR surgery and showed that they are mostly used in orthopedic surgery (28.6%), with a primary role in surgical guidance. The usefulness of optical see-through head-mounted displays was substantially affected by human factors, and clinical trials have revealed that the benefits of these devices remain insufficient for them to become well established in operating rooms. Their clinical utility would be strengthened by a concentrated effort to resolve technical registration and perceptual variables in the laboratory, as well as by designs that integrate human-factor considerations to address clinical concerns. Despite challenges such as device bulkiness, limited battery life, and issues with legal implementation (76–78), AR continues to evolve, offering transformative potential in surgical simulations, training, and patient-specific interventions (38). These advancements highlight AR's promise in enhancing neurosurgical outcomes while emphasizing the need for refined technologies and streamlined integration.
MR is emerging as a transformative tool in skull base surgery and medical education. MR headsets combine spatial mapping, a hands-free interface, and MR vision to enable image-guided navigation; they use stereoscopy to project 3D models onto a transparent lens, allowing users to see holographic overlays without losing focus on their physical surroundings. This capability has been effectively utilized in surgical simulations, significantly enhancing neurosurgery and otolaryngology resident training (79–82). For example, a 2014 study by Hooten et al. demonstrated the educational value of MR through a ventriculostomy simulation in which virtual catheters were projected onto virtual ventricular models, offering an immersive learning experience (83). MR platforms are also revolutionizing intraoperative guidance, as Zeiger et al. showed in a study of endoscopic endonasal skull base surgeries using patient-specific 3D reconstructions (35), in which gross total resection was achieved in 56.3% of the cases. Furthermore, emerging technologies like Quicktome and Infinitome (Omniscient Neurotechnology) integrate Human Connectome Project data with MR for advanced surgical planning and training (84). These platforms, coupled with innovations such as Visualase (Medtronic), robotic auto-guide therapies, and stereoelectroencephalography-based implantation, are advancing surgical precision and reducing morbidity (85). In a recent study by Bronowicki et al., who used the CarnaLife Holo system (MedApp S.A., Poland), the effects of MR on the lengths of the surgical procedure and hospitalization were evaluated in a surgically treated pediatric oncology cohort. The study compared the results of procedures with and without MR and found that hospitalization periods stayed within normal ranges and that surgical procedure length in the MR group was comparable to that for patients treated with traditional techniques. In a small number of cases performed with MR, a slight increase in procedure time was noted; however, this increase did not considerably lengthen the surgical procedure or hospitalization duration (86). With continuous improvements in computing power and machine learning, MR technologies are poised to further enhance skull base surgical care, improving both clinical outcomes and the educational landscape.
VR has emerged as a transformative tool in skull base surgery, offering detailed 3D models that enhance surgical planning, training, and intraoperative guidance. Introduced to the surgical discipline by Satava in 1993 (87), VR provides stereoscopic reconstructions of the surgical field, allowing surgeons to visualize complex anatomical relationships with depth and clarity (87–89). Techniques such as 3D multifusion volumetric imaging, which integrates digital subtraction angiography, magnetic resonance imaging, and computed tomography, have demonstrated utility in preoperative planning for skull base tumors, improving approach selection and anticipation of intraoperative challenges (90). Recent studies, such as the 2021 study by Zawy Alsofy et al. (91), highlight VR's ability to optimize tumor-related anatomical visualization, refine surgical angles, and enhance safe and effective tumor resections. Additionally, Rosahl et al. combined VR with real-time infrared guidance in their virtual operating field to improve intraoperative anatomical navigation during skull base surgeries (33).
VR has also proven to be a valuable educational tool, enhancing surgical training through realistic anatomical simulations (92–97). Further studies, such as those using Stanford University's CardinalSim software for middle cranial skull base approaches, have demonstrated significant improvements in surgical proficiency, including reduced critical errors and faster procedure times (98). Innovations in photogrammetry have further advanced VR's educational applications, producing high-fidelity, photorealistic 3D models for neuroanatomical training (99, 100). For example, Corvino et al. created interactive models of the sellar region, enabling self-guided exploration and fostering a deeper understanding of the complex anatomy in the region (101). As these technologies evolve, VR is expected to become a cornerstone of skull base surgical education and practice, complementing cadaveric dissections and live surgeries while advancing precision and safety in patient care.
Surgeons could also use these systems to generate 3D images of anatomical structures in preoperative and intraoperative settings to provide crucial procedural guidance (102). Research has shown that these XR-based tools can enhance surgical outcomes and improve patient safety (102). Although it is evident that XR can play a critical role in skull base neurosurgery, it is important to fully understand the multiple ways in which it can be used. By identifying the strengths and gaps in this technology, engineers and clinicians can find novel ways to innovate upon these XR systems with the ultimate goal of improving patient outcomes and quality of life.
Studies of the cost-effectiveness of XR technologies in low- and middle-income countries (LMICs) have shown both positive and negative results. Stereoscopic XR proved helpful for new residents in obstetric emergency training in a study by Bailey et al. (103); however, that study also noted that the use of XR in LMICs is constrained by its reliance on technology infrastructure. Although 83.8% of Pakistani healthcare professionals supported the educational benefits of XR, 70% of them cited technology infrastructure as a significant drawback, according to research by Khan et al. (104). According to a study by Li et al., medical education in LMICs confronts 2 primary challenges: a lack of resources and technology constraints (105). According to Mondal (106), India has conducted very little XR research (1.7% AR and 2.2% VR studies), and the investigators suggest that more research be done. Developing cost-effective, open-source solutions, along with policies promoting digital infrastructure and affordable tools, can expand access, especially in LMICs (107, 108). Another potential strategy is to implement prototyping in these settings. XR prototyping can assist designers in producing low-cost iterative designs before manufacturers and investors decide to engage in research and development and manufacturing. XR prototyping is also more user-friendly than computer-based 3D modeling and displays 3D structures better than manual 2-dimensional drawings (109).
VR is often associated with a steep learning curve, which limits a surgeon's ability to apply preoperative knowledge in real-world patient settings (110). Long adaptation and training periods might be an additional hindrance to the smooth adoption of this technology into clinical practice. Overall, XR's incorporation into surgical planning and intraoperative guidance represents a paradigm shift in the way surgeons approach clinical practice. Traditional surgical training is effective, but it frequently follows a set format in which trainees move through predetermined learning phases. Artificial intelligence-enhanced XR simulations could replace this rigidity with a more flexible, individualized approach, enabling surgeons to train at their own pace and concentrate on their weakest areas. Such customized training may greatly speed the development of skills and proficiency, allowing surgeons to become competent more quickly and confidently (111).
As XR technologies develop and become more integrated into healthcare procedures, it will be important to recognize and address the possible long-term clinical effects, both positive and negative. It is important to carefully plan and execute XR solutions to optimize advantages, reduce risks, and guarantee equitable access and ethical use. Standardized outcome reporting and continuous monitoring are necessary for future studies in order to fully understand and mitigate the possible long-term impacts of XR on clinicians' health and well-being. Future developments should focus on cost-effective, user-friendly designs, leveraging artificial intelligence and photogrammetry to optimize 3D modeling and XR systems. Collaborative efforts and prospective research are crucial for refining these technologies, integrating them seamlessly into surgical workflows, and improving surgical outcomes while ensuring patient safety. Table 6 provides a brief summary of the role of XR in skull base surgery.
4.1 Limitations
Despite their transformative potential, XR applications in skull base surgery face several limitations. “Crowding of objects” in digital overlays can overwhelm surgeons, necessitating interfaces that prioritize critical information to reduce cognitive overload (112). Additionally, head-mounted devices remain cumbersome, causing physical fatigue during extended procedures, which could potentially reduce accuracy and negatively affect outcomes; lighter ergonomic designs and alternative display technologies, such as integrated operating room systems, are needed (113, 114). Oculomotor abnormalities, nausea, disorientation, and discomfort were among the common negative effects of VR reported in a recent systematic study by Cossio et al. (115). Furthermore, current magnetic resonance imaging resolution limits accurate anatomical segmentation, although advancements in artificial intelligence and high-resolution imaging could address these challenges (116). Ensuring precise overlays demands extensive hardware-software integration, with ongoing efforts required to enhance system reliability, minimize latency, and improve calibration (117, 118). Data security, data management, and the potential for attention bias further underscore the importance of tailored interfaces and robust encryption methods (110). The cost associated with establishing these technologies also poses a significant challenge, especially for hospitals in LMICs.
5 Conclusion
The integration of XR technologies into skull base surgery has the potential to change surgical planning, training, teaching, and intraoperative guidance. These technologies enhance visualization of complex anatomical structures, improve surgical precision, and provide immersive educational tools that complement traditional training methods. Despite limitations such as poor device ergonomics, resolution constraints, and the need for robust system integration, ongoing advancements in artificial intelligence, photogrammetry, and imaging modalities promise to address these challenges. By fostering interdisciplinary collaboration, developing user-friendly interfaces, and ensuring rigorous evaluation of clinical outcomes, XR systems can be seamlessly integrated into skull base surgical workflows.
Data availability statement
The original contributions presented in the study are included in the article, and further inquiries can be directed to the corresponding author.
Author contributions
VS: Conceptualization, Data curation, Writing – original draft, Writing – review & editing. EN: Conceptualization, Data curation, Writing – original draft. AB: Conceptualization, Formal analysis, Writing – original draft. AE: Conceptualization, Data curation, Writing – original draft, Writing – review & editing. SR: Formal analysis, Writing – original draft. KJ: Writing – original draft. MM: Writing – original draft, Writing – review & editing. JC: Writing – review & editing. AH: Supervision, Writing – review & editing. RR: Supervision, Writing – review & editing. ML: Supervision, Writing – review & editing.
Funding
The author(s) declare that no financial support was received for the research and/or publication of this article.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
The reviewer SH declared a past collaboration with the authors RR and ML to the handling editor at the time of review.
Generative AI statement
The author(s) declare that no Generative AI was used in the creation of this manuscript.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Abbreviations
3D, 3-dimensional; AR, augmented reality; MR, mixed reality; TRE, target registration error; VR, virtual reality; XR, extended reality.
References
1. Chen SY, Lai YH, Lin YS. Research on head-mounted virtual reality and computational thinking experiments to improve the learning effect of aiot maker course: case of earthquake relief scenes. Front Psychol. (2020) 11:1164. doi: 10.3389/fpsyg.2020.01164
2. Logeswaran A, Munsch C, Chong YJ, Ralph N, McCrossnan J. The role of extended reality technology in healthcare education: towards a learner-centred approach. Future Healthc J. (2021) 8(1):e79–84. doi: 10.7861/fhj.2020-0112
3. Park BJ, Hunt SJ, Martin C III, Nadolski GJ, Wood BJ, Gade TP. Augmented and mixed reality: technologies for enhancing the future of IR. J Vasc Interv Radiol. (2020) 31(7):1074–82. doi: 10.1016/j.jvir.2019.09.020
4. Kyaw BM, Saxena N, Posadzki P, Vseteckova J, Nikolaou CK, George PP, et al. Virtual reality for health professions education: systematic review and meta-analysis by the digital health education collaboration. J Med Internet Res. (2019) 21(1):e12959. doi: 10.2196/12959
5. Cipresso P, Giglioli IAC, Raya MA, Riva G. The past, present, and future of virtual and augmented reality research: a network and cluster analysis of the literature. Front Psychol. (2018) 9:2086. doi: 10.3389/fpsyg.2018.02086
6. Garrett B, Taverner T, Gromala D, Tao G, Cordingley E, Sun C. Virtual reality clinical research: promises and challenges. JMIR Serious Games. (2018) 6(4):e10839. doi: 10.2196/10839
7. Willaert WI, Aggarwal R, Van Herzeele I, Cheshire NJ, Vermassen FE. Recent advancements in medical simulation: patient-specific virtual reality simulation. World J Surg. (2012) 36(7):1703–12. doi: 10.1007/s00268-012-1489-0
8. Rangel-Castilla L, Russin JJ, Spetzler RF. Surgical management of skull base tumors. Rep Pract Oncol Radiother. (2016) 21(4):325–35. doi: 10.1016/j.rpor.2014.09.002
9. Franz L, Zanoletti E, Nicolai P, Ferrari M. Treatment of skull base diseases: a multidisciplinary challenge. J Clin Med. (2023) 12(4):1492. doi: 10.3390/jcm12041492
10. Tzelnick S, Rampinelli V, Sahovaler A, Franz L, Chan HHL, Daly MJ, et al. Skull-Base surgery-a narrative review on current approaches and future developments in surgical navigation. J Clin Med. (2023) 12(7):2706. doi: 10.3390/jcm12072706
11. Soffer JM, Ulloa R, Chen S, Ziltzer RS, Patel VA, Polster SP. A comparison of endoscopic endonasal versus open approaches for skull base chordoma: a comprehensive national cancer database analysis. Neurosurg Focus. (2024) 56(5):E5. doi: 10.3171/2024.2.FOCUS23933
12. Alshammari DM, Almomen A, Taha M, Albahrna H, Alshammari S. Quality of life and morbidity after endoscopic endonasal skull base surgeries using the Sinonasal Outcomes Test (SNOT): a tertiary hospital experience. Int J Otolaryngol. (2021) 2021:6659221. doi: 10.1155/2021/6659221
13. De Jesus O. Complications after open skull base surgery for brain tumors: a 26-year experience. Cureus. (2023) 15(12):e50312. doi: 10.7759/cureus.50312
14. Lee SC, Senior BA. Endoscopic skull base surgery. Clin Exp Otorhinolaryngol. (2008) 1(2):53–62. doi: 10.3342/ceo.2008.1.2.53
15. Bagher Zadeh Ansari N, Léger É, Kersten-Oertel M. VentroAR: an augmented reality platform for ventriculostomy using the Microsoft Hololens. Comput Methods Biomech Biomed Eng Imaging Vis. (2023) 11(4):1225–33. doi: 10.1080/21681163.2022.2156394
16. El Chemaly T, Athayde Neves C, Leuze C, Hargreaves B, Blevins NH. Stereoscopic calibration for augmented reality visualization in microscopic surgery. Int J Comput Assist Radiol Surg. (2023) 18(11):2033–41. doi: 10.1007/s11548-023-02980-5
17. Gibby W, Cvetko S, Gibby A, Gibby C, Sorensen K, Andrews EG, et al. The application of augmented reality-based navigation for accurate target acquisition of deep brain sites: advances in neurosurgical guidance. J Neurosurg. (2022) 137(2):489–95. doi: 10.3171/2021.9.JNS21510
18. Goto Y, Kawaguchi A, Inoue Y, Nakamura Y, Oyama Y, Tomioka A, et al. Efficacy of a novel augmented reality navigation system using 3D computer graphic modeling in endoscopic transsphenoidal surgery for sellar and parasellar tumors. Cancers (Basel). (2023) 15(7):2148. doi: 10.3390/cancers15072148
19. Leuze C, Neves CA, Gomez AM, Navab N, Blevins N, Vaisbuch Y, et al. Augmented reality for retrosigmoid craniotomy planning. J Neurol Surg B Skull Base. (2022) 83(Suppl 2):e564–e73. doi: 10.1055/s-0041-1735509
20. Pojskic M, Bopp MHA, Sabeta B, Carl B, Nimsky C. Microscope-based augmented reality with intraoperative computed tomography-based navigation for resection of skull base meningiomas in consecutive series of 39 patients. Cancers (Basel). (2022) 14(9):2302. doi: 10.3390/cancers14092302
21. Steiert C, Behringer SP, Kraus LM, Bissolo M, Demerath T, Beck J, et al. Augmented reality-assisted craniofacial reconstruction in skull base lesions—an innovative technique for single-step resection and cranioplasty in neurosurgery. Neurosurg Rev. (2022) 45(4):2745–55. doi: 10.1007/s10143-022-01784-6
22. Lai M, Skyrman S, Shan C, Babic D, Homan R, Edstrom E, et al. Fusion of augmented reality imaging with the endoscopic view for endonasal skull base surgery; a novel application for surgical navigation based on intraoperative cone beam computed tomography and optical tracking. PLoS One. (2020) 15(1):e0227312. doi: 10.1371/journal.pone.0227312
23. Bong JH, Song HJ, Oh Y, Park N, Kim H, Park S. Endoscopic navigation system with extended field of view using augmented reality technology. Int J Med Robot. (2018) 14(2):e1886. doi: 10.1002/rcs.1886
24. Creighton FX, Unberath M, Song T, Zhao Z, Armand M, Carey J. Early feasibility studies of augmented reality navigation for lateral skull base surgery. Otol Neurotol. (2020) 41(7):883–8. doi: 10.1097/MAO.0000000000002724
25. Eom S, Ma TS, Vutakuri N, Hu T, Haskell-Mendoza AP, Sykes DAW, et al. Accuracy of routine external ventricular drain placement following a mixed reality-guided twist-drill craniostomy. Neurosurg Focus. (2024) 56(1):E11. doi: 10.3171/2023.10.FOCUS23615
26. Filimonov A, Zeiger J, Goldrich D, Nayak R, Govindaraj S, Bederson J, et al. Virtual reality surgical planning for endoscopic endonasal approaches to the craniovertebral junction. Am J Otolaryngol. (2022) 43(1):103219. doi: 10.1016/j.amjoto.2021.103219
27. Freysinger W, Gunkel AR, Thumfart WF. Image-guided endoscopic ENT surgery. Eur Arch Otorhinolaryngol. (1997) 254(7):343–6. doi: 10.1007/BF02630726
28. Li L, Yang J, Chu Y, Wu W, Xue J, Liang P, et al. A novel augmented reality navigation system for endoscopic sinus and skull base surgery: a feasibility study. PLoS One. (2016) 11(1):e0146996. doi: 10.1371/journal.pone.0146996
29. Liu Z, Yi Z. A new bony anatomical landmark for lateral skull base surgery. J Craniofac Surg. (2020) 31(4):1157–60. doi: 10.1097/SCS.0000000000006340
30. Marrone S, Scalia G, Strigari L, Ranganathan S, Travali M, Maugeri R, et al. Improving mixed-reality neuronavigation with blue-green light: a comparative multimodal laboratory study. Neurosurg Focus. (2024) 56(1):E7. doi: 10.3171/2023.10.FOCUS23598
31. McJunkin JL, Jiramongkolchai P, Chung W, Southworth M, Durakovic N, Buchman CA, et al. Development of a mixed reality platform for lateral skull base anatomy. Otol Neurotol. (2018) 39(10):e1137–e42. doi: 10.1097/MAO.0000000000001995
32. Paul P, Fleig O, Jannin P. Augmented virtuality based on stereoscopic reconstruction in multimodal image-guided neurosurgery: methods and performance evaluation. IEEE Trans Med Imaging. (2005) 24(11):1500–11. doi: 10.1109/TMI.2005.857029
33. Rosahl SK, Gharabaghi A, Hubbe U, Shahidi R, Samii M. Virtual reality augmentation in skull base surgery. Skull Base. (2006) 16(2):59–66. doi: 10.1055/s-2006-931620
34. Won TB, Hwang P, Lim JH, Cho SW, Paek SH, Losorelli S, et al. Early experience with a patient-specific virtual surgical simulation for rehearsal of endoscopic skull-base surgery. Int Forum Allergy Rhinol. (2018) 8(1):54–63. doi: 10.1002/alr.22037
35. Zeiger J, Costa A, Bederson J, Shrivastava RK, Iloreta AMC. Use of mixed reality visualization in endoscopic endonasal skull base surgery. Oper Neurosurg. (2020) 19(1):43–52. doi: 10.1093/ons/opz355
36. Mascitelli JR, Schlachter L, Chartrain AG, Oemke H, Gilligan J, Costa AB, et al. Navigation-linked heads-up display in intracranial surgery: early experience. Oper Neurosurg. (2018) 15(2):184–93. doi: 10.1093/ons/opx205
37. Umebayashi D, Yamamoto Y, Nakajima Y, Fukaya N, Hara M. Augmented reality visualization-guided microscopic spine surgery: transvertebral anterior cervical foraminotomy and posterior foraminotomy. J Am Acad Orthop Surg Glob Res Rev. (2018) 2(4):e008. doi: 10.5435/JAAOSGlobal-D-17-00008
38. Jean WC. Mini-Pterional craniotomy and extradural clinoidectomy for clinoid meningioma: optimization of exposure using augmented reality template: 2-dimensional operative video. Oper Neurosurg. (2020) 19(6):E610. doi: 10.1093/ons/opaa238
39. Van Gestel F, Frantz T, Vannerom C, Verhellen A, Gallagher AG, Elprama SA, et al. The effect of augmented reality on the accuracy and learning curve of external ventricular drain placement. Neurosurg Focus. (2021) 51(2):E8. doi: 10.3171/2021.5.FOCUS21215
40. Bopp MHA, Sass B, Pojskic M, Corr F, Grimm D, Kemmling A, et al. Use of neuronavigation and augmented reality in transsphenoidal pituitary adenoma surgery. J Clin Med. (2022) 11(19):5590. doi: 10.3390/jcm11195590
41. Davis MC, Can DD, Pindrik J, Rocque BG, Johnston JM. Virtual interactive presence in global surgical education: international collaboration through augmented reality. World Neurosurg. (2016) 86:103–11. doi: 10.1016/j.wneu.2015.08.053
42. Dominique G, Kunitsky K, Natchagande G, Jalloh M, Gebreamlak AL, Lawal I, et al. Evaluation of augmented reality technology in global urologic surgery. Am J Surg. (2023) 226(4):471–6. doi: 10.1016/j.amjsurg.2023.05.014
43. Vyas RM, Sayadi LR, Bendit D, Hamdan US. Using virtual augmented reality to remotely proctor overseas surgical outreach: building long-term international capacity and sustainability. Plast Reconstr Surg. (2020) 146(5):622e–9e. doi: 10.1097/PRS.0000000000007293
44. Schneider M, Kunz C, Pal'a A, Wirtz CR, Mathis-Ullrich F, Hlavac M. Augmented reality-assisted ventriculostomy. Neurosurg Focus. (2021) 50(1):E16. doi: 10.3171/2020.10.FOCUS20779
45. Wright T, de Ribaupierre S, Eagleson R. Design and evaluation of an augmented reality simulator using leap motion. Healthc Technol Lett. (2017) 4(5):210–5. doi: 10.1049/htl.2017.0070
46. Fernandes J, Teles A, Teixeira S. An augmented reality-based mobile application facilitates the learning about the spinal cord. Educ Sci. (2020) 10(12):376. doi: 10.3390/educsci10120376
47. Henssen D, van den Heuvel L, De Jong G, Vorstenbosch M, van Cappellen van Walsum AM, Van den Hurk MM, et al. Neuroanatomy learning: augmented reality vs. cross-sections. Anat Sci Educ. (2020) 13(3):353–65. doi: 10.1002/ase.1912
48. Ille S, Ohlerth AK, Colle D, Colle H, Dragoy O, Goodden J, et al. Augmented reality for the virtual dissection of white matter pathways. Acta Neurochir (Wien). (2021) 163(4):895–903. doi: 10.1007/s00701-020-04545-w
49. Mendez-Lopez M, Juan MC, Molla R, Fidalgo C. Evaluation of an augmented reality application for learning neuroanatomy in psychology. Anat Sci Educ. (2022) 15(3):535–51. doi: 10.1002/ase.2089
50. Pickering JD, Panagiotis A, Ntakakis G, Athanassiou A, Babatsikos E, Bamidis PD. Assessing the difference in learning gain between a mixed reality application and drawing screencasts in neuroanatomy. Anat Sci Educ. (2022) 15(3):628–35. doi: 10.1002/ase.2113
51. Jain S, Timofeev I, Kirollos RW, Helmy A. Use of mixed reality in neurosurgery training: a single centre experience. World Neurosurg. (2023) 176:e68–76. doi: 10.1016/j.wneu.2023.04.107
52. Silvero Isidre A, Friederichs H, Muther M, Gallus M, Stummer W, Holling M. Mixed reality as a teaching tool for medical students in neurosurgery. Medicina (Kaunas). (2023) 59(10):1720. doi: 10.3390/medicina59101720
53. Mishra R, Narayanan MDK, Umana GE, Montemurro N, Chaurasia B, Deora H. Virtual reality in neurosurgery: beyond neurosurgical planning. Int J Environ Res Public Health. (2022) 19(3):1719. doi: 10.3390/ijerph19031719
54. Munawar A, Li Z, Nagururu N, Trakimas D, Kazanzides P, Taylor RH, et al. Fully immersive virtual reality for skull-base surgery: surgical training and beyond. Int J Comput Assist Radiol Surg. (2024) 19(1):51–9. doi: 10.1007/s11548-023-02956-5
55. Arora A, Swords C, Khemani S, Awad Z, Darzi A, Singh A, et al. Virtual reality case-specific rehearsal in temporal bone surgery: a preliminary evaluation. Int J Surg. (2014) 12(2):141–5. doi: 10.1016/j.ijsu.2013.11.019
56. Locketz GD, Lui JT, Chan S, Salisbury K, Dort JC, Youngblood P, et al. Anatomy-specific virtual reality simulation in temporal bone dissection: perceived utility and impact on surgeon confidence. Otolaryngol Head Neck Surg. (2017) 156(6):1142–9. doi: 10.1177/0194599817691474
57. Fang TY, Wang PC, Liu CH, Su MC, Yeh SC. Evaluation of a haptics-based virtual reality temporal bone simulator for anatomy and surgery training. Comput Methods Programs Biomed. (2014) 113(2):674–81. doi: 10.1016/j.cmpb.2013.11.005
58. Francis HW, Malik MU, Diaz Voss Varela DA, Barffour MA, Chien WW, Carey JP, et al. Technical skills improve after practice on virtual-reality temporal bone simulator. Laryngoscope. (2012) 122(6):1385–91. doi: 10.1002/lary.22378
59. Nash R, Sykes R, Majithia A, Arora A, Singh A, Khemani S. Objective assessment of learning curves for the VOXEL-MAN TempoSurg temporal bone surgery computer simulator. J Laryngol Otol. (2012) 126(7):663–9. doi: 10.1017/S0022215112000734
60. Zhao YC, Kennedy G, Yukawa K, Pyman B, O'Leary S. Can virtual reality simulator be used as a training aid to improve cadaver temporal bone dissection? Results of a randomized blinded control trial. Laryngoscope. (2011) 121(4):831–7. doi: 10.1002/lary.21287
61. Shao X, Yuan Q, Qian D, Ye Z, Chen G, le Zhuang K, et al. Virtual reality technology for teaching neurosurgery of skull base tumor. BMC Med Educ. (2020) 20(1):3. doi: 10.1186/s12909-019-1911-5
62. Begagic E, Beculic H, Pugonja R, Memic Z, Balogun S, Dzidic-Krivic A, et al. Augmented reality integration in skull base neurosurgery: a systematic review. Medicina (Kaunas). (2024) 60(2):335. doi: 10.3390/medicina60020335
63. Cabrilo I, Sarrafzadeh A, Bijlenga P, Landis BN, Schaller K. Augmented reality-assisted skull base surgery. Neurochirurgie. (2014) 60(6):304–6. doi: 10.1016/j.neuchi.2014.07.001
64. Cai S, Zhou Y, Shen J, Guo J, Xiong X, Jiang X. Augmented reality based surgical training and education system for neurosurgery. 2022 International Conference on Advanced Robotics and Mechatronics (ICARM); 9–11 July 2022 (2022).
65. Carl B, Bopp M, Sass B, Pojskic M, Nimsky C. Augmented reality in intradural spinal tumor surgery. Acta Neurochir (Wien). (2019) 161(10):2181–93. doi: 10.1007/s00701-019-04005-0
66. Dolati P, Gokoglu A, Eichberg D, Zamani A, Golby A, Al-Mefty O. Multimodal navigated skull base tumor resection using image-based vascular and cranial nerve segmentation: a prospective pilot study. Surg Neurol Int. (2015) 6:172. doi: 10.4103/2152-7806.170023
67. Ragnhildstveit A, Li C, Zimmerman MH, Mamalakis M, Curry VN, Holle W, et al. Intra-operative applications of augmented reality in glioma surgery: a systematic review. Front Surg. (2023) 10:1245851. doi: 10.3389/fsurg.2023.1245851
68. Roethe AL, Rosler J, Misch M, Vajkoczy P, Picht T. Augmented reality visualization in brain lesions: a prospective randomized controlled evaluation of its potential and current limitations in navigated microneurosurgery. Acta Neurochir (Wien). (2022) 164(1):3–14. doi: 10.1007/s00701-021-05045-1
69. Schiavina R, Bianchi L, Lodi S, Cercenelli L, Chessa F, Bortolani B, et al. Real-time augmented reality three-dimensional guided robotic radical prostatectomy: preliminary experience and evaluation of the impact on surgical planning. Eur Urol Focus. (2021) 7(6):1260–7. doi: 10.1016/j.euf.2020.08.004
70. Shafarenko MS, Catapano J, Hofer SOP, Murphy BD. The role of augmented reality in the next phase of surgical education. Plast Reconstr Surg Glob Open. (2022) 10(11):e4656. doi: 10.1097/GOX.0000000000004656
71. Kia K, Hwang J, Kim IS, Ishak H, Kim JH. The effects of target size and error rate on the cognitive demand and stress during augmented reality interactions. Appl Ergon. (2021) 97:103502. doi: 10.1016/j.apergo.2021.103502
72. Parekh P, Patel S, Patel N, Shah M. Systematic review and meta-analysis of augmented reality in medicine, retail, and games. Vis Comput Ind Biomed Art. (2020) 3:21. doi: 10.1186/s42492-020-00057-7
73. Thompson S, Schneider C, Bosi M, Gurusamy K, Ourselin S, Davidson B, et al. In Vivo estimation of target registration errors during augmented reality laparoscopic surgery. Int J Comput Assist Radiol Surg. (2018) 13(6):865–74. doi: 10.1007/s11548-018-1761-3
74. Citardi MJ, Yao W, Luong A. Next-generation surgical navigation systems in sinus and skull base surgery. Otolaryngol Clin North Am. (2017) 50(3):617–32. doi: 10.1016/j.otc.2017.01.012
75. Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: a systematic review. Med Image Anal. (2022) 77:102361. doi: 10.1016/j.media.2022.102361
76. Khor WS, Baker B, Amin K, Chan A, Patel K, Wong J. Augmented and virtual reality in surgery-the digital surgical environment: applications, limitations and legal pitfalls. Ann Transl Med. (2016) 4(23):454. doi: 10.21037/atm.2016.12.23
77. Plewan T, Mattig B, Kretschmer V, Rinkenauer G. Exploring the benefits and limitations of augmented reality for palletization. Appl Ergon. (2021) 90:103250. doi: 10.1016/j.apergo.2020.103250
78. Yoon JW, Chen RE, Kim EJ, Akinduro OO, Kerezoudis P, Han PK, et al. Augmented reality for the surgeon: systematic review. Int J Med Robot. (2018) 14(4):e1914. doi: 10.1002/rcs.1914
79. Alaraj A, Luciano CJ, Bailey DP, Elsenousi A, Roitberg BZ, Bernardo A, et al. Virtual reality cerebral aneurysm clipping simulation with real-time haptic feedback. Neurosurgery. (2015) 11(Suppl 2):52–8. doi: 10.1227/NEU.0000000000000583
80. Dharmawardana N, Ruthenbeck G, Woods C, Elmiyeh B, Diment L, Ooi EH, et al. Validation of virtual-reality-based simulations for endoscopic sinus surgery. Clin Otolaryngol. (2015) 40(6):569–79. doi: 10.1111/coa.12414
81. Thawani JP, Ramayya AG, Abdullah KG, Hudgins E, Vaughan K, Piazza M, et al. Resident simulation training in endoscopic endonasal surgery utilizing haptic feedback technology. J Clin Neurosci. (2016) 34:112–6. doi: 10.1016/j.jocn.2016.05.036
82. Varshney R, Frenkiel S, Nguyen LH, Young M, Del Maestro R, Zeitouni A, et al. The McGill Simulator for Endoscopic Sinus Surgery (MSESS): a validation study. J Otolaryngol Head Neck Surg. (2014) 43(1):40. doi: 10.1186/s40463-014-0040-8
83. Hooten KG, Lister JR, Lombard G, Lizdas DE, Lampotang S, Rajon DA, et al. Mixed reality ventriculostomy simulation: experience in neurosurgical residency. Neurosurgery. (2014) 10(Suppl 4):576–81; discussion 81. doi: 10.1227/NEU.0000000000000503
84. Yeung JT, Taylor HM, Nicholas PJ, Young IM, Jiang I, Doyen S, et al. Using Quicktome for intracerebral surgery: early retrospective study and proof of concept. World Neurosurg. (2021) 154:e734–e42. doi: 10.1016/j.wneu.2021.07.127
85. Mazur-Hart DJ, Yaghi NK, Shahin MN, Raslan AM. Stealth Autoguide for robotic-assisted laser ablation for lesional epilepsy: illustrative case. J Neurosurg Case Lessons. (2022) 3(6):CASE21556. doi: 10.3171/CASE21556
86. Bronowicki K, Antoniuk-Majchrzak J, Malesza I, Mozarowski W, Szymborska A, Pachuta B, et al. An attempt to evaluate the use of mixed reality in surgically treated pediatric oncology patients. NPJ Digit Med. (2025) 8(1):262. doi: 10.1038/s41746-025-01638-7
87. Satava RM. Virtual reality surgical simulator. The first steps. Surg Endosc. (1993) 7(3):203–5. doi: 10.1007/BF00594110
88. Kockro RA, Serra L, Tseng-Tsai Y, Chan C, Yih-Yian S, Gim-Guan C, et al. Planning and simulation of neurosurgery in a virtual reality environment. Neurosurgery. (2000) 46(1):118–35; discussion 35–7. doi: 10.1093/neurosurgery/46.1.118
89. Stredney D, Wiet GJ, Bryan J, Sessanna D, Murakami J, Schmalbrock P, et al. Temporal bone dissection simulation–an update. Stud Health Technol Inform. (2002) 85:507–13. PMID: 15458142
90. Oishi M, Fukuda M, Ishida G, Saito A, Hiraishi T, Fujii Y. Prediction of the microsurgical window for skull-base tumors by advanced three-dimensional multi-fusion volumetric imaging. Neurol Med Chir (Tokyo). (2011) 51(3):201–7. doi: 10.2176/nmc.51.201
91. Zawy Alsofy S, Nakamura M, Suleiman A, Sakellaropoulou I, Welzel Saravia H, Shalamberidze D, et al. Cerebral anatomy detection and surgical planning in patients with anterior skull base meningiomas using a virtual reality technique. J Clin Med. (2021) 10(4):681. doi: 10.3390/jcm10040681
92. Codd AM, Choudhury B. Virtual reality anatomy: is it comparable with traditional methods in the teaching of human forearm musculoskeletal anatomy? Anat Sci Educ. (2011) 4(3):119–25. doi: 10.1002/ase.214
93. Fried MP, Uribe JI, Sadoughi B. The role of virtual reality in surgical training in otorhinolaryngology. Curr Opin Otolaryngol Head Neck Surg. (2007) 15(3):163–9. doi: 10.1097/MOO.0b013e32814b0802
94. Hu A, Wilson T, Ladak H, Haase P, Doyle P, Fung K. Evaluation of a three-dimensional educational computer model of the larynx: voicing a new direction. J Otolaryngol Head Neck Surg. (2010) 39(3):315–22. PMID: 20470679
95. Kucuk S, Kapakin S, Goktas Y. Learning anatomy via mobile augmented reality: effects on achievement and cognitive load. Anat Sci Educ. (2016) 9(5):411–21. doi: 10.1002/ase.1603
96. Nicholson DT, Chalk C, Funnell WR, Daniel SJ. Can virtual reality improve anatomy education? A randomised controlled study of a computer-generated three-dimensional anatomical ear model. Med Educ. (2006) 40(11):1081–7. doi: 10.1111/j.1365-2929.2006.02611.x
97. Stepan K, Zeiger J, Hanchuk S, Del Signore A, Shrivastava R, Govindaraj S, et al. Immersive virtual reality as a teaching tool for neuroanatomy. Int Forum Allergy Rhinol. (2017) 7(10):1006–13. doi: 10.1002/alr.21986
98. Lai C, Lui JT, de Lotbiniere-Bassett M, Chen JM, Lin VY, Agrawal SK, et al. Virtual reality simulation for the middle cranial fossa approach: a validation study. Oper Neurosurg. (2024) 26(1):78–85. doi: 10.1227/ons.0000000000000915
99. Gurses ME, Gungor A, Gokalp E, Hanalioglu S, Karatas Okumus SY, Tatar I, et al. Three-dimensional modeling and augmented and virtual reality simulations of the white matter anatomy of the cerebrum. Oper Neurosurg. (2022) 23(5):355–66. doi: 10.1227/ons.0000000000000361
100. Gurses ME, Gungor A, Rahmanov S, Gokalp E, Hanalioglu S, Berker M, et al. Three-dimensional modeling and augmented reality and virtual reality simulation of fiber dissection of the cerebellum and brainstem. Oper Neurosurg. (2022) 23(5):345–54. doi: 10.1227/ons.0000000000000358
101. Corvino S, Piazza A, Spiriev T, Tafuto R, Corrivetti F, Solari D, et al. The sellar region as seen from transcranial and endonasal perspectives: exploring bony landmarks through new surface photorealistic three-dimensional model reconstruction for neurosurgical anatomy training. World Neurosurg. (2024) 185:e367–75. doi: 10.1016/j.wneu.2024.02.022
102. Dadario NB, Quinoa T, Khatri D, Boockvar J, Langer D, D'Amico RS. Examining the benefits of extended reality in neurosurgery: a systematic review. J Clin Neurosci. (2021) 94:41–53. doi: 10.1016/j.jocn.2021.09.037
103. Bailey SKT, Brannick MT, Reiner CC, Rettig N, Dyer LM, Okuda Y, et al. Immersive distance simulation: exploring the educational impact of stereoscopic extended reality (XR) video in remote learning environments. Med Teach. (2024) 46(9):1134–6. doi: 10.1080/0142159X.2024.2314725
104. Khan Z, Adil T, Oduoye MO, Khan BS, Ayyazuddin M. Assessing the knowledge, attitude and perception of extended reality (XR) technology in Pakistan’s healthcare community in an era of artificial intelligence. Front Med (Lausanne). (2024) 11:1456017. doi: 10.3389/fmed.2024.1456017
105. Li X, Elnagar D, Song G, Ghannam R. Advancing medical education using virtual and augmented reality in low- and middle-income countries: a systematic and critical review. Virtual Worlds. (2024) 3(3):384–403. doi: 10.3390/virtualworlds3030021
106. Mondal R. Role of augmented reality and virtual reality from the Indian healthcare education perspective—a systematic review. J Family Med Prim Care. (2024) 13(8):2841–50. doi: 10.4103/jfmpc.jfmpc_368_24
107. Zuhair V, Babar A, Ali R, Oduoye MO, Noor Z, Chris K, et al. Exploring the impact of artificial intelligence on global health and enhancing healthcare in developing nations. J Prim Care Community Health. (2024) 15:21501319241245847. doi: 10.1177/21501319241245847
108. Trowman R, Migliore A, Ollendorf DA. Designing collaborations involving health technology assessment: discussions and recommendations from the 2024 Health Technology Assessment International Global Policy Forum. Int J Technol Assess Health Care. (2024) 40(1):e41. doi: 10.1017/S0266462324000436
109. Chen B, Macdonald S, Attallah M, Chapman P, Ghannam R. A review of prototyping in XR: linking extended reality to digital fabrication. (2025).
110. Mancilla D, Moczygemba J. Exploring medical identity theft. Perspect Health Inf Manag. (2009) 6(Fall):1e. PMID: 20169017
111. Bui J, Lee D, Murthi S, Kantor T, Stegink C, Reddy RM. The role of extended reality in enhancing surgical training: a narrative review. Current Challenges in Thoracic Surgery. (2025) 7:10. doi: 10.21037/ccts-24-43
112. Le Noury P, Polman R, Maloney M, Gorman A. A narrative review of the current state of extended reality technology and how it can be utilised in sport. Sports Med. (2022) 52(7):1473–89. doi: 10.1007/s40279-022-01669-0
113. Abramovic A, Demetz M, Krigers A, Bauer M, Lener S, Pinggera D, et al. Surgeon’s comfort: the ergonomics of a robotic exoscope using a head-mounted display. Brain Spine. (2022) 2:100855. doi: 10.1016/j.bas.2021.100855
114. Demetz M, Abramovic A, Krigers A, Bauer M, Lener S, Pinggera D, et al. Cadaveric study of ergonomics and performance using a robotic exoscope with a head-mounted display in spine surgery. J Robot Surg. (2024) 18(1):6. doi: 10.1007/s11701-023-01777-7
115. Cossio S, Chiappinotto S, Dentice S, Moreal C, Magro G, Dussi G, et al. Cybersickness and discomfort from head-mounted displays delivering fully immersive virtual reality: a systematic review. Nurse Educ Pract. (2025) 85:104376. doi: 10.1016/j.nepr.2025.104376
116. Paudyal R, Shah AD, Akin O, Do RKG, Konar AS, Hatzoglou V, et al. Artificial intelligence in CT and MR imaging for oncological applications. Cancers (Basel). (2023) 15(9):2573. doi: 10.3390/cancers15092573
117. Bocanegra-Becerra JE, Acha Sanchez JL, Castilla-Encinas AM, Rios-Garcia W, Mendieta CD, Quiroz-Marcelo DA, et al. Toward a frontierless collaboration in neurosurgery: a systematic review of remote augmented and virtual reality technologies. World Neurosurg. (2024) 187:114–21. doi: 10.1016/j.wneu.2024.04.048
Keywords: artificial intelligence, augmented reality, extended reality, machine learning, mixed reality, neurosurgery, otolaryngology, skull base
Citation: Sanker V, Nordin EOR, Badary A, Eberle AT, Ramanan S, Jensen KN, Miller ME, Catapano JS, Huguenard AL, Rahmani R and Lawton MT (2025) Extended reality in skull base surgery: a systematic literature review. Front. Surg. 12:1642033. doi: 10.3389/fsurg.2025.1642033
Received: 9 June 2025; Revised: 22 October 2025;
Accepted: 21 November 2025;
Published: 15 December 2025.
Edited by:
M. Shahid Anwar, Gachon University, Republic of Korea
Reviewed by:
Sahin Hanalioglu, Hacettepe University, Türkiye
Alessandro Boaro, University of Verona, Italy
Baylar Baylarov, Hacettepe University, Türkiye
Copyright: © 2025 Sanker, Nordin, Badary, Eberle, Ramanan, Jensen, Miller, Catapano, Huguenard, Rahmani and Lawton. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Michael T. Lawton, neuropub@barrowneuro.org; Amr Badary