
ORIGINAL RESEARCH article

Front. Oncol., 08 January 2026

Sec. Head and Neck Cancer

Volume 15 - 2025 | https://doi.org/10.3389/fonc.2025.1695870

This article is part of the Research Topic: The Application of 3D Printing Technologies in Head and Neck Cancer Surgery

Virtual reality and 3D printing in head and neck cancer: an educational experience

Fabrizio Ferretti1*, Francesca Nonis2, Andrea Novaresio1, Enrica Panico1, Emanuele Zavattero1, Claudia Borbon1, Sandro Moos2, Enrico Vezzetti2, Massimo Fasolis1, Guglielmo Ramieri1
  • 1Department of Surgical Sciences, Division of Maxillofacial Surgery, University of Turin, Città della Salute e della Scienza Hospital, Torino, Italy
  • 2Department of Management and Production Engineering, Politecnico di Torino, Torino, Italy

Background: In recent years, the growing adoption of Virtual Reality (VR) and 3D printing technologies has revolutionized surgical training by providing innovative opportunities for hands-on education. This study investigates the combined use of VR and 3D printed personalized anatomical models and cutting guides within the field of oral and maxillofacial oncologic surgery.

Materials and methods: A mandibular tumour case was developed using the proposed approach, integrating both virtual and physical tools. Feedback was gathered from twelve surgical residents regarding their understanding of the case, the effectiveness of the immersive and three-dimensional technologies, and their overall satisfaction with the training experience.

Results: Participants reported enhanced comprehension of complex surgical scenarios and valued the practical utility of the VR simulator combined with 3D printed models. The immersive environment facilitated skill acquisition in a risk-free setting.

Conclusion: The findings underscore the significant added value of integrating VR and 3D printing technologies in surgical training, preparation, and simulation. This approach offers a safe, effective training platform that improves readiness for complex procedures in oncologic surgery and has the potential to be extended to other branches of maxillofacial surgery.

Introduction

The intricate anatomy of the head and neck region and the complexity associated with surgical procedures provide fertile ground for the integration of advanced technologies. Among these, Virtual Reality (VR) is emerging as an effective and intuitive tool for surgical planning (1), while three-dimensional (3D) modelling and printing technologies allow surgeons to conduct detailed and independent preoperative planning (2). Together, these technologies enhance the understanding and visualization of pathology, improve assessment of operability, and support individualized treatment strategies.

VR enables the creation of immersive, interactive environments for exploring anatomical regions, offering valuable applications in diagnosis, surgical planning, and training. Its use extends across multiple domains, from dental implantology and orthognathic surgery to complex 3D planning for post-tumour mandibular reconstructions and fracture repair, improving precision and safety through predictive planning and intraoperative navigation. In addition, VR provides a powerful educational platform, allowing standardized yet flexible learning experiences where trainees can visualize and practice surgical procedures in a controlled and replicable way (3, 4).

Similarly, 3D printing has transformed modern healthcare through the development of patient-specific models and implants, representing a key step toward precision medicine. In oral and maxillofacial surgery, its most relevant applications include traumatology, orthognathic surgery, and joint reconstruction, where custom-made solutions have led to improved surgical accuracy and clinical outcomes (5). Numerous studies have underscored the importance of these technologies in mandibular reconstruction following tumour resection (6).

In oncologic surgery, surgical precision, from preoperative planning to intraoperative execution, is crucial. This accuracy depends on the surgeon’s ability to spatially perceive the lesion and its anatomical relationships. While conventional computed tomography (CT) provides essential diagnostic data, it offers only a two-dimensional representation, limiting three-dimensional comprehension.

This study aims to evaluate the application of VR and 3D printing in head and neck oncologic surgery as part of an educational and training experience for young maxillofacial residents. The initiative is designed with a didactic purpose, demonstrating and explaining the decision-making process involved in surgical design and planning. Using a VR headset, specifically the Meta Quest Pro, participants who were not experts in the procedure planned a mandibular resection and reconstruction with fibula flap harvesting. The same procedure was carried out on 3D printed models. Finally, participants completed a questionnaire to assess their experience. From an educational standpoint, this pilot experience is grounded in the experiential learning theory underlying Dale’s Cone of Experience (7), which supports the use of immersive, practice-oriented activities to improve comprehension and skill acquisition.

This scalable and extendable approach may serve as a model for future training programs in complex head and neck surgery.

Materials and methods

Before starting the simulation, participants received a brief orientation session on how to use the VR environment, and the 3D printed cutting guides and anatomical models. No specific theoretical discussion of the surgical procedure was provided at this stage. After completing both practical sessions, participants engaged in a debriefing with an expert surgeon, during which the underlying surgical planning principles were explained and discussed.

Data collection aimed to assess the feasibility, usability, and educational potential of the proposed workflow. Specifically, procedure duration, accuracy in tumour localization and resection planning, and correct placement of osteotomy guides were measured to estimate the overall practicality and usability of the experience.

Participants

Twelve maxillofacial surgery residents (ten female), evenly split among first-, second-, and third-year trainees, participated in a hands-on educational session that combined a VR surgical simulator with 3D printed anatomical models of a mandibular malignant tumour. Upon entering the VR environment, participants were presented with an introductory panel that guided them through the different phases of the simulation. The same surgical workflow performed in VR was then replicated on stereolithographic models, which included a fibula, a mandible with the tumour in situ, and the corresponding cutting and repositioning guides for flap insertion.

For each participant, the following parameters were recorded: total time required to complete the exercise; the axial, sagittal, and coronal CT section plane best identifying the tumour; the CT angiography section localizing the distal osteotomy of the fibula; the chosen resection plane of the mandibular lesion; and the number of attempts required to correctly position and reposition both the cutting guides and the fibula flap. Prior exposure to immersive technologies was limited, with only four participants reporting previous experience, specifically half through recreational VR use and the other half through prior exposure to both VR and 3D printing in a medical setting.

VR environment

The VR simulator was developed as a training platform for junior residents (Figure 1) in maxillofacial surgery to introduce the core concepts and spatial understanding required for mandibular reconstruction using a free fibula flap. Rather than aiming to replicate the complexity of a complete surgical procedure, the simulator focused on essential steps in the oncologic and reconstructive workflow, presented in a structured, interactive format suitable for early-stage learners. The Meta Quest Pro headset, powered by the Qualcomm Snapdragon XR2+ platform with a resolution of 1800x1920 pixels per eye, was used in this educational experience. All interactions in the simulator were conducted using a hand-tracking system, allowing users to manipulate and explore virtual objects with natural hand gestures, without the use of physical controllers. During the procedure, an operator monitored the participant’s virtual environment in real time on a tablet viewer.

Figure 1
Two individuals are wearing black virtual reality headsets in a room. The person on the left, wearing green scrubs, stands while the person on the right, in a white top, gestures with one hand.

Figure 1. Two participants wearing the Meta Quest Pro headset while interacting with the VR environment.

Upon entering the VR environment, users were first introduced to the anatomical and radiological data of the case. DICOM files of the mandible affected by a tumour were available in both 2D (axial, coronal, and sagittal planes) and as a fully interactable 3D model. In addition, DICOM images from an angio-CT scan of the lower limb were provided to allow evaluation of the fibula and its vascular anatomy (Figure 2).

Figure 2
Two medical imaging displays show DICOM scans. The left display features images of a leg in axial, coronal, and sagittal views, with navigation buttons. The right display shows scans of a jaw and skull in similar views, alongside navigation options.

Figure 2. CT scans in axial, coronal, and sagittal planes and 3D models of the mandible (left) and the fibula (right).

The first task required participants to identify tumour margins and define the appropriate osteotomy planes directly on the 3D model of the mandible. After confirming their choices, the system generated a real-time comparison with the expert-defined surgical plan (Figure 3), enabling trainees to assess the accuracy of their decisions and develop a deeper understanding of oncologic resection planning. In the following step, users interacted with a virtual 3D model of the fibula to choose one of three predefined positions for the cutting guide (Figure 4). While this does not reflect the complete flexibility of intraoperative decision-making, the availability of fixed options ensured consistency in training and allowed for standardized assessment across participants. The selected guide determined how the fibula segments would be harvested in the subsequent reconstruction phase. This immersive interface facilitated intuitive interaction, though some participants reported a learning curve, particularly during tasks requiring precise positioning. In the final part of the simulation, users virtually removed the tumour using the mandibular cutting guide and proceeded to reconstruct the defect using fibula segments (Figure 5). These segments were manually placed into the mandibular gap, following a predefined alignment intended to facilitate the task and ensure uniformity across trainees. This standardized setup emphasized the spatial reasoning involved in reconstructive planning while avoiding the technical barriers of complete surgical realism.

Figure 3
A virtual reality simulation displaying 3D jaw models. On the left, a hand interacts with a translucent panel showing a skeletal jaw structure. On the right, a screen displays cross-sectional images of the jaw, with options for navigation below.

Figure 3. Osteotomy plane positioning. Osteotomy planes of the mandibular tumour defined by a participant in VR (left), and real-time comparison between the participant’s and the expert-defined osteotomy planes (right).

Figure 4
A 3D-rendered model of a human leg is encased in a clear, flexible material, resembling a medical training aid. The skeletal structure, including the foot, is visible. A small blue component is placed next to the model on the table.

Figure 4. 3D model of the fibula for positioning of the cutting guide.

Figure 5
A 3D model displays a jawbone structure fitted into a mechanical device, likely for dental or surgical purposes. The device includes various colored components: a blue frame, green and orange sections, and a pink rod. The setup is shown on a gray background.

Figure 5. Virtual reconstruction of the mandible with fibula segments.

3D printing

DICOM data exported from the CT of a mandibular solid tumour and the angio-CT of the lower limbs were segmented with Mimics Innovation Suite (Materialise). Slicing and print preparation were performed with PreForm (Formlabs), and models were printed on a Formlabs Form 3B (Formlabs Industries, Boston, USA) with a slice thickness of 0.2 mm using transparent printing resin. The case was printed at 1:1 scale.

Each participant interacted directly with the 3D printed models (Figure 6), following the key steps of the surgical simulation for mandibular reconstruction with a fibula free flap. After exploring the anatomical models, participants were asked to perform the guided osteotomies and to reposition the fibular segment within the mandibular defect using the printed cutting and repositioning guides. In addition, they were instructed to shape a custom titanium reconstruction plate to fit the defect, replicating the intraoperative workflow. Importantly, the 3D printed tasks mirrored the same surgical phases that participants had previously performed in the VR environment, ensuring methodological consistency between the virtual and physical simulations. For each resident, the total interaction time was recorded, along with the number of attempts required to correctly position the cutting guides and fibular flap. At the end of the session, participants were invited to provide structured feedback on their experience with the models and the simulation process.

Figure 6
Hands holding a translucent anatomical model, with a separate part next to it on a table. The model appears to represent a section of the human spine.

Figure 6. Participant interacting with 3D printed cutting guides and anatomical models.

Questionnaire

After the experimental session, participants completed a questionnaire, including the System Usability Scale (SUS) (8) and the Net Promoter Score (NPS) (9) to assess usability and user satisfaction. The SUS is a standardized 10-item questionnaire providing a global measure of perceived system usability, with scores above 68 considered acceptable and those above 80 indicating excellent usability (10). The NPS quantifies user satisfaction and willingness to recommend the system, ranging from –100 to +100. According to the creators of NPS, Bain & Company (11), scores above 0 are considered good, above 20 favourable, above 50 excellent, and above 80 world-class. It is calculated using the following formula:

NPS = ((number of promoters − number of detractors) / total number of respondents) × 100 = ((8 − 0) / 12) × 100 ≈ 67

where promoters are respondents who gave a score of 9 or 10, and detractors gave a score from 0 to 6 on the 0–10 likelihood-to-recommend scale.
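For readers who wish to reproduce the scoring, the two metrics can be sketched in a few lines of Python. The response values below are illustrative assumptions, chosen only so that the promoter/detractor counts match those reported in this study; they are not the participants' actual answers.

```python
def sus_score(responses):
    """SUS from ten 1-5 Likert responses, item 1 first.

    Odd-numbered items are positively worded and contribute (score - 1);
    even-numbered items are negatively worded and contribute (5 - score).
    The 0-40 sum is scaled to 0-100 by multiplying by 2.5.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

def nps(ratings):
    """NPS from 0-10 likelihood-to-recommend ratings.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    only toward the total number of respondents.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return (promoters - detractors) / len(ratings) * 100

# Hypothetical ratings for 12 respondents: 8 promoters, 4 passives,
# 0 detractors, matching the counts reported above.
ratings = [9, 10, 9, 9, 10, 9, 10, 9, 7, 8, 8, 7]
print(round(nps(ratings)))  # prints 67
```

Note that a respondent answering "strongly agree" (5) to every positively worded SUS item and "strongly disagree" (1) to every negatively worded one yields the maximum score of 100.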

Group comparisons for SUS and NPS were performed using the Kruskal–Wallis test, a non-parametric method chosen because of the small sample size, to determine whether the medians of the three groups (first, second, and third year) differed. To further explore the relationship between SUS and NPS scores, correlation was evaluated using Spearman’s rank correlation coefficient (r). Statistical significance was set at p < 0.05.
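As an illustration of the correlation step, the coefficient is simply Pearson's correlation applied to the ranks of the data (with ties receiving averaged ranks). A minimal pure-Python sketch follows; in practice a library routine such as scipy.stats.spearmanr, which also returns the p-value, would be used.

```python
def rankdata(values):
    """1-based ranks, with tied values receiving the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # extend j over the run of tied values
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman's rank correlation: Pearson's r on the ranked data."""
    return pearson(rankdata(x), rankdata(y))

# Hypothetical SUS and NPS-rating pairs, for illustration only.
print(spearman([68, 70, 73, 78, 80, 83], [7, 9, 8, 9, 10, 10]))
```

A perfectly monotone increasing relationship yields a coefficient of 1, a perfectly monotone decreasing one yields −1, and values in between reflect partial rank agreement, as in the moderate correlation reported in this study.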

Results

The mean time to perform the procedure in VR was 16.10 minutes, compared to 11.50 minutes on 3D-printed models. When stratified by training year, first-year residents took slightly longer on average (18.25 min) than second-year (16.40 min) and third-year residents (13.60 min), although differences were not statistically significant.

Tumour localization varied among participants, with differences observed across training years. Three participants correctly identified the lesion in the same axial section and five in the same sagittal section, while coronal sections differed. Notably, third-year residents more consistently identified the mandibular resection plane within oncological safe margins, whereas first- and second-year residents showed greater variability. Four participants accurately positioned the proximal resection plane. Cutting guides were successfully positioned on the fibula after three attempts and on the mandible on the first attempt. The fibula segment was correctly set within the mandible reconstruction on the first attempt.

Regarding usability evaluation and satisfaction, the average SUS score across all participants was 74.6, indicating good usability. The mean NPS score was 67, indicating strong user satisfaction and willingness to recommend the experience. Indeed, most participants were promoters (8) rather than detractors (0), reflecting an above-average level of acceptance and indicating a strong positive perception of the VR system. SUS and NPS scores are reported in Table 1.

Table 1

Table 1. System Usability Scale (SUS) and Net Promoter Score (NPS) reported for each participant.

No significant differences were observed among years for SUS (p=0.166) or NPS (p=0.964). Although the results did not reach statistical significance, a progressive increase in mean SUS scores from first- to third-year residents was observed, as shown in Figure 7 (left). In contrast, NPS values remained relatively stable across years, indicating consistent satisfaction levels regardless of training stage (Figure 7, right).

Figure 7
Two boxplots compare data sets. The left plot shows “SUS_1”, “SUS_2”, and “SUS_3” with median values around 70 to 80. The right plot shows “NPS_1”, “NPS_2”, and “NPS_3” with medians near 9.0 to 10.0.

Figure 7. Boxplots showing the distribution of System Usability Scale (SUS) scores (left) and Net Promoter Score (NPS) values (right) across training years (first, second, and third). Mean SUS values show a progressive increase from first- to third-year residents, suggesting an improvement in perceived usability with growing surgical experience, while NPS scores remain relatively stable across groups, indicating consistently high satisfaction levels.

To further explore the relationship between SUS and NPS scores, a Spearman correlation analysis (Figure 8) was computed. A moderate positive correlation was observed (r = 0.425), suggesting that participants who perceived higher usability tended to express higher willingness to recommend the VR system. However, the correlation did not reach statistical significance (p = 0.168), likely due to the small sample size. Visual inspection of the correlation plot shows a general upward trend, consistent with this moderate relationship, though with some dispersion reflecting individual differences in user perception and engagement.

Figure 8
Scatter plot titled “Matrix Plot of NPS; SUS” showing a positive Spearman correlation between NPS (Net Promoter Score) and SUS (System Usability Scale) with a correlation coefficient of 0.425 and a p-value of 0.168. Points are scattered, and a red line indicates the trend.

Figure 8. Scatterplot showing the correlation between System Usability Scale (SUS) and Net Promoter Score (NPS) across all participants. A moderate positive association was observed (Spearman’s r = 0.425, p = 0.168), indicating that higher perceived usability tended to correspond with greater user satisfaction and willingness to recommend the VR system.

Item-level analysis of the SUS further illuminated participant perceptions (Table 2). The statement “I think I would need the support of a technical person to be able to use this VR simulator” received the lowest score (12 out of 48), suggesting that several participants felt they would require assistance to operate the simulator independently, especially among first-year residents. Conversely, the highest-rated item was “I think I would like to use this system frequently” (42 out of 48), reflecting strong engagement and interest in continued use. All the other SUS items scored between 33 and 41, indicating a generally positive perception of usability. In particular, the SUS question “I thought the VR simulator was easy to use” scored relatively low (33 out of 48), ranking as the second-lowest item.

Table 2

Table 2. System Usability Scale (SUS) questionnaire, reporting the sum and average for each item.

Beyond usability metrics, all participants rated their overall satisfaction with the VR experience at the maximum score (5 out of 5) on a Likert scale. They also acknowledged the educational value of integrating VR and 3D printing for oncologic and reconstructive maxillofacial surgery and identified potential applications in other subspecialties, including orthognathic surgery (10 participants), aesthetic/malformation surgery (3), and traumatology (7).

Discussion

This study explored the combined use of VR and 3D printed anatomical models as an educational tool for young residents in maxillofacial surgery, focusing on mandibular oncologic reconstruction. The results demonstrate that immersive VR, complemented by physical 3D printed guides, provides a safe, engaging, and reproducible training environment that was well received by participants, including those with limited prior exposure to digital planning technologies. The VR environment developed for this study will be made available upon request to encourage replication and further research.

Dalgarno and Lee (12) have identified several key factors that make VR effective as an educational tool, including representational fidelity, immediacy of control, and a sense of presence. Immersion refers to the objective characteristics of the virtual environment, whereas presence combines these objective aspects with the user’s subjective engagement in the virtual space. A responsive and realistic VR system, therefore, can reproduce the feeling of “being there,” even in simulated or remote environments.

Computer-aided design (CAD) and 3D printing techniques began to be adopted in the 1990s for the diagnosis and treatment of complex head and neck and maxillofacial pathologies. Literature reports that these technologies can improve diagnostic accuracy by nearly 30%, procedural precision by more than 35%, and reduce operative time by about 18% (13). Nowadays, 3D printing plays a key role in reconstructive surgery, especially in mandibular reconstruction (14, 15). Previous studies have demonstrated the utility of 3D printing for preoperative rehearsal, anatomical study, and production of patient-specific surgical guides (16, 17). Zavattero et al. (18) also highlighted the feasibility and cost-effectiveness of in-house 3D printing for both clinical and educational applications, enabling customized planning without relying solely on commercial providers.

In our study, usability and user satisfaction were rated positively, with a mean System Usability Scale (SUS) score of 74.6, consistent with good usability. The high Net Promoter Score (NPS) of 67 reflects strong user satisfaction and willingness to recommend the experience, highlighting its perceived value for surgical training. All participants reported maximum satisfaction (5/5) on a Likert scale, reinforcing the educational benefit of the platform despite some operational challenges. When compared with previous studies adopting VR within the medical field, our mean SUS score of 74.6 falls within the expected range for early-stage educational VR systems. Bhat et al. (19) designed a VR framework for interactive and immersive visualization of CBCT data to enhance the understanding of complex dental structures and pre-operative planning in dentistry. Their evaluation involving 12 medical experts demonstrated excellent usability, with a SUS score of 87. Similarly, Haque et al. (20) developed a VR gamified system to support hand exercises for stroke rehabilitation, reporting a SUS score of 85 among patients, relatives, and physiotherapists. In contrast, Hsieh et al. (21) developed an immersive VR mirrored-hand system for upper-limb rehabilitation and reported a mean SUS score of 56.7, interpreted as only moderately acceptable usability and highlighting the need for improved user guidance.

In this context, our findings align well with the broader literature: the overall SUS score of 74.6 indicates good usability, consistent with effective and user-friendly VR applications, while the third-year residents achieved an average SUS of 80, reaching the threshold of excellent usability (10). Although statistical comparisons among training years did not reach significance, a progressive increase in SUS scores from first- to third-year residents was observed, suggesting that greater clinical experience and familiarity with digital planning tools may facilitate interaction with the VR environment. NPS values remained stable across training years, indicating consistent satisfaction levels regardless of experience. The moderate, though non-significant, positive correlation between SUS and NPS further suggests that higher perceived usability was generally associated with greater willingness to recommend the system. These findings reinforce that, while the system was well received overall, enhanced interface intuitiveness and user guidance could further optimize accessibility for less experienced users.

Item-level analysis of the SUS revealed areas for refinement and improvement. While most participants expressed enthusiasm for frequent use of the system, several indicated they would require technical support to operate it independently, especially first-year residents. The relatively low ratings for ease of use suggest that interface design and onboarding warrant further improvement, particularly for hand-tracking tasks such as positioning cutting guides on the mandible and fibula. These findings echo prior work showing that intuitive interactivity is essential for maintaining immersion and maximizing educational benefits in VR-based learning environments. Beyond usability, participants recognized the strong educational value of VR and 3D printing for oncological and reconstructive surgery, but also identified potential applications in orthognathic surgery, traumatology, and aesthetic or malformation surgery, underscoring the broader versatility of these technologies. This perception is consistent with literature describing the expanding role of VR and 3D printing in head and neck surgery, not only for preoperative planning but also for intraoperative guidance, patient communication, and multidisciplinary education (4).

From a pedagogical standpoint, this study aligns with experiential learning principles (7), which emphasize that active, hands-on experiences lead to deeper understanding and longer retention compared with passive learning. By combining immersive visualization with tangible manipulation of anatomical models, VR and 3D printing bridge the gap between theoretical study and surgical practice. A recent review by Tene et al. (22) identified VR as a prevalent tool in medical education, highlighting a positive trend toward improved educational outcomes but also stressing the need for more rigorous evidence of efficacy. The findings of the present study support this direction, demonstrating the feasibility, safety, and strong acceptance of immersive technologies in maxillofacial surgical training.

Nevertheless, some limitations must be acknowledged. First, this was a single-center pilot study involving a limited number of participants. Initially, eight residents from the first and second years of the maxillofacial surgery program were enrolled, and the cohort was later expanded to include third-year residents, for a total of twelve participants. Although this number reflects the entire group available within the institution, it inevitably restricts statistical power and generalizability. Future multicentric studies including larger samples across different levels of surgical experience will be necessary to validate these findings.

Second, no control group was included. This choice was intentional, as the aim of the study was not to demonstrate superiority over conventional teaching methods, but rather to assess the feasibility, usability, and perceived educational value of VR and 3D printing as complementary learning tools. Traditional cadaveric dissection (23) remains the reference standard for surgical education; however, its use is limited by high costs, ethical considerations, and restricted accessibility. In this context, virtual and 3D printed simulations offer reproducible, cost-effective, and low-risk alternatives that can supplement rather than replace traditional training.

Third, the study primarily relied on subjective assessments of usability and satisfaction. Although these data provide valuable insight into user experience, objective performance metrics were limited. The time required to complete the procedure was recorded as a preliminary indicator of task efficiency, but future studies should incorporate additional quantitative measures, such as osteotomy accuracy, margin identification error, and flap alignment deviation, to better evaluate the educational and technical impact of these tools.

Another important limitation concerns model validation. The VR and 3D printed models were derived from clinical CT and CTA datasets used for actual patient-specific planning, ensuring high anatomical fidelity. However, a formal validation comparing these digital and physical models with intraoperative or cadaveric anatomical data was not performed. Such validation would be essential in future research to confirm dimensional precision and strengthen confidence in these educational simulators.

Finally, the development of a realistic and interactive VR environment required significant collaboration between clinicians and engineers. Although this initial phase was time- and resource-intensive, the resulting modules are reproducible and scalable, allowing for future integration into broader training curricula at a relatively low incremental cost.

In conclusion, this pilot study supports the integration of VR and 3D printing into surgical education as a means to enhance learning experiences and engagement among residents. Despite the small cohort and single-institution design, participants reported high satisfaction and recognized these technologies as valuable complements to traditional training. With further refinement, objective validation, and multicentre collaboration, immersive and 3D technologies hold significant potential to improve preparedness for complex procedures, optimize surgical planning, and ultimately contribute to better surgical outcomes.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics statement

Ethical approval was not required for the studies involving humans in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study. Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.

Author contributions

FF: Conceptualization, Data curation, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing. FN: Conceptualization, Data curation, Formal Analysis, Investigation, Methodology, Software, Supervision, Validation, Writing – original draft, Writing – review & editing. AN: Conceptualization, Methodology, Project administration, Software, Supervision, Validation, Visualization, Writing – review & editing. EP: Validation, Writing – original draft. EZ: Supervision, Validation, Visualization, Writing – review & editing. CB: Supervision, Validation, Visualization, Writing – review & editing. SM: Conceptualization, Investigation, Methodology, Supervision, Validation, Visualization, Writing – review & editing. EV: Supervision, Writing – review & editing. MF: Conceptualization, Investigation, Methodology, Supervision, Validation, Visualization, Writing – review & editing. GR: Conceptualization, Methodology, Supervision, Validation, Visualization, Writing – review & editing.

Funding

The author(s) declared that financial support was not received for this work and/or its publication.

Acknowledgments

The authors would like to thank Fabio Roccia for his assistance and support.

Conflict of interest

The authors declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declared that generative AI was not used in the creation of this manuscript.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

1. Arias-Amezquita E, Alkureishi L, Purnell C, Zhao L, Lee O, Mathis SA, et al. Virtual reality planning in reconstructive surgery for orbital prosthetic rehabilitation using ImmersiveTouch platform: preliminary report. J Craniofac Surg. (2024) 35:392–7. doi: 10.1097/SCS.0000000000009794

2. Ayoub A and Pulijala Y. The application of virtual reality and augmented reality in Oral & Maxillofacial Surgery. BMC Oral Health. (2019) 19:1–8. doi: 10.1186/s12903-019-0937-8

3. Ganguli A, Pagan-Diaz GJ, Grant L, Cvetkovic C, Bramlet M, Vozenilek J, et al. 3D printing for preoperative planning and surgical training: a review. BioMed Microdevices. (2018) 20:65. doi: 10.1007/s10544-018-0301-9

4. Borbon C, Novaresio A, Iocca O, Nonis F, Moos S, Vezzetti E, et al. Evaluating osteotomy accuracy in mandibular reconstruction: a preliminary study using custom cutting guides and virtual reality. Diseases. (2025) 13:81. doi: 10.3390/diseases13030081

5. Ganry L, Hersant B, Bosc R, Leyder P, Quilichini J, and Meningaud JP. Study of medical education in 3D surgical modeling by surgeons with free open-source software: Example of mandibular reconstruction with fibula free flap and creation of its surgical guides. J Stomatol Oral Maxillofac Surg. (2018) 119:262–7. doi: 10.1016/j.jormas.2018.02.012

6. Zoabi A, Redenski I, Oren D, Kasem A, Zigron A, Daoud S, et al. 3D printing and virtual surgical planning in oral and maxillofacial surgery. J Clin Med. (2022) 11:2385. doi: 10.3390/jcm11092385

7. Dale E. Audiovisual Methods in Teaching. 3rd ed. Holt, Rinehart and Winston (1969).

8. Lewis JR. The system usability scale: past, present, and future. Int J Hum–Comput Interact. (2018) 34:577–90. doi: 10.1080/10447318.2018.1455307

9. Reichheld F. The Ultimate Question 2.0 (Revised and Expanded Edition): How Net Promoter Companies Thrive in a Customer-Driven World. Harvard Business Review Press (2011). p. 304.

10. Bangor A, Kortum P, and Miller J. Determining what individual SUS scores mean: adding an adjective rating scale. J Usability Stud. (2009) 4:114–23.

11. Measuring your Net Promoter Score℠. Bain. Available online at: https://www.netpromotersystem.com/about/measuring-your-net-promoter-score/ (Accessed October 23, 2025).

12. Dalgarno B and Lee MJW. What are the learning affordances of 3-D virtual environments? Brit J Educ Tech. (2010) 41:10–32. doi: 10.1111/j.1467-8535.2009.01038.x

13. Gateno J, Allen ME, Teichgraeber JF, and Messersmith ML. An in vitro study of the accuracy of a new protocol for planning distraction osteogenesis of the mandible. J Oral Maxillofac Surg. (2000) 58:985–90. doi: 10.1053/joms.2000.8740

14. Nuseir A, Hatamleh MM, Alnazzawi A, Al-Rabab’ah M, Kamel B, and Jaradat E. Direct 3D printing of flexible nasal prosthesis: optimized digital workflow from scan to fit. J Prosthodont. (2019) 28:10–4. doi: 10.1111/jopr.13001

15. Hatamleh MM, Ong J, Hatamleh ZM, Watson J, and Huppa C. Developing an in-house interdisciplinary three-dimensional service: challenges, benefits, and innovative health care solutions. J Craniofac Surg. (2018) 29:1870. doi: 10.1097/SCS.0000000000004743

16. Cunningham LL, Madsen MJ, and Peterson G. Stereolithographic modeling technology applied to tumor resection. J Oral Maxillofac Surg. (2005) 63:873–8. doi: 10.1016/j.joms.2005.02.027

17. Robiony M, Salvo I, Costa F, Zerman N, Bandera C, Filippi S, et al. Accuracy of virtual reality and stereolithographic models in maxillo-facial surgical planning. J Craniofac Surg. (2008) 19:482. doi: 10.1097/SCS.0b013e31814fb5c1

18. Zavattero E, Fasolis M, Novaresio A, Gerbino G, Borbon C, and Ramieri G. The shape of things to come: in-hospital three-dimensional printing for mandibular reconstruction using fibula free flap. Laryngoscope. (2020) 130:E811–6. doi: 10.1002/lary.28650

19. Bhat SH, Hareesha K, Kamath AT, Kudva A, Vineetha R, and Nair A. A framework to enhance the experience of CBCT data in real-time using immersive virtual reality: impacting dental pre-surgical planning. IEEE Access. (2024) 12:45442–55. doi: 10.1109/ACCESS.2024.3375770

20. Haque MDS, Zhang I, Ferwerda B, Porras J, Rahman SMM, Rahman AMM, et al. Immersive Virtual Reality to Support Hand Exercises for Stroke Rehabilitation: A Design Science Research Approach. In: Wiafe I, Babiker A, Ham J, Oyibo K, and Vlahu-Gjorgievska E, editors. Persuasive Technology. PERSUASIVE 2025 Satellite Events. Springer Nature Switzerland, Cham (2026). p. 3–17. doi: 10.1007/978-3-031-97177-8_1

21. Hsieh Y-W, Howe T-H, Lee M-T, Tai R-Y, and Chen C-C. Design and usability evaluation of an immersive virtual reality mirrored hand system for upper limb stroke rehabilitation. Sci Rep. (2025) 15:5781. doi: 10.1038/s41598-025-90698-6

22. Tene T, Vique López DF, Valverde Aguirre PE, Orna Puente LM, and Vacacela Gomez C. Virtual reality and augmented reality in medical education: an umbrella review. Front Digit Health. (2024) 6:1365345. doi: 10.3389/fdgth.2024.1365345

23. Ghosh SK. Cadaveric dissection as an educational tool for anatomical sciences in the 21st century. Anat Sci Educ. (2017) 10:286–99. doi: 10.1002/ase.1649

Keywords: maxillofacial surgery, virtual reality, three-dimensional printing, mandibular reconstruction, fibula free flap, surgical education, user-centered design

Citation: Ferretti F, Nonis F, Novaresio A, Panico E, Zavattero E, Borbon C, Moos S, Vezzetti E, Fasolis M and Ramieri G (2026) Virtual reality and 3D printing in head and neck cancer: an educational experience. Front. Oncol. 15:1695870. doi: 10.3389/fonc.2025.1695870

Received: 30 August 2025; Revised: 27 October 2025; Accepted: 12 December 2025;
Published: 08 January 2026.

Edited by:

Manuel Tousidonis, Gregorio Marañón Hospital, Spain

Reviewed by:

Sonia Tewani Orcutt, University of Arkansas for Medical Sciences, United States
Wang-yong Zhu, The University of Hong Kong, China

Copyright © 2026 Ferretti, Nonis, Novaresio, Panico, Zavattero, Borbon, Moos, Vezzetti, Fasolis and Ramieri. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Fabrizio Ferretti, fabrizio.ferretti@unito.it