- 1The 1st Ward of Joint Surgery, Joint Surgery Department, Tianjin Hospital, Tianjin, China
- 2Orthopedic Department, Tianjin Hexi Hospital, Tianjin, China
- 3College of Artificial Intelligence and Data Science, Hebei University of Technology, Tianjin, China
- 4Beijing Geriatric Healthcare and Disease Prevention Center, Xuanwu Hospital, Capital Medical University, Beijing, China
With advancements in computer vision, artificial intelligence, and other cutting-edge science and technologies, the focus of modern surgical technology has increasingly shifted towards intelligent, digital, minimally invasive, and precision approaches. Augmented reality (AR) technology and surgical robotics have emerged as significant research areas in total hip and knee replacement. Navigation systems, pivotal to both AR and robotic surgery, guide surgical operations through techniques shared by the two approaches. Recent developments in navigation systems for hip and knee replacement have focused on more natural, intelligent, and efficient methodologies. The use of AR and surgical robots for navigation has significantly enhanced the safety and accuracy of these procedures. Importantly, these technologies eliminate the need to implant positioning screws or other reference objects into the bone structure, thereby markedly reducing the risk of severe complications, such as lower limb pain and fractures. This study reviews the current applications, main challenges, and solutions associated with AR and surgical robot navigation systems for total hip and knee replacements.
1 Introduction
With the rapid advancement of artificial intelligence, precision instruments, computer vision, and other sophisticated technologies, robot-assisted surgery offers significant benefits, including minimal trauma, reduced risk of postoperative infections, and faster recovery. Currently, this technology is extensively applied in total hip and knee replacements (1–3). The primary advantage of surgical robots is their ability to combine the precision and dexterity of machines with the expertise of surgeons (4). The surgical robotic system incorporates multidisciplinary technologies, providing surgeons with capabilities for surgical planning, positioning, measurement, and enhanced visualization. These functions significantly improve the accuracy, safety, and reproducibility of surgical operations (5–8). Moreover, as medical imaging and computer technologies continue to advance, the concept of robot-assisted surgery has gained acceptance and recognition in the medical community. This concept encompasses various critical components, such as virtual surgical systems and surgical navigation systems (9). The fluency and precision of the surgical robot navigation system are crucial: the system requires preoperative measurement of 3D data and intraoperative real-time registration of anatomical structures. Using an optical positioning device at the robot's end, visualization software displays the patient's operative field, surgical instruments, and the surgeon's posture in real time, enabling precise navigation (10).
Augmented reality (AR) supplements real-world environments with computer-generated elements such as sound, video, and images, creating an enhanced sense of reality (11). This technology significantly improves the perception of the surgical environment, particularly for minimally invasive procedures. It not only helps doctors accurately position targets and improve patients' postoperative rehabilitation outcomes and quality of life but also provides a rich visualization of anatomical structures. This expansion of surgical visibility simplifies complex operations (12). In surgical applications, AR offers several main advantages: ① It seamlessly integrates real and virtual scenes, displaying them in real-time within the surgeon's field of view, which enhances the intuitiveness and locality of visual information. The system serves as an intraoperative navigational aid; ② A three-dimensional virtual model, reconstructed from two-dimensional tomographic imaging, compensates for limitations such as restricted visual space, poor image quality, and lack of depth perception; ③ When preoperative planning is conducted using a 3D virtual model, more precise and detailed surgical routes and areas can be designed. This approach also vividly portrays the spatial relationships among human tissues, surgical paths, and areas, reducing dependence on the surgeon's experience and imagination (13). During surgery, the AR surgical navigation (ARSN) system visually displays a translucent three-dimensional model of the target organ within the surgical field, aiding the surgeon in understanding the anatomy of the area (14). The application of AR technology in the surgical navigation of total hip and knee replacements is not constrained by factors such as cost, space, or equipment maintenance, thereby making clinical deployment more feasible.
In robotic surgical systems, the integration of AR technology compensates for reduced sensory perception, simplifies the surgical process, and enhances both the safety and accuracy of operations (11). Current surgical robots are equipped with comprehensive human-computer interaction systems whose hardware and visualization capabilities adequately support the software and hardware requirements of an augmented reality navigation system. The incorporation of an augmented reality navigation system in robotic surgery not only facilitates easier operation but also offers greater cost-effectiveness (15). This paper reviews the current use of AR and surgical robot navigation systems for total hip and knee replacement, addressing the major challenges and solutions related to the deployment of ARSN systems in these procedures.
2 Augmented surgical navigation system
2.1 Navigation system based on touch control equipment
The touch control device operates in a manual input mode, requiring direct contact during use. Navigation systems that utilize touch control are exemplified by hand touch screens, which can function both as input devices that capture surgeons' movements and as output devices that provide tactile feedback (16). The leading touch interaction technologies include 3D tactile feedback touchscreens and touch-sense tactile feedback technology (17–19). However, touch-based navigation systems present significant challenges in clinical surgery: surgeons must often move away from the operative site to interact with the touchscreen, which disrupts the surgical workflow and hand-eye coordination and can compromise the procedure. Consequently, AR navigation systems that rely on touch devices are seldom used in surgical settings.
2.2 Navigation system based on voice interaction
Speech recognition technology has evolved into a robust human-computer interaction method within AR systems, making voice the preferred mode of interaction in scenarios where traditional modes (such as hardware devices or touch control) are impractical (20). In AR settings, direct voice or voice-assisted interactions are possible. The focus is on the voice recognition engine, with major engines in the market including Speech API, Via Voice, and iFlytek (21). Voice interaction navigation systems in surgery offer several advantages: ① Voice interaction represents a novel, more convenient, and natural form of interactive control; ② As a close-range, non-contact type of control, it allows physicians to perform all control actions from a distance, thereby reducing contamination risks; ③ Recent studies have shown that integrating preset voice interaction commands can seamlessly connect the visualization system of surgery with the motion control system of the surgical robot. This integration enhances the cohesion of the robotic surgical system, simplifies the management of multiple subsystems, and ensures synchronization between visual information and robot motion (22–24).
2.3 Navigation system based on somatosensory interaction
A navigation system based on motion-sensing interaction technology utilizes optical capture principles and sensors or computer vision equipment to facilitate intraoperative navigation via the surgeon's gestures, eye movements, and other physical activities (25). For example, Ortega et al. used a micro-optical head-mounted display to show intraoperative perspective images in spinal fracture internal fixation surgeries. This system allows surgeons to use gestures to enlarge, shrink, and rotate images, thereby aiding in understanding the characteristics of the surgical site. Clinical trials involving 50 cases demonstrated that the micro-optical system significantly reduced the time surgeons spent looking at the image displays (26). In another system, real-time position information on the identified surgical instruments is transmitted to a HoloLens device, enabling synchronous display of medical images such as surgical instrument and lesion models; this provides intraoperative visualization of lesion images and facilitates lesion positioning and surgical instrument navigation through gesture interactions (27).
Owing to their convenience and robust interaction, somatosensory-interaction-based surgical navigation systems are becoming the primary development direction for augmented reality navigation in total hip and knee replacement. Fotouhi et al. proposed an AR solution for positioning the acetabular cup in total hip arthroplasty (28). This system plans cup placement using two-dimensional x-ray images taken after acetabular reaming combined with three-dimensional AR visualization, aligning point-cloud data with the planned path to accurately position the cup (28). However, the study noted that the initial positioning of the pelvis in total hip replacement often reduces the accuracy of AR system image registration (29). To address this issue, researchers have performed preoperative pelvic radiography before AR image registration during THA, adjusting the angle of the operating table to enhance the accuracy of the AR navigation system (30). Total hip replacement poses greater challenges in special cases such as ankylosing spondylitis, where traditional surgical methods typically require repeated fluoroscopy to determine the position of the hip prosthesis. Recent studies have shown that AR can facilitate 3D imaging, enabling surgeons to observe the hip joint anatomy in real time (31). Although studies are limited, existing research confirms that AR navigation technology can achieve more precise tilt and anteversion angles in total hip arthroplasty, potentially extending the lifespan of the prosthesis (32–34).
3 Robot navigation system for total hip and knee replacement surgery
Owing to the rigid nature of bones compared with that of other body parts, the application of surgical robots in joint surgery presents challenges, making the joint replacement robot system a focal point in orthopedics (35). For total knee replacement, the navigation system acts as a guide, offering preoperative planning and assisting doctors in precisely locating the implant position (36). The primary purpose of the navigation system is to facilitate operations and enhance surgical safety, making it a crucial component of robotic surgical systems.
The Mako knee joint replacement surgical robot system, developed by Stryker, pioneered the integration of an optical external tracker into the surgical tool navigation system, accurately facilitating surgical navigation during operations (37). In 2016, Bell successfully performed a total knee replacement with the assistance of the Mako surgical robot; the results indicated that, compared with traditional methods, the installation error in robot-assisted total knee replacement was maintained within 2° (38). The French company Medtech introduced the ROSA surgical robot, which integrates a 6-DOF robotic arm with tactile sensors and a touch-screen workstation into a single platform. Its navigation system uses touch interaction technology for robot registration, target trajectory planning, and intraoperative image guidance, all completed on the platform (39). Several studies have confirmed that the precision of joint prosthesis placement in robot-assisted total knee replacement exceeds that of traditional surgery. Rossi et al. reported average errors of 0.5 ± 0.6° for the femoral inversion and inclination angles in robot-assisted procedures, better than the AR navigation system group's 0.59 ± 0.55° and 0.7 ± 0.75° (40). Hampp et al. found that the mean error for the femoral inversion angle in robot-assisted total knee arthroplasty (0.6 ± 0.3°) was significantly lower than that in the conventional surgery group (1.1 ± 1.6°) (41).
In recent years, China has made significant advancements in the development of joint-replacement surgical robots. Several domestic models such as HURWA, ARTHROBOT, Bone, and Honghu have been introduced into the market, showing promising medium- and long-term clinical outcomes. Specifically, in the HURWA robotic surgery system, the accuracy of the lower limb force line reconstruction in total knee arthroplasty patients reached 81.2%, compared with only 63.5% in the traditional surgery group. This system can safely and reliably support the execution of robot-assisted total knee replacements. The robot integrates cutting-edge technologies such as virtual fixtures and shared control, enabling precise surgical operations (42).
4 Clinical outcomes from AR and robot navigation systems
4.1 Clinical outcomes from AR in total hip and knee replacement
The effect of AR in knee replacement surgery is influenced by the early negative effect of cutting errors. Some researchers have attempted a volume-subtraction technique to decrease cutting error (43). The results showed a significant improvement in alignment accuracy from 0.55–0.64 to 0.40–0.55 mm, while the frame rate decreased from 12–13 to 9–10 frames/s (43). Valenti (44) developed an automatic method for recovering the 3D pose of the knee to ensure implantation of the prosthesis in the correct pose; the proposed system allowed successful placement of the prosthesis within the planned coordinates, with minor errors of 2 cm and 2°. In THA, differences in radiographic inclination were significantly smaller with an AR-based portable navigation system than with an accelerometer-based portable navigation system (2.5 ± 1.7° vs. 4.6 ± 3.1°) (45). Even when THA is performed with the patient in the lateral decubitus position, the AR-HIP system allows the surgeon to create a three-dimensional coordinate system of positional information for placement of the acetabular cup with the patient in the supine position, which is very convenient for surgeons (46). One study confirmed that measurement using AR-HIP was significantly more accurate for radiographic anteversion than using a goniometer 3 months after THA (2.7° vs. 6.8°) (47). However, the procedure time was longer in the navigation group (95 min vs. 57 min) (48). In terms of mid-term postoperative outcomes, no difference was observed between the AR-based portable hip navigation system and the conventional technique in HOOS improvement (27 ± 17 vs. 28 ± 19) at 6 months after THA (48).
4.2 Clinical outcomes from robot navigation systems in total hip and knee replacement
A prospective, randomized controlled trial showed that the operation time in the robot-assisted TKA group was significantly longer than that in the traditional TKA group, with no significant difference in intraoperative blood loss between the two groups (49). Three months after surgery, gait analysis showed that the flexion and extension angles in the RATKA group were significantly larger than those in the traditional TKA group and that the lateral tibial component angle was significantly smaller, and thus closer to the ideal value, in the RATKA group (49). Robot-assisted TKA is associated with reduced postoperative pain, decreased analgesia requirements, a smaller reduction in postoperative hemoglobin levels, shorter time to straight leg raise, fewer physiotherapy sessions, and improved maximum knee flexion at discharge compared with conventional TKA (50). The median time to hospital discharge was 77 h for robot-assisted TKA compared with 105 h for conventional TKA (50). Two years after TKA, the robot-assisted group displayed a trend towards higher SF-36 quality-of-life scores, with significant differences in the SF-36 vitality and role-emotional domains (51). A systematic review and meta-analysis showed that RATKA resulted in a better Knee Society Score than manual TKA (mTKA) in short- to mid-term follow-up (52). Compared with conventional TKA, RATKA also improved component positioning and alignment (−1.30 to −3.02 degrees) (53).
5 Augmented reality and major problems facing surgical robot navigation systems
5.1 Technical challenges
5.1.1 Data transmission delay
The primary purpose of the navigation system is to provide the surgeon with a three-dimensional model that displays the anatomical situation and any changes in the patient's lesions, thereby facilitating precise pre-surgical planning. Additionally, the system presents real-time updates on the patient's surgical site and ongoing operation, allowing the surgeon to assess the surgical outcome as it unfolds. During surgery, it is crucial to ensure that the model is transmitted and accurately projected to the surgeon-specified location. The hardware must rapidly process and update the model information in response to the surgeon's movements and adjustments, thus demanding high real-time performance in model transmission and feedback. The speed of the processor and communication within a computer system can introduce delays in data transmission, which pose significant challenges to the clinical implementation of this technology. A single-center clinical randomized study demonstrated that using the AR navigation system in total knee arthroplasty (TKA) added an average of 5 ± 1 min to the procedure. However, the precision of the surgical incision and prosthesis placement was superior to that in the traditional surgery group. These findings underscore the potential benefits of AR navigation technology in arthroplasty (54).
5.1.2 Information processing capability of the navigation system
Both AR navigation systems and surgical robot navigation systems rely on various computer vision technologies to register the three-dimensional (3D) model of the patient's site with real-scene images. During surgery, these systems facilitate surgeon interactions and guide operations, allowing the real-time evaluation of surgical outcomes. Consequently, the internal processor of a surgical robot requires robust information processing capabilities to ensure that relevant data can be accessed in real time. This requirement poses significant challenges for the processor and memory of surgical robot systems. As surgical complexity increases, the number of system model parameters also increases, adding to the computational load on the internal processor. This escalation demands enhanced capabilities from graphics processing units (GPUs). Future research will need to explore various optimization techniques, such as Bayesian hyperparameter optimization, integration of the transformer self-attention mechanism, and gating units, to enhance the efficiency of the internal processor model and ensure the precision and effectiveness of intraoperative navigation.
5.1.3 Image registration and tracking
Implanting navigation and positioning nails is a prerequisite for operating navigation systems during joint replacement surgery. Surgical robot navigation systems typically rely on external tracking mechanisms that require fixed positioning benchmarks (navigation and positioning nails) attached to the bone and surgical instruments to identify and guide robotic arm movements (55). In systems that use reflective balls for optical tracking, a benchmark must be implanted and secured to the patient's tissue, which is invasive and physically disruptive (56). This alters the standard surgical workflow, introduces additional procedural steps, extends the operation time, and increases surgical risks. Ossendorf and Jung described the world's first robot-assisted total joint replacement in 2006 (57). Bonutti et al. later confirmed that overweight and obesity are significant risk factors for positioning-needle fractures after robot-assisted total knee arthroplasty (58). Jung et al. found that repeated intraoperative implantation of navigation and positioning nails could lead to abnormal lower limb pain (55).
Because the bone inevitably moves during total hip or knee replacement, the statically registered pose must be continuously updated by dynamic target tracking. In a practical setup, however, markerless tracking and registration algorithms in AR can produce unreliable registration results, particularly in rotation (59). During marker-based tracking, the initially registered bone pose is continuously updated by monitoring the movement of an optically tracked dynamic reference frame, which results in a long workflow (60) and possible human-induced errors (61). Markerless tracking and registration algorithms have therefore been proposed for knee surgery (62). In one approach, segmented femur points are registered in real time to a pre-scanned model of the corresponding limb using the iterative closest point method to obtain the spatial knee pose (63). The best-reported registration accuracies are 6.66° and 2.74 mm with the target held static, which is not yet acceptable for clinical applications (64).
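The iterative closest point step mentioned above can be sketched compactly. The following is an illustrative Python example, not the cited implementation: it assumes NumPy, small synthetic point sets, and brute-force nearest-neighbour matching. A Kabsch solver estimates the rigid transform for the current correspondences, and the loop alternates matching and alignment until the segmented points settle onto the pre-scanned model.

```python
import numpy as np

def best_rigid_transform(src, dst):
    # Kabsch algorithm: least-squares rotation R and translation t
    # such that dst ~ src @ R.T + t.
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, iters=30):
    # Iterative closest point: alternate nearest-neighbour matching
    # and rigid alignment of src onto dst.
    cur = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # Brute-force nearest neighbours (fine for toy point sets;
        # a real system would use a k-d tree and outlier rejection).
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matches = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matches)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total, cur
```

Convergence of this scheme depends on a reasonable initial pose: with a small initial misalignment the nearest-neighbour correspondences are mostly correct and the loop converges, which is why intraoperative systems still need a coarse initial registration step.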
5.2 Clinical challenges
Even though application accuracy can be improved by various methods that optimize the initial patient-to-image registration, accuracy decreases continually throughout the surgical procedure and, in the worst case, may lead to an unacceptable mismatch in AR or surgical robot navigation systems (65). Most commercially available navigation systems enable intraoperative landmark-based registration updates to overcome alterations in the spatial relationships of the reference array (66). Readily available and uniquely identifiable landmarks, such as positioning nails, can therefore be acquired and used at any time to restore accuracy if a discrepancy is observed in paired-point registration (66). However, loss of accuracy up to the point of acquisition (e.g., skin incision, interchange of reference arrays) cannot be compensated for in this way, and the same applies to the effects of knee shift. Moreover, owing to the lack of an intraoperative feedback system, especially in fully automatic robots, any malfunction during the operation has serious consequences (67). Currently, most joint replacement surgical robots lack intraoperative sensory feedback, such as touch, toughness, and temperature sensing, which can easily lead to accidental injuries (68).
5.3 Economic factors
The application of AR and robot assistance incurs additional costs compared with the conventional technique. However, a 90-day episode-of-care cost analysis of RA-TKA showed that cost reductions relative to the conventional technique are possible because of fewer readmissions and more economical discharge destinations (69). A Markov decision analysis calculated that a surgical volume of at least 253 cases per robot per year is needed to demonstrate cost-effectiveness under the predetermined parameter values (70). At a minimum follow-up of 10 years, one study found no differences between robot-assisted and conventional TKA in functional outcome scores, aseptic loosening, overall survivorship, or complications; given the additional time and expense associated with robot-assisted TKA, the researchers could not recommend its widespread use (71). Meanwhile, another study found that 51% of robotic UKA manuscripts were industry-funded or had authors with financial conflicts of interest, compared with 29% of non-robotic UKA papers (72).
5.4 Surgeon training and cognitive load
A learning curve in operative time has been observed with the introduction of RATKA, varying with the surgeon's experience and case volume (73). Reported learning curves range from 6 to 43 cases (36, 74, 75). One study found that the introduction of the RATKA system was associated with a learning curve of 8.7 cases for operative time, and operative times were similar between the RATKA and conventional TKA groups. This short learning curve implies that the RATKA system can be adopted relatively quickly by a surgical team with minimal risk to patients (76). Meanwhile, a study showed that haptic feedback delivered through AR can guide participants using a tool to submillimeter and subdegree accuracy with little training (77).
6 Augmented reality-robotic surgical navigation system solutions
Total hip and knee replacement surgery involves not only bone tissue but also the surrounding soft tissue. Surgical manipulation shifts tissue position, which increases the complexity of image tracking. Researchers have explored various solutions to the image registration and tracking challenges in navigation systems. Mur-Artal et al. (78) proposed ORB-SLAM2 (Oriented FAST and Rotated BRIEF Simultaneous Localization and Mapping 2), which effectively eliminates accumulated errors in image tracking and can automatically relocalize after tracking failures. Bescos et al. (79) introduced a deep learning-based surgical scene segmentation method capable of identifying real-time soft tissue displacement in a dynamic environment, thereby improving the accuracy of simultaneous localization and mapping.
The AR-assisted joint replacement surgical robot system overcomes the reliance on navigation and positioning nails for intraoperative image registration and tracking by utilizing 2D/3D medical image registration technology. The system guides the surgical robot during the procedure by aligning preoperative 3D images with real-time intraoperative 2D images. Liao et al. (80) leveraged the characteristics of 2D x-ray images, employing digitally reconstructed radiograph (DRR) technology to directly measure registration errors between 3D CT images and intraoperative x-ray images, thereby enhancing positioning accuracy.
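The DRR idea can be illustrated with a toy sketch, not Liao et al.'s method: a hypothetical parallel-beam projector renders a simulated radiograph of a synthetic CT volume for each candidate pose, and the pose whose rendering best matches the intraoperative x-ray (here, in mean-squared intensity error over a one-parameter rotation) is taken as the registration.

```python
import numpy as np

def drr(volume, angle_deg):
    # Toy digitally reconstructed radiograph: rotate the cubic volume
    # about the z axis (nearest-neighbour resampling) and integrate
    # along one axis to simulate a parallel-beam projection.
    a = np.deg2rad(angle_deg)
    n = volume.shape[0]
    c = (n - 1) / 2.0
    xs, ys = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    # Inverse-rotate each target coordinate back into the source volume.
    xr = np.clip(np.rint(c + (xs - c) * np.cos(a) - (ys - c) * np.sin(a)),
                 0, n - 1).astype(int)
    yr = np.clip(np.rint(c + (xs - c) * np.sin(a) + (ys - c) * np.cos(a)),
                 0, n - 1).astype(int)
    return volume[xr, yr, :].sum(axis=1)

def register(volume, xray, candidate_angles):
    # Exhaustive search: return the candidate angle whose simulated
    # radiograph best matches the measured x-ray.
    errors = [((drr(volume, a) - xray) ** 2).mean() for a in candidate_angles]
    return candidate_angles[int(np.argmin(errors))]
```

Real 2D/3D registration optimizes all six pose parameters with ray-traced DRRs and gradient-based or learned search, but the error-driven render-and-compare loop shown here is the core of the approach.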
To address the issues of image transmission delays and prolonged system feedback processing times caused by the high computational complexity of the internal model in surgical robot navigation systems, researchers have explored lightweight optimization of deep learning algorithms. These improvements enhance the model's information processing capabilities, reduce system feedback time, and ensure real-time data and image transmission. Researchers successfully achieved 3D/2D image registration of spinal joints using a reinforcement learning algorithm, demonstrating a mean square error of just 0.0858 and an average processing time of only 6.54 s (81). Additionally, studies have confirmed that segmenting intraoperative 2D images using a DeepLabv3+ neural network, enhanced by an attention mechanism and contour extraction, significantly improves registration accuracy with preoperative 3D images (82).
7 Future and outlook
The AR–surgical robot navigation system significantly improves the safety and accuracy of total hip and knee replacement procedures. Unlike traditional methods, it eliminates the need to implant positioning screws or other reference objects into the bone structure, effectively reducing the risk of serious complications such as lower limb pain and fractures. Current research on AR surgical robot navigation systems in joint surgery primarily remains at the stage of technical feasibility exploration (83), and considerable heterogeneity exists among studies. Future multicenter, randomized controlled trials should include primary clinical outcomes such as operation time, intraoperative blood loss, lower limb alignment, and prosthesis position, as well as secondary outcomes such as visual analog scale scores, range of motion, functional scores, and gait analysis. Meanwhile, as science and technology continue to advance, reference should be made to robot safety standards such as ISO 13482 to standardize the clinical application of surgical robots and augmented reality (84, 85).
Author contributions
XZ: Resources, Writing – review & editing. YL: Writing – original draft. WL: Writing – original draft. YG: Supervision, Writing – review & editing.
Funding
The author(s) declare that no financial support was received for the research and/or publication of this article.
Conflict of interest
The authors declare that this study was conducted in the absence of any commercial or financial relationships that could be construed as potential conflicts of interest.
Generative AI statement
The author(s) declare that no Generative AI was used in the creation of this manuscript.
Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
1. Delaney CP, Chang E, Senagore AJ, Broder M. Clinical outcomes and resource utilization associated with laparoscopic and open colectomy using a large national database. Ann Surg. (2008) 247(5):819–24. doi: 10.1097/SLA.0b013e31816d950e
2. Kim H, Grunditz JI, Meath TH, Quinones AR. Accuracy of hospital discharge codes in medicare claims for knee and hip replacement patients. Med Care. (2020) 58(5):491–5. doi: 10.1097/MLR.0000000000001290
3. Zhao B, Lei L, Xu L, Li S, Hu Y, Zhang J, et al. Needle deflection modeling and preoperative trajectory planning during insertion into multilayered tissues. IEEE/ASME Trans Mechatron. (2021) 26(2):943–54. doi: 10.1109/TMECH.2020.3013708
4. Kim H, Jung W, Chang B, Lee C, Kang K, Yeom JS. A prospective, randomized, controlled trial of robot-assisted vs freehand pedicle screw fixation in spine surgery. Int J Med Robot Comput Assist Surg. (2017) 13(3):e1779. doi: 10.1002/rcs.1779
5. Chenin L, Capel C, Fichten A, Peltier J, Lefranc M. Evaluation of screw placement accuracy in circumferential lumbar arthrodesis using robotic assistance and intraoperative flat-panel computed tomography. World Neurosurg. (2017) 105:86–94. doi: 10.1016/j.wneu.2017.05.118
6. Cho HS, Park I-H, Jeon I-H, Kim Y-G, Han I, Kim H-S. Direct application of MR images to computer-assisted bone tumor surgery. J Orthop Sci. (2011) 16(2):190–5. doi: 10.1007/s00776-011-0035-5
7. Bargar WL. Robots in orthopaedic surgery: past, present, and future. Clin Orthop Relat Res. (2007) 463:31–6. doi: 10.1097/BLO.0b013e318146874f
8. Calanca A, Muradore R, Fiorini P. A review of algorithms for compliant control of stiff and fixed-compliance robots. IEEE/ASME Trans Mechatron. (2016) 21(2):613–24. doi: 10.1109/TMECH.2015.2465849
9. Raiola G, Cardenas CA, Tadele TS, de Vries T, Stramigioli S. Development of a safety-and energy-aware impedance controller for collaborative robots. IEEE Robot Autom Lett. (2018) 3(2):1237–44. doi: 10.1109/LRA.2018.2795639
10. Gallagher W, Gao D, Ueda J. Improved stability of haptic human-robot interfaces using measurement of human arm stiffness. Adv Robot. (2014) 28(13):869–82. doi: 10.1080/01691864.2014.900162
11. Diana M, Soler L, Agnus V, D’Urso A, Vix M, Dallemagne B, et al. Prospective evaluation of precision multimodal gallbladder surgery navigation: virtual reality, near-infrared fluorescence, and x-ray-based intraoperative cholangiography. Ann Surg. (2017) 266(5):890–7. doi: 10.1097/SLA.0000000000002400
12. Li CY, Lu ZD, Xie K, Sun HF, Tao L, Gao LG, et al. Research on the application of augmented reality in medical field. China Med Devices. (2020) 35(9):165–8. doi: 10.3969/j.issn.1674-1633.2020.09.038
13. Brigham TJ. Reality check: basics of augmented, virtual, and mixed reality. Med Ref Serv Q. (2017) 36(2):171–8. doi: 10.1080/02763869.2017.1293987
14. Hossain MS, Hardy S, Alamri A, Alelaiwi A, Hardy V, Wilhelm C. AR-based serious game framework for post-stroke rehabilitation. Multimed Syst. (2016) 22(6):659–74. doi: 10.1007/s00530-015-0481-6
16. Wang YT, Chen ZC, Li HC, Cao ZY, Luo HY, Zhang TX, et al. MoveVR: enabling multiform force feedback in virtual reality using household cleaning robot. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (2020). p. 1–12
17. Nilsson NC, Peck T, Bruder G, Hodgson E, Serafin S, Whitton M, et al. 15 Years of research on redirected walking in immersive virtual environments. IEEE Comput Graph Appl. (2018) 38(2):44–56. doi: 10.1109/MCG.2018.111125628
18. Burova A, Mäkelä J, Hakulinen J, Keskinen T, Heinonen H, Siltanen S, et al. Utilizing VR and gaze tracking to develop AR solutions for industrial maintenance. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (2020). p. 1–13
19. Piumsomboon T, Lee G, Lindeman RW, Billinghurst M. Exploring natural eye-gaze-based interaction for immersive virtual reality. 2017 IEEE Symposium on 3D User Interfaces (3DUI). IEEE (2017). p. 36–9
20. Kumar YR, Babu AV, Kumar KAN, Alex JSR. Modified Viterbi decoder for HMM based speech recognition system. International Conference on Control. IEEE (2014).
21. Eric TPS, Idagene AC. Command interface and driving strategy for a voice activated endoscope positioning arm. 5th IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics. Sao Paulo (2014). p. 66–9
22. Zinchenko K, Wu CY, Song KT. A study on speech recognition control for a surgical robot. IEEE Trans Industr Inform. (2017) 13(2):607–15. doi: 10.1109/TII.2016.2625818
23. Wang M, Callaghan V, Bernhardt J, White K, Peña-Rios A. Augmented reality in education and training: pedagogical approaches and illustrative case studies. J Ambient Intell Humaniz Comput. (2018) 9(5):1391–402. doi: 10.1007/s12652-017-0547-8
24. Sanna A, Manuri F, Lamberti F, Paravati G, Pezzolla P. Using handheld devices to support augmented reality-based maintenance and assembly tasks. IEEE International Conference on Consumer Electronics (2015).
25. Wang L, Xiong Y, Wang Z, Qiao Y, Lin DH, Tang XO, et al. Temporal segment networks: towards good practices for deep action recognition. European Conference on Computer Vision. Cham: Springer (2016). p. 20–36
26. Deib G, Johnson A, Unberath M, Yu K, Andress S, Qian L, et al. Image guided percutaneous spine procedures using an optical see-through head mounted display: proof of concept and rationale. J Neurointerv Surg. (2018) 10(12):1187–91. doi: 10.1136/neurintsurg-2017-013649
27. Wang QH, Zhao KH, Song GL, Zhao YW, Zhao XG. A navigation system for minimally invasive spinal surgery based on augmented reality. Robot. (2023) 45(05):546–53. doi: 10.13973/j.cnki.robot.220300
28. Fotouhi J, Alexander CP, Unberath M, Taylor G, Lee SC, Fuerst B, et al. Plan in 2D, execute in 3D: an augmented reality solution for cup placement in total hip arthroplasty. J Med Imaging (Bellingham). (2018) 5(2):021205. doi: 10.1117/1.JMI.5.2.021205
29. Iwakiri K, Kobayashi A, Ohta Y, Takaoka K. Efficacy of the anatomical-pelvic plane positioner in total hip arthroplasty in the lateral decubitus position. J Arthroplasty. (2017) 32:1520–4. doi: 10.1016/j.arth.2016.11.048
30. Tsukamoto M, Kawasaki M, Suzuki H, Fujitani T, Sakai A. Proposal of accurate cup placement procedure during total hip arthroplasty based on pelvic tilt discrepancies in the lateral position. Sci Rep. (2021) 11:13870. doi: 10.1038/s41598-021-93418-y
31. Leal J, Cullen MM, Bolognesi MP, Wellman SS, Ryan SP. Mixed reality navigation in hip fusion conversion: a novel utilization of advanced technology: a case report. JBJS Case Connect. (2024) 14(2):e24. doi: 10.2106/JBJS.CC.24.00128
32. Su S, Wang R, Chen Z, Zhou F, Zhang Y. Augmented reality-assisted versus conventional total hip arthroplasty: a systematic review and meta-analysis. J Orthop Surg Res. (2023) 18(1):920. doi: 10.1186/s13018-023-04421-0
33. Singh V, Realyvasquez J, Simcox T, Rozell JC, Schwarzkopf R, Davidovitch RI. Robotics versus navigation versus conventional total hip arthroplasty: does the use of technology yield superior outcomes? J Arthroplasty. (2021) 36(8):2801–7. doi: 10.1016/j.arth.2021.02.074
34. Wasterlain AS, Buza JA, Thakkar SC, Schwarzkopf R, Vigdorchik J. Navigation and robotics in total hip arthroplasty. JBJS Rev. (2017) 5(3):e2. doi: 10.2106/JBJS.RVW.16.00046
35. Vermue H, Luyckx T, Winnock de Grave P, Ryckaert A, Cools A, Himpe N, et al. Robot-assisted total knee arthroplasty is associated with a learning curve for surgical time but not for component alignment, limb alignment and gap balancing. Knee Surg Sports Traumatol Arthrosc. (2022) 30(2):593–602. doi: 10.1007/s00167-020-06341-6
36. Kayani B, Konan S, Huq SS, Tahmassebi J, Haddad FS. Robotic-arm assisted total knee arthroplasty has a learning curve of seven cases for integration into the surgical workflow but no learning curve effect for accuracy of implant positioning. Knee Surg Sports Traumatol Arthrosc. (2019) 27(4):1132–41. doi: 10.1007/s00167-018-5138-5
37. Joskowicz L, Hazan EJ. Computer aided orthopaedic surgery: incremental shift or paradigm change? Med Image Anal. (2016) 33:84–90. doi: 10.1016/j.media.2016.06.036
38. Jin G, Fan Y, Jiang L, Chen Z, Wang C. MAKO robot-assisted total knee arthroplasty cannot reduce the aggravation of ankle varus incongruence after genu varus correction ≥10°: a radiographic assessment. BMC Musculoskelet Disord. (2023) 24(1):492. doi: 10.1186/s12891-023-06597-2
39. Strazdas D, Hintz J, Khalifa A, Abdelrahman AA, Hempel T, Al-Hamadi A. Robot system assistant (RoSA): towards intuitive multi-modal and multi-device human-robot interaction. Sensors. (2022) 22(3):923. doi: 10.3390/s22030923
40. Rossi SMP, Sangaletti R, Perticarini L, Terragnoli F, Benazzo F. High accuracy of a new robotically assisted technique for total knee arthroplasty: an in vivo study. Knee Surg Sports Traumatol Arthrosc. (2023) 31:1153. doi: 10.1007/s00167-021-06800-8
41. Hampp EL, Chughtai M, Scholl LY, Sodhi N, Bhowmik-Stoker M, Jacofsky DJ, et al. Robotic-arm assisted total knee arthroplasty demonstrated greater accuracy and precision to plan compared with manual techniques. J Knee Surg. (2019) 32:239. doi: 10.1055/s-0038-1641729
42. Li Z, Chen X, Wang X, Zhang B, Wang W, Fan Y, et al. HURWA robotic-assisted total knee arthroplasty improves component positioning and alignment-A prospective randomized and multicenter study. J Orthop Translat. (2022) 33:31–40. doi: 10.1016/j.jot.2021.12.004
43. Xia R, Zhai Z, Zhang J, Yu D, Wang L, Mao Y, et al. Verification and clinical translation of a newly designed “skywalker” robot for total knee arthroplasty: a prospective clinical study. J Orthop Translat. (2021) 29:143–51. doi: 10.1016/j.jot.2021.05.006
44. Pokhrel S, Alsadoon A, Prasad PWC, Paul M. A novel augmented reality (AR) scheme for knee replacement surgery by considering cutting error accuracy. Int J Med Robot. (2019) 15(1):e1958. doi: 10.1002/rcs.1958
45. Fushima K, Kobayashi M. Mixed-reality simulation for orthognathic surgery (2016). Available online at: https://link.springer.com/article/10.1186/s40902-016-0059-z (Accessed November 13, 2016).
46. Tsukada S, Ogawa H, Hirasawa N, Nishino M, Aoyama H, Kurosaka K. Augmented reality- vs accelerometer-based portable navigation system to improve the accuracy of acetabular cup placement during total hip arthroplasty in the lateral decubitus position. J Arthroplasty. (2022) 37(3):488–94. doi: 10.1016/j.arth.2021.11.004
47. Ogawa H, Kurosaka K, Sato A, Hirasawa N, Matsubara M, Tsukada S. Does an augmented reality-based portable navigation system improve the accuracy of acetabular component orientation during THA? A RCT. Clin Orthop Relat Res. (2020) 478:935–43. doi: 10.1097/CORR.0000000000001083
48. Ogawa H, Hasegawa S, Tsukada S, Matsubara M. A pilot study of augmented reality technology applied to the acetabular cup placement during total hip arthroplasty. J Arthroplasty. (2018) 33(6):1833–7. doi: 10.1016/j.arth.2018.01.067
49. Tanino H, Mitsutake R, Takagi K, Ito H. Does a commercially available augmented reality-based portable hip navigation system improve cup positioning during THA compared with the conventional technique? A randomized controlled study. Clin Orthop Relat Res. (2024) 482(3):458–67. doi: 10.1097/CORR.0000000000002819
50. Yuan M, Shi X, Su Q, Wan X, Zhou Z. A prospective randomized controlled trial on the short-term effectiveness of domestic robot-assisted total knee arthroplasty. Chin J Reparative Reconstr Surg. (2021) 35(10):1251–8. doi: 10.7507/1002-1892.202106047
51. Kayani B, Konan S, Tahmassebi J, Pietrzak JRT, Haddad FS. Robotic-arm assisted total knee arthroplasty is associated with improved early functional recovery and reduced time to hospital discharge compared with conventional jig-based total knee arthroplasty: a prospective cohort study. Bone Joint J. (2018) 100-B(7):930–7. doi: 10.1302/0301-620X.100B7.BJJ-2017-1449.R1
52. Liow MHL, Goh GS, Wong MK, Chin PL, Tay DK, Yeo SJ. Robotic-assisted total knee arthroplasty may lead to improvement in quality-of-life measures: a 2-year follow-up of a prospective randomized trial. Knee Surg Sports Traumatol Arthrosc. (2017) 25(9):2942–51. doi: 10.1007/s00167-016-4076-3
53. Zhang J, Ndou WS, Ng N, Gaston P, Simpson PM, Macpherson GJ, et al. Robotic-arm assisted total knee arthroplasty is associated with improved accuracy and patient reported outcomes: a systematic review and meta-analysis. Knee Surg Sports Traumatol Arthrosc. (2022) 30(8):2677–95. doi: 10.1007/s00167-021-06464-4
54. Kort N, Stirling P, Pilot P, Müller JH. Robot-assisted knee arthroplasty improves component positioning and alignment, but results are inconclusive on whether it improves clinical scores or reduces complications and revisions: a systematic overview of meta-analyses. Knee Surg Sports Traumatol Arthrosc. (2022) 30(8):2639–53. doi: 10.1007/s00167-021-06472-4
55. Castellarin G, Bori E, Barbieux E, Grandjean VP, Jost G, Innocenti B. Is total knee arthroplasty surgical performance enhanced using augmented reality? A single-center study on 76 consecutive patients. J Arthroplasty. (2024) 39(2):332–5. doi: 10.1016/j.arth.2023.08.013
56. Wu J-R, Wang M-L, Liu K-C, Hu M-H, Lee P-Y. Real-time advanced spinal surgery via visible patient model and augmented reality system. Comput Methods Programs Biomed. (2014) 113(3):869–81. doi: 10.1016/j.cmpb.2013.12.021
57. Elmi-Terander A, Burström G, Nachabe R, Skulason H, Pedersen K, Fagerlund M, et al. Pedicle screw placement using augmented reality surgical navigation with intraoperative 3D imaging: a first in-human prospective cohort study. Spine. (2019) 44(7):517–25. doi: 10.1097/BRS.0000000000002876
58. Ossendorf C, Fuchs B, Koch P. Femoral stress fracture after computer navigated total knee arthroplasty. Knee. (2006) 13:397–9. doi: 10.1016/j.knee.2006.06.002
59. Bonutti P, Dethmers D, Stiehl JB. Case report: femoral shaft fracture resulting from femoral tracker placement in navigated TKA. Clin Orthop Relat Res. (2008) 466:1499–502. doi: 10.1007/s11999-008-0150-6
60. Hu X, Liu H, Baena FRY. Markerless navigation system for orthopaedic knee surgery: a proof of concept study. IEEE Access. (2021) 9:64708–18. doi: 10.1109/ACCESS.2021.3075628
61. Beringer DC, Patel JJ, Bozic KJ. An overview of economic issues in computer-assisted total joint arthroplasty. Clin Orthop Relat Res. (2007) 463:26–30. doi: 10.1097/BLO.0b013e318154addd
62. Schulz AP, Seide K, Queitsch C, von Haugwitz A, Meiners J, Kienast B, et al. Results of total hip replacement using the robodoc surgical assistant system: clinical outcome and evaluation of complications for 97 procedures. Int J Med Robot Comput Assist Surg. (2007) 3(4):301–6. doi: 10.1002/rcs.161
63. Chen Y, Medioni G. Object modeling by registration of multiple range images. Image Vis Comput. (1992) 10(3):145–55. doi: 10.1016/0262-8856(92)90066-C
64. Liu H, Baena FRY. Automatic markerless registration and tracking of the bone for computer-assisted orthopaedic surgery. IEEE Access. (2020) 8:42010–20. doi: 10.1109/ACCESS.2020.2977072
65. Rodrigues P, Antunes M, Raposo C, Marques P, Fonseca F, Barreto JP. Deep segmentation leverages geometric pose estimation in computer-aided total knee arthroplasty. Healthcare Technol Lett. (2019) 6(6):226–30. doi: 10.1049/htl.2019.0078
66. Kantelhardt SR, Gutenberg A, Neulen A, Keric N, Renovanz M, Giese A. Video-assisted navigation for adjustment of image-guidance accuracy to slight brain shift. Oper Neurosurg. (2015) 11:504–11. doi: 10.1227/NEU.0000000000000921
67. Stieglitz LH, Raabe A, Beck J. Simple accuracy enhancing techniques in neuronavigation. World Neurosurg. (2015) 84:580–4. doi: 10.1016/j.wneu.2015.03.025
68. Baek JH, Lee SC, Kim JH, Ahn HS, Nam CH. Distal femoral tracker pin placement prevents delayed pin tract-induced fracture in robotic-assisted total knee arthroplasty: results of minimum 1-year follow-up. J Knee Surg. (2023) 36(10):1102–4. doi: 10.1055/s-0042-1749605
69. Sires JD, Craik JD, Wilson CJ. Accuracy of bone resection in MAKO total knee robotic-assisted surgery. J Knee Surg. (2021) 34(7):745–8. doi: 10.1055/s-0039-1700570
70. Cool CL, Jacofsky DJ, Seeger KA, Sodhi N, Mont MA. A 90-day episode-of-care cost analysis of robotic-arm assisted total knee arthroplasty. J Comp Eff Res. (2019) 8:327–36. doi: 10.2217/cer-2018-0136
71. Vermue H, Tack P, Gryson T, Victor J. Can robot-assisted total knee arthroplasty be a cost-effective procedure? A Markov decision analysis. Knee. (2021) 29:345–52. doi: 10.1016/j.knee.2021.02.004
72. Kim YH, Yoon SH, Park JW. Does robotic-assisted TKA result in better outcome scores or long-term survivorship than conventional TKA? A randomized, controlled trial. Clin Orthop Relat Res. (2020) 478(2):266–75. doi: 10.1097/CORR.0000000000000916
73. Cavinatto L, Bronson MJ, Chen DD, Moucha CS. Robotic-assisted versus standard unicompartmental knee arthroplasty-evaluation of manuscript conflict of interests, funding, scientific quality and bibliometrics. Int Orthop. (2019) 43(8):1865–71. doi: 10.1007/s00264-018-4175-5
74. Mahure SA, Teo GM, Kissin YD, Stulberg BN, Kreuzer S, Long WJ. Learning curve for active robotic total knee arthroplasty. Knee Surg Sports Traumatol Arthrosc. (2022) 30(8):2666–76. doi: 10.1007/s00167-021-06452-8
75. Bouché PA, Corsia S, Dechartres A, Resche-Rigon M, Nizard R. Are there differences in accuracy or outcomes scores among navigated, robotic, patient-specific instruments or standard cutting guides in TKA? A network meta-analysis. Clin Orthop Relat Res. (2020) 478(9):2105–16. doi: 10.1097/CORR.0000000000001324
76. Naziri Q, Cusson BC, Chaudhri M, Shah NV, Sastry A. Making the transition from traditional to robotic-arm assisted TKA: what to expect? A single-surgeon comparative-analysis of the first-40 consecutive cases. J Orthop. (2019) 16(4):364–8. doi: 10.1016/j.jor.2019.03.010
77. Bolam SM, Tay ML, Zaidi F, Sidaginamale RP, Hanlon M, Munro JT, et al. Introduction of ROSA robotic-arm system for total knee arthroplasty is associated with a minimal learning curve for operative time. J Exp Orthop. (2022) 9(1):86. doi: 10.1186/s40634-022-00524-5
78. Zhang G, Bartels J, Martin-Gomez A, Armand M. Towards reducing visual workload in surgical navigation: proof-of-concept of an augmented reality haptic guidance system. Comput Methods Biomech Biomed Eng Imaging Vis. (2023) 11(4):1073–80. doi: 10.1080/21681163.2022.2152372
79. Mur-Artal R, Tardos JD. ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Trans Robot. (2017) 33(5):1255–62. doi: 10.1109/TRO.2017.2705103
80. Bescos B, Facil JM, Civera J, Neira J. DynaSLAM: tracking, mapping, and inpainting in dynamic scenes. IEEE Robot Autom Lett. (2018) 3(4):4076–83. doi: 10.1109/LRA.2018.2860039
81. Liao HF, Lin WA, Zhang JR, Zhang JD, Luo JB, Zhou SK. Multiview 2D/3D rigid registration via a point-of-interest network for tracking and triangulation. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (2019). p. 12638–47
82. Balakrishnan G, Zhao A, Sabuncu MR, Guttag J, Dalca AV. VoxelMorph: a learning framework for deformable medical image registration. IEEE Trans Med Imaging. (2019) 38(8):1788–800. doi: 10.1109/TMI.2019.2897538
83. Meng C, Wang Q, Guan S, Sun K, Liu B. 2D-3D registration with weighted local mutual information in vascular interventions. IEEE Access. (2019) 7:162629–38. doi: 10.1109/ACCESS.2019.2905345
84. Chen YF, Li Y, Li X, Chen Y, Zou Q, Kong YR, et al. Analysis on the characteristics of virtual reality, augmented reality and mixed reality related clinical trials registered in Chinese clinical trial registry. Chongqing Med. (2022) 51(4):697–701. doi: 10.3969/j.issn.1671-8348.2022.04.031
Keywords: total hip replacement, total knee replacement, augmented reality, surgical robot, operative navigation
Citation: Zhang X, Liu Y, Luo W and Guo Y (2025) Application of augmented reality and surgical robotic navigation in total hip and knee replacement. Front. Surg. 12:1591756. doi: 10.3389/fsurg.2025.1591756
Received: 12 March 2025; Accepted: 8 July 2025;
Published: 28 July 2025.
Edited by:
Shayan Gholizadeh, Brigham and Women's Hospital and Harvard Medical School, United States
Reviewed by:
Khadeja Alrefaie, Royal College of Surgeons in Ireland, Bahrain
Copyright: © 2025 Zhang, Liu, Luo and Guo. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Yansu Guo, gys188@163.com