Immersive and Extended Reality technologies have made remarkable progress in recent years, widening their audience and range of applications. Extended Reality (XR) has emerged as an umbrella term encompassing virtual reality (VR), augmented reality (AR), and mixed reality (MR), reflecting the extraordinary power of these technologies to support activities requiring enhanced presence, telepresence, and teleoperation [1,2,3].
Extended reality interfaces can increase user performance in mediated observation, exploration, monitoring, and intervention [4]. XR can also be combined with artificial intelligence methods (including machine and deep learning) to further enhance processing, visualization, and decision-making.
In this Special Issue, you are invited to submit contributions describing novel approaches to, and applications of, extended reality interfaces. Contributions may also address the use of artificial intelligence methods in XR.
Potential topics include but are not limited to the following:
- Immersive and MR interfaces for exploration, monitoring, and intervention
- Immersive/Extended telepresence and teleoperation
- XR in human-machine interaction and multimodal control
- Sensors for XR and multi-sensor stimulation
- AI-assisted monitoring and tele-control
[1] S. Livatino et al., "Intuitive Robot Teleoperation Through Multi-Sensor Informed Mixed Reality Visual Aids," in IEEE Access, vol. 9, pp. 25795-25808, 2021, doi: 10.1109/ACCESS.2021.3057808.
[2] Luo et al., "Monoscopic vs. Stereoscopic Views and Display Types in the Teleoperation of Unmanned Ground Vehicles for Object Avoidance," 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), 2021, pp. 418-425, doi: 10.1109/RO-MAN50785.2021.9515455.
[3] Luo et al., "In-Device Feedback in Immersive Head-Mounted Displays for Distance Perception During Teleoperation of Unmanned Ground Vehicles," in IEEE Transactions on Haptics, vol. 15, no. 1, pp. 79-84, 1 Jan.-March 2022, doi: 10.1109/TOH.2021.3138590.
[4] S. Livatino et al., "Stereoscopic Visualization and 3-D Technologies in Medical Endoscopic Teleoperation," in IEEE Transactions on Industrial Electronics, vol. 62, no. 1, pp. 525-535, Jan. 2015, doi: 10.1109/TIE.2014.2334675.
Keywords:
Extended Reality (XR), Virtual Reality (VR), Mixed Reality (MR), Teleoperation, Telepresence, Interfaces, Mediated Observations
Important Note:
All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.