
MINI REVIEW article

Front. Virtual Real.

Sec. Virtual Reality and Human Behaviour

This article is part of the Research Topic: Exploring Meaningful Extended Reality (XR) Experiences: Psychological, Educational, and Data-Driven Perspectives

Uncovering the Mechanisms of Virtual Reality (VR)-Enhanced Neuroscience Education: The Role of Eye Tracking and Facial Expression Recognition

Provisionally accepted
Xue Deng
  • Division of Occupational Therapy, Decker College of Nursing and Health Sciences, Binghamton University, Binghamton, United States

The final, formatted version of the article will be published soon.

Virtual reality (VR) has demonstrated substantial advantages in neuroscience education, consistently outperforming traditional instructional approaches in enhancing spatial understanding, knowledge retention, and learner engagement. However, despite this robust evidence of effectiveness, most existing studies remain primarily outcome-oriented, focusing on what learners achieve rather than on how learning occurs within immersive VR environments. Consequently, the cognitive and affective mechanisms that mediate VR-enhanced learning outcomes remain underexplored. This mini-review synthesizes current evidence on the integration of biometric sensing technologies, specifically eye tracking and facial expression recognition, in VR-based neuroscience education to elucidate the cognitive and affective processes that underlie learning. We examine how eye tracking provides objective indicators of visual attention and cognitive load, while facial expression analysis captures affective states such as curiosity and frustration. Integrating these multimodal data streams offers a holistic framework for understanding the interplay between immersion, attention, and emotion in knowledge acquisition. Furthermore, we discuss significant technical and ethical challenges, including data synchronization, privacy, and measurement reliability. Finally, we outline future directions, emphasizing the potential for artificial intelligence (AI) to create adaptive VR learning systems that respond in real time to learners' biometric signals. This approach lays the groundwork for more mechanism-informed and evidence-aligned design of adaptive XR learning environments.

Keywords: affective learning, biometric sensing, cognitive load, eye tracking, facial expression recognition, immersive learning, neuroscience education, virtual reality

Received: 07 Nov 2025; Accepted: 03 Feb 2026.

Copyright: © 2026 Deng. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Xue Deng

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.