
Brief Research Report

Front. Hum. Neurosci.

Sec. Brain-Computer Interfaces

This article is part of the Research Topic: Brain-Computer Interfaces (BCIs) for daily activities: Innovations in EEG signal analysis and machine learning approaches

NeuroGaze: A Hybrid EEG and Eye-Tracking Brain-Computer Interface for Hands-Free Interaction in Virtual Reality

Provisionally accepted
Kyle Coutray*, Wanyea Barbel, Zack Groth, Joseph J. LaViola Jr*
  • University of Central Florida, Orlando, United States

The final, formatted version of the article will be published soon.

Brain-Computer Interfaces (BCIs) have traditionally been studied in clinical and laboratory contexts, but the rise of consumer-grade devices now allows exploration of their use in daily activities. Virtual reality (VR) provides a particularly relevant domain, where existing input methods often force trade-offs between speed, accuracy, and physical effort. This study introduces NeuroGaze, a hybrid interface combining electroencephalography (EEG) with eye tracking to enable hands-free interaction in immersive VR. Twenty participants completed a 360° cube-selection task using three different input methods: VR controllers, gaze combined with a pinch gesture, and NeuroGaze. Performance was measured by task completion time and error rate, while workload was evaluated using the NASA Task Load Index (NASA-TLX). NeuroGaze successfully supported target selection with off-the-shelf hardware, producing fewer errors than the alternative methods but requiring longer completion times, reflecting a classic speed-accuracy trade-off. Workload analysis indicated reduced physical demand for NeuroGaze compared to controllers, though overall ratings and user preferences were mixed. While the differing confirmation pipelines limit direct comparison of throughput metrics, NeuroGaze is positioned as a feasibility study illustrating trade-offs between speed, accuracy, and accessibility. It highlights the potential of consumer-grade BCIs for long-duration use and emphasizes the need for improved EEG signal processing and adaptive multimodal integration to enhance future performance.
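The abstract does not detail the confirmation pipeline, but the interaction scheme it describes (gaze nominates a target, an EEG-derived signal commits the selection) can be sketched generically. The following minimal Python sketch is illustrative only, not the authors' implementation: the device readouts are simulated, and the target names, thresholds, and dwell parameters are hypothetical placeholders.

```python
# Illustrative hybrid gaze + EEG selection loop (not the NeuroGaze implementation).
# Gaze nominates the target; a simulated EEG confirmation score commits it.
import random
import time
from dataclasses import dataclass


@dataclass
class Target:
    name: str
    gaze_angle_deg: float  # angular offset between gaze ray and target center


def read_gaze_candidates() -> list[Target]:
    """Stand-in for an eye-tracker query; returns targets with gaze offsets."""
    return [Target(f"cube_{i}", random.uniform(0.0, 10.0)) for i in range(8)]


def read_eeg_confirmation_score() -> float:
    """Stand-in for an EEG classifier output in [0, 1] (selection intent)."""
    return random.random()


GAZE_CONE_DEG = 2.0   # target must lie within this cone of the gaze ray
EEG_THRESHOLD = 0.8   # classifier score required to confirm a selection
DWELL_FRAMES = 3      # consecutive frames gaze must stay on the same target


def selection_loop(max_frames: int = 200) -> str | None:
    dwell_target, dwell_count = None, 0
    for _ in range(max_frames):
        # 1) Gaze nominates the nearest target inside the selection cone.
        candidates = [t for t in read_gaze_candidates()
                      if t.gaze_angle_deg < GAZE_CONE_DEG]
        focused = min(candidates, key=lambda t: t.gaze_angle_deg, default=None)

        # 2) Require a short dwell so transient fixations do not arm the trigger.
        if focused and dwell_target == focused.name:
            dwell_count += 1
        else:
            dwell_target = focused.name if focused else None
            dwell_count = 1 if focused else 0

        # 3) EEG confirmation commits the selection only while a target is armed.
        if dwell_count >= DWELL_FRAMES and read_eeg_confirmation_score() >= EEG_THRESHOLD:
            return dwell_target

        time.sleep(0.01)  # placeholder for the headset's frame pacing
    return None


if __name__ == "__main__":
    selected = selection_loop()
    print(f"Selected: {selected}" if selected else "No selection made.")
```

In such a scheme the speed-accuracy trade-off reported above falls out naturally: requiring both dwell and an EEG confirmation slows each selection but reduces accidental activations relative to a single-trigger input.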

Keywords: brain-computer interface (BCI), electroencephalography (EEG), eye tracking, virtual reality (VR), hybrid interfaces, hands-free interaction, human-computer interaction (HCI), accessibility

Received: 29 Aug 2025; Accepted: 31 Oct 2025.

Copyright: © 2025 Coutray, Barbel, Groth and LaViola Jr. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence:
Kyle Coutray, ky447620@ucf.edu
Joseph J LaViola Jr, jlaviola@ucf.edu

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.