About this Research Topic
Eye-tracking devices are increasingly used in military and industrial contexts, such as head-mounted displays for AR/VR, marketing research, gaming, and human factors assessments. However, a challenge that cuts across these fields is generalizing knowledge products and models of eye-linked
cognitive processes, developed in the laboratory, to more complex, real-world environments.
Challenges associated with real-world studies include, but are not limited to, an increase in the complexity and temporal dynamics of sensory stimulation and behavioral events, a reduction in control over visual attention due to free-viewing and less constrained scenarios, and a variety of technical issues
associated with data collection outside the constraints of the laboratory.
There is an ongoing need for theoretical models and algorithmic methods that are robust to these issues and capable of accounting for the influence of non-cognitive factors on pupillary responses. Developing these techniques will enable more accurate and valid measures of cognitive states and unlock
the capability to better leverage eye-tracking data in the wild. We anticipate these developments will lead to various real-world applications, including individually adaptive interfaces, learning systems, and teaming
with intelligent agents via an increased understanding of human states.
This Research Topic aims to explore advances that increase the usability of pupillometry and eye movement data for making inferences about behavior and cognition in real-world contexts or in complex
environments with ecologically valid stimuli (e.g., in VR/AR).
We welcome reviews and submissions related to, but not limited to, the following topics:
- Using eye-tracking data to make inferences about cognitive processes during complex and dynamic real-world tasks, such as navigation scenarios or tasks involving teamwork.
- Preprocessing and algorithmic methods that can account for, or are robust to, non-cognitive influences on pupil size due to eye movements (e.g., blinks, saccades) and/or time-varying stimulus properties (e.g., depth, luminance, size, eccentricity, color spectrum).
- Technical approaches that reduce the impact of increased noise or data loss in complex environments outside the laboratory.
- Novel or improved methods to collect eye-tracking data in constrained (e.g., webcam-based eye tracking) and unconstrained mobile environments (e.g., using wearable eye trackers).
- Using eye-tracking effectively in experiments performed in virtual environments.
- Deploying AI/ML approaches to interpret eye-tracking data and predict states or performance variables from one or multiple participants (e.g., dyad, group, team).
- Integration of multi-modal data that highlights the unique advantages of eye-tracking data in applied contexts (e.g., adaptive interfaces).
- Extended use of eye-tracking technology to collect longitudinal and/or continuous data.
Conflict of Interest Statement: Topic Editors Russell Cohen Hoffing, Steven Thurman, Jonathan Touryan and Josef Faller are employed by US CCDC Army Research Laboratory. The Topic Editors declare the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Keywords: cognitive science, cognition, eye-tracking, pupillometry, BCI, real-world, ecologically valid, data fusion, machine learning, HUD, virtual reality, VR, augmented reality, AR
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.