ORIGINAL RESEARCH article
Front. Virtual Real.
Sec. Virtual Reality and Human Behaviour
Volume 6 - 2025 | doi: 10.3389/frvir.2025.1567854
AffectTracker: Real-time continuous rating of affective experience in immersive Virtual Reality
Provisionally accepted
- 1 Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- 2 Max Planck School of Cognition, Leipzig, Saxony, Germany
- 3 Max Planck Dahlem Campus of Cognition, Max Planck Society, Berlin, Germany
- 4 Charité University Medicine Berlin, Berlin, Germany
- 5 Department of Clinical and Biological Sciences, University of Turin, Turin, Piedmont, Italy
Subjective experience is key to understanding affective states, which are characterized by valence and arousal. Traditional experiments using post-stimulus summary ratings do not resemble natural behavior. Fluctuations of affective states can be explored with dynamic stimuli, such as videos. Continuous ratings can capture moment-to-moment affective experience; however, the act of rating or the accompanying feedback can interfere with the experience itself.

We designed, empirically evaluated, and openly share AffectTracker, a tool to collect continuous ratings of two-dimensional affective experience (valence and arousal) during dynamic stimulation, such as 360-degree videos in immersive virtual reality. AffectTracker comprises three customizable feedback options: a simplified affect grid (Grid), an abstract pulsating variant (Flubber), and no visual feedback.

Two studies with healthy adults were conducted, each at two sites (Berlin, Germany, and Torino, Italy). In Study 1 (Selection; n=51), both Grid and Flubber demonstrated high user experience and low interference during repeated 1-min 360-degree videos. Study 2 (Evaluation; n=82) confirmed these findings for Flubber with a longer (23-min), more varied immersive experience, again showing high user experience and low interference.

Continuous ratings collected with AffectTracker effectively captured valence and arousal variability. For shorter, less eventful stimuli, their correlation with post-stimulus summary ratings demonstrated the tool's validity; for longer, more eventful stimuli, continuous ratings captured additional variance beyond the summary ratings.

Our findings suggest that AffectTracker provides a reliable, minimally interfering method to gather moment-to-moment affective experience in immersive environments, offering new research opportunities to link affective states and physiological dynamics.
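To illustrate the general approach summarized above (continuous, time-stamped two-dimensional ratings sampled during stimulation), the following minimal Python sketch shows how valence and arousal values might be read from a controller thumbstick at a fixed rate and logged for later alignment with physiological data. This is purely illustrative and hypothetical; it is not the authors' implementation, and all function names, parameters, and the sampling rate are assumptions.

    import csv
    import math
    import time

    def read_thumbstick():
        # Hypothetical placeholder for the VR controller input;
        # returns (x, y) in [-1, 1], mapped here to (valence, arousal).
        t = time.time()
        return math.sin(t), math.cos(t)  # simulated signal for illustration

    def record_continuous_ratings(duration_s=60.0, rate_hz=50.0, out_path="ratings.csv"):
        # Sample the 2D rating at a fixed rate and write time-stamped rows,
        # so the trace can later be aligned with physiological recordings.
        interval = 1.0 / rate_hz
        start = time.time()
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["time_s", "valence", "arousal"])
            while time.time() - start < duration_s:
                valence, arousal = read_thumbstick()
                writer.writerow([round(time.time() - start, 4), valence, arousal])
                time.sleep(interval)

    if __name__ == "__main__":
        # Record a short 5-second demo trace.
        record_continuous_ratings(duration_s=5.0)

In the actual tool, such a trace would be collected inside the VR application alongside the chosen feedback variant (Grid, Flubber, or no visual feedback); see the article and the openly shared tool for implementation details.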
Keywords: affective states, emotion, virtual reality, dynamics, moment-to-moment, self-reports
Received: 28 Jan 2025; Accepted: 26 Aug 2025.
Copyright: © 2025 Fourcade, Malandrone, Roellecke, Ciston, de Mooij, Villringer, Carletto and Gaebler. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Antonin Fourcade, Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
Francesca Malandrone, Department of Clinical and Biological Sciences, University of Turin, Turin, 10124, Piedmont, Italy
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.