ORIGINAL RESEARCH article
Front. Virtual Real.
Sec. Virtual Reality and Human Behaviour
Volume 6 - 2025 | doi: 10.3389/frvir.2025.1623764
Classifying Interpersonal Interaction in Virtual Reality: Sensor-Based Analysis of Human Interaction with Non-Responsive Avatars
Provisionally accepted
Kyoto University of Advanced Science (KUAS), Kyoto, Japan
This study investigates human engagement with a non-responsive, pre-recorded avatar in VR environments. Rather than bidirectional collaboration, we focus on unidirectional synchrony from human participants toward the avatar and evaluate its detectability with sensor-based machine learning. During a joint Simon task, we covertly switched collaborators between humans and non-responsive avatars (pre-recorded motion replays), enabling a direct comparison of human-human interaction and human interaction with a non-responsive avatar. Using a random forest model trained on VR motion data, we classified interactions into cooperation, conformity, and competition, achieving an F1 score of 0.89; feature importance analysis identified hand rotation and head position as key predictors of interaction states. From the classification model, a synchrony index was derived from the VR motion data to quantify behavioral coordination patterns during joint actions. The classification indexes were associated with higher cooperation in human-human interactions (p = 0.0262) and greater conformity in interactions with the non-responsive avatar (p = 0.0034). The synchrony index was significantly lower in the non-responsive avatar condition (p < 0.001), indicating reduced interpersonal synchrony with non-responsive avatars. These findings demonstrate the feasibility of using VR sensor data and machine learning to quantify social interaction dynamics. The study's aim was exploratory: to assess the feasibility of a sensor-based machine learning model for classifying interpersonal interactions in VR, based on preliminary data from small-sample experiments.
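The classification pipeline described in the abstract can be sketched as follows. This is a minimal illustration only: the paper's actual feature extraction, labels, and preprocessing are not specified here, so the feature layout, synthetic data, and label rule below are assumptions. Only the general approach (a random forest over hand-rotation and head-position features, evaluated with a macro F1 score and inspected via feature importances) follows the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for per-trial VR motion features:
# [hand_rot_x, hand_rot_y, hand_rot_z, head_pos_x, head_pos_y, head_pos_z]
n = 300
X = rng.normal(size=(n, 6))

# Synthetic labels: 0 = cooperation, 1 = conformity, 2 = competition.
# Made weakly dependent on hand rotation and head position so the
# model has signal to learn; the real labels come from the experiment.
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int) + (X[:, 1] > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)

# Macro F1 over the interaction classes (the paper reports 0.89 on its data).
f1 = f1_score(y_te, clf.predict(X_te), average="macro")

# Feature importances indicate which motion channels drive the prediction,
# analogous to identifying hand rotation and head position as key predictors.
importances = clf.feature_importances_
```

A synchrony index could then be derived from the model's per-class probabilities or from correlations between participant and avatar motion channels, but the abstract does not specify that computation, so it is omitted here.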
Keywords: virtual reality, human activity recognition, machine learning, joint Simon effect, interpersonal synchrony
Received: 06 May 2025; Accepted: 19 Aug 2025.
Copyright: © 2025 Arima, Harada and Okada. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Yoshiko Arima, Kyoto University of Advanced Science (KUAS), Kyoto, Japan
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.