
ORIGINAL RESEARCH article

Front. Virtual Real.

Sec. Haptics

This article is part of the Research Topic: Computational Multimodal Sensing and Perception for Robotic Systems

"Did she just tap me?": Qualifying Multisensory Feedback for Social Touch during Human-Agent Interaction in Virtual Reality

Provisionally accepted
Grégoire Richard1,2*, Fabien Boucaud3,4, Catherine Pelachaud3,4,5, Indira Thouvenin2,3
  • 1Heuristics and Diagnostics for Complex Systems, University of Technology Compiègne, Compiègne, France
  • 2CNRS Delegation - Hauts-de-France, Lille, France
  • 3Institut des Systèmes Intelligents et de Robotique, Paris, France
  • 4CNRS Delegation Paris B, Paris, France
  • 5Sorbonne Université, Paris, France

The final, formatted version of the article will be published soon.

Social touch is one of the many modalities of social communication. It can take many forms (from handshakes to hugs) and can convey a wide variety of emotions and intentions. Social touch is also multimodal: it comprises not only haptic feedback (both tactile and kinesthetic) but also visual feedback (gestures) and even audio feedback (the sound of the hand moving on the body). Virtual agents (VAs) can perceive and interact with users, making use of multimodality to express attitudes and complex emotions. Despite growing interest in haptic interactions within immersive virtual environments (IVEs), few studies have investigated how to integrate touch into VAs' social abilities. While prior work has examined haptic feedback, auditory substitution, or agent-based touch in isolation, no study has systematically disentangled the individual and combined contributions of visual, auditory, and tactile cues to the perception of being socially touched by a virtual agent. To address this gap, we conducted three experiments that progressively isolate and combine modalities, revealing how each shapes touch recognition, believability, and agency attribution. Taken together, our results show that multisensory feedback improves the experience of social touch, in line with previous research in IVEs. They also show that visual feedback contributes most to guiding recognition of the stimulus, while audio and tactile feedback further help disambiguate the touch gesture in particular cases. The results also show that both the level of animation and the interpersonal distance are essential to how strongly the VA is perceived as a social agent when initiating touch. Finally, although visual and tactile feedback are the main contributors to participants feeling touched by the VA, audio feedback also has a significant impact.

Keywords: virtual reality, haptic feedback, social touch, human-agent interaction, tactile & visuo-tactile, tactile & audio-tactile

Received: 03 Oct 2025; Accepted: 27 Nov 2025.

Copyright: © 2025 Richard, Boucaud, Pelachaud and Thouvenin. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Grégoire Richard

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.