
Original Research Article — provisionally accepted; the full text will be published soon.

Front. Robot. AI | doi: 10.3389/frobt.2019.00106

Help! I Need a Remote Guide in my Mixed Reality Collaborative Environment

Morgan Le Chénéchal1*,  Thierry Duval2, 3,  Valérie Gouranton1, 4, 5,  Jérôme Royan1 and Bruno Arnaldi1, 4, 5
  • 1IRT b<>com, France
  • 2IMT Atlantique Bretagne-Pays de la Loire, France
  • 3UMR6285 Laboratoire des Sciences et Techniques de l'Information, de la Communication et de la Connaissance (LAB-STICC), France
  • 4Institut national des sciences appliquées de Rennes, France
  • 5UMR6074 Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), France

The help of a remote expert when performing a maintenance task can be valuable in many situations, saving both time and money. In this context, augmented reality (AR) technologies can improve remote guidance thanks to the direct overlay of 3D information onto the real world. Furthermore, virtual reality (VR) enables a remote expert to virtually share the place in which the physical maintenance is being carried out. In traditional local collaboration, collaborators are face-to-face and observe the same artifact, while being able to communicate verbally and use body language such as gaze direction or facial expression. These interpersonal communication cues are usually limited in remote collaborative maintenance scenarios, in which the agent uses an AR setup while the remote expert uses VR. Providing users with adapted interaction and awareness features to compensate for the lack of these essential communication signals is, therefore, a real challenge for remote mixed reality (MR) collaboration. However, this context also offers new opportunities for augmenting collaborative abilities, such as sharing an identical point of view, which is not possible in real life. Depending on the current task of the maintenance procedure, such as navigating to the correct location or physically manipulating a component, the remote expert may choose to freely control his/her own viewpoint of the distant workspace, or may instead need to share the viewpoint of the agent in order to better understand the current situation. In this work, we first focus on the navigation task, which is essential to complete the diagnostic phase and to begin the maintenance task in the correct location. We then present a novel interaction paradigm, implemented in an early prototype, in which the guide can show the operator the manipulation gestures required to achieve a physical task that is necessary to perform the maintenance procedure. These concepts are evaluated, allowing us to provide guidelines for future systems targeting efficient remote collaboration in MR environments.

Keywords: collaborative virtual environment (CVE), augmented/mixed reality, awareness of collaboration, virtual reality, 3D user interaction

Received: 28 Sep 2018; Accepted: 10 Oct 2019.

Copyright: © 2019 Le Chénéchal, Duval, Gouranton, Royan and Arnaldi. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Dr. Morgan Le Chénéchal, IRT b<>com, Cesson-Sévigné, France, morgan.le.chenechal@gmail.com