
ORIGINAL RESEARCH article

Front. Robot. AI

Sec. Human-Robot Interaction

This article is part of the Research Topic: The Role of Communication and Emotion in Human-Robot Interaction: A Psychological Perspective

Evaluating Human Perceptions of Android Robot Facial Expressions Based on Variations in Instruction Styles

Provisionally accepted
Ayaka Fujii1*, Carlos Toshinori Ishi1,2, Kurima Sakai1,2, Tomo Funayama1,2, Ritsuko Iwai1, Yusuke Takahashi1,3, Takatsune Kumada1,4, Takashi Minato1,2
  • 1Guardian Robot Project, RIKEN, Wako, Japan
  • 2Hiroshi Ishiguro Laboratories, Advanced Telecommunications Research Institute International (ATR), Soraku District, Japan
  • 3Graduate School of Education, Kyoto University, Kyoto, Japan
  • 4Graduate School of Informatics, Kyoto University, Kyoto, Japan

The final, formatted version of the article will be published soon.

Robots that interact with humans need to express emotions in ways that are appropriate to the context. Most prior research has focused on basic emotions, yet real-life interactions demand more nuanced expressions. In this study, we extended the expressive capabilities of the android robot Nikola by implementing 63 facial expressions, covering not only complex emotions and physical conditions but also differences in intensity. At Expo 2025 in Japan, more than 600 participants interacted with Nikola by describing situations in which they wanted the robot to perform facial expressions. The robot inferred emotions using a large language model and performed the corresponding facial expressions. Questionnaire responses revealed that participants rated the robot's behavior as more appropriate and more emotionally expressive when their instructions were abstract than when they explicitly named emotions or physical states, suggesting that abstract instructions enhance the robot's perceived agency. We also investigated and discussed how impressions of the robot varied with the expressions it performed and with participants' personality traits. This study contributes to human-robot interaction research by demonstrating how adaptive facial expressions, in combination with instruction styles, shape human perceptions of social robots.
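The pipeline the abstract describes maps a participant's free-text situation description, via a large language model, to one of the robot's predefined facial expressions. Below is a minimal sketch of that idea only; the article does not disclose its model, prompt, or expression labels, so the label list, prompt wording, and use of the OpenAI chat API here are all illustrative assumptions, not the authors' implementation.

    # Minimal sketch (assumptions throughout): map a free-text situation
    # description to one of a fixed set of facial-expression labels via an LLM.
    # The labels, prompt, and model choice below are hypothetical; the paper
    # does not specify them.
    from openai import OpenAI

    # Hypothetical subset of the robot's 63 expression labels.
    EXPRESSIONS = ["joy", "sadness", "anger", "surprise", "embarrassment",
                   "relief", "fatigue", "pain", "neutral"]

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def infer_expression(situation: str) -> str:
        """Ask the LLM which predefined expression best fits the situation."""
        prompt = (
            "A user describes a situation to an android robot. Choose the "
            f"single facial expression that best fits, from: "
            f"{', '.join(EXPRESSIONS)}. Answer with one label only.\n\n"
            f"Situation: {situation}"
        )
        reply = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        label = reply.choices[0].message.content.strip().lower()
        # Fall back to a neutral face if the model answers outside the set.
        return label if label in EXPRESSIONS else "neutral"

    # Example of an "abstract" instruction that names a situation, not an
    # emotion, as in the study's abstract-instruction condition.
    print(infer_expression("You just dropped your ice cream on the sidewalk."))

The returned label would then index into the robot's expression library; constraining the model to a closed label set is one simple way to keep LLM output compatible with a fixed repertoire of facial animations.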

Keywords: human-robot interaction, facial expression, android robot, emotion attribution, social agency

Received: 20 Oct 2025; Accepted: 28 Nov 2025.

Copyright: © 2025 Fujii, Ishi, Sakai, Funayama, Iwai, Takahashi, Kumada and Minato. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Ayaka Fujii

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.