ORIGINAL RESEARCH article
Front. Organ. Psychol.
Sec. Performance and Development
Volume 3 - 2025 | doi: 10.3389/forgp.2025.1669782
This article is part of the Research Topic "Affective and Behavioral Dynamics in Human-Technology Interactions of Industry 5.0".
How to Achieve Human-Centered Automation: The Importance of Trust for Safety-critical Behavior and Intention to Use in Human-Robot Collaboration
Provisionally accepted. Chair for Ergonomics and Innovation, Technische Universität Chemnitz, Chemnitz, Germany
Recent technological advances in human-robot collaboration (HRC) allow for increased efficiency and flexibility of production in Industry 5.0 while providing a safe workspace. Despite objective safety, research has shown that subjective trust in robots shapes how humans interact with them. While the antecedents of trust have been examined broadly, empirical HRC studies investigating the relationship between trust and industry-relevant outcomes are scarce, and the precise effects of trust remain unclear. To advance human-centered automation, this paper investigates the affective, cognitive, and behavioral consequences of trust in robots and explores whether trust mediates the relationship between industry-relevant characteristics and human-centered HRC outcomes. In a pseudo-real-world test environment, 48 participants performed a manufacturing task in collaboration with a heavy-load robot. Trust, affective experience over time, intention to use, and safety-critical behavior were examined. A 2x2x2 mixed design varied the availability of feedback, time pressure, and system failures, each expected to affect the level of trust. In the control group, trust remained consistently high across all conditions. System failures and feedback significantly reduced trust, whereas time pressure had no effect. System failures further increased negative affective experience, while feedback reduced safety-critical behavior. Trust was unrelated to affective experience but positively related to safety-critical behavior and intention to use. The relationship between feedback and both safety-critical behavior and intention to use was significantly mediated by trust. Highly relevant for implementation, the control group showed a tendency towards overtrust during collaboration, evidenced by their disregard of system failures.
The results indicate that implementing a feedback system, alongside the simulation of safe system failures, has the potential to calibrate trust to a more appropriate level, thereby reducing safety-critical behavior. Based on these findings, the paper derives several implications for the design of HRC and gives directions for further research.
Keywords: human-robot collaboration, trust in automation, affective experience, safety-critical behavior, intention to use, human factors, manufacturing
Received: 21 Jul 2025; Accepted: 08 Oct 2025.
Copyright: © 2025 Legler, Langer, Dettmann and Bullinger. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Franziska Legler, franziska.legler@mb.tu-chemnitz.de
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.