
ORIGINAL RESEARCH article

Front. Comput. Sci., 03 September 2025

Sec. Human-Media Interaction

Volume 7 - 2025 | https://doi.org/10.3389/fcomp.2025.1507439

Exploring operator responses to augmented reality training: insights from the SELFEX platform case study

  • 1Design Innovation Center (DBZ), Faculty of Engineering, Mondragon Unibertsitatea, Loramendi, Arrasate, Spain
  • 2MADE S.C.A.R.L., Milan, Italy
  • 3Beko Europe Management SRL, Milan, Italy
  • 4CTAG – Centro Tecnologico de Automocion de Galicia, Porriño, Spain

Traditional industrial training methods often fail to capture the tacit expertise of experienced personnel, limiting instructional quality for new staff. This study examines the SELFEX platform, an augmented reality (AR)–based training system enabling junior operators to learn autonomously by replicating recorded performances of senior operators. Using a mixed-methods design, the research combined a technical analysis of AR’s functionality, benefits, and constraints with an empirical evaluation in an industrial setting. Seventeen participants completed training tasks using either conventional screens or AR headsets, with subjective measures including satisfaction, perceived usefulness, ease of use, and flow state, alongside objective performance metrics. Results showed that AR training was particularly beneficial for novices, enhancing engagement, understanding, and perceived ease of learning, though no statistically significant performance differences with screen-based training were found. Correlation analyses revealed strong links between flow, satisfaction, and ease of learning, highlighting the importance of intuitive, well-integrated design. Challenges in integrating AR into professional workflows—such as technical stability and user adoption—were also identified. These findings position AR as a promising tool for accessible and immersive industrial training, capable of supporting both initial skill acquisition and potential future upskilling. Further longitudinal studies are recommended to evaluate long-term impacts on performance, retention, and cost-effectiveness, and to refine system usability for diverse user profiles.

1 Introduction

In the industrial domain, the expertise and experience of senior operators are crucial in guiding and nurturing the growth of junior operators. According to Johnson et al. (2019), it is vital to capture and utilize this implicit knowledge to instruct new staff effectively. Senior operators hold valuable tacit knowledge that cannot be easily transferred through traditional training methods, and the corresponding procedures are often poorly documented. The transmission of such tacit knowledge has a profound impact on both operational efficiency and safety within an organization. Over the years, various methods have been employed to train junior operators, ranging from traditional classroom training to on-the-job training. However, with advances in Augmented Reality (AR) and Virtual Reality (VR) technologies, a new wave of training methods has emerged (Doolani et al., 2020). These technologies have the potential to revolutionize the way knowledge is transmitted and skills are developed in operators. AR-based training has gained significant attention due to its ability to provide real-time interactive guidance and instruction (Garcia Fracaro et al., 2022). These technologies provide immersive environments for simulating real-world scenarios, enabling practice in secure settings. The adoption of AR and VR not only enriches learning but also enables continuous monitoring and feedback for more comprehensive skill enhancement.

2 AR-based training

AR-based training has emerged as a promising alternative to traditional methods. It uses AR technology to overlay digital information onto the real environment, providing learners with a blended experience of virtual and physical elements. AR-based training offers several advantages over traditional methods (Byvaltsev, 2020). First, it allows on-the-job training without disrupting normal operations, as trainees can receive real-time guidance and instructions through AR overlays. Second, it provides a more engaging and immersive learning experience, which increases learner motivation and information retention. Third, AR technology enables visualization of complex procedures and equipment, which facilitates understanding and practice (Wang et al., 2022).

Trainees wear AR-enabled devices, such as smart glasses or headsets, which display the augmented content. These devices track the trainees’ movements and align the digital information with the physical environment in real-time. This allows trainees to interact with and manipulate the augmented objects, enabling hands-on practice in a virtual representation of the actual task or scenario (Daling and Schlittmeier, 2022). In addition to head tracking, hand tracking is a key component in enabling realistic and intuitive interactions in AR-based training. Two common approaches are vision-based tracking, using RGB or depth cameras, and wearable solutions like sensorized gloves. Camera-based tracking is non-intrusive and convenient, though it may be affected by occlusion or poor lighting conditions. In contrast, gloves offer highly precise data on finger movement and gesture recognition, making them suitable for tasks requiring fine motor skills, albeit with some ergonomic trade-offs. The choice of method depends on the training scenario and required interaction fidelity (Buckingham, 2021).

AR has emerged as a transformative technology in various fields, with industrial environments being no exception. AR’s ability to overlay virtual information onto the real world offers a novel human-machine interaction paradigm that enhances manufacturing and industrial processes (Gavish et al., 2015; Ong et al., 2008). The technology has been successfully applied to military training, surgery, entertainment, and more notably, in maintenance, assembly, product design, and other manufacturing operations (Gavish et al., 2015). Moreover, its integration into training has gained traction across diverse domains, including occupational safety in environments such as laboratories (Ismael et al., 2024) and manufacturing plants (Owen et al., 2024).

The robustness and adaptability of AR systems are critical for their effective deployment in industrial settings, where they must withstand challenging conditions and integrate seamlessly with existing workflows (Ong et al., 2008). AR tools have shown great promise in improving task efficiency and the quality of training, particularly in complex processes such as automotive maintenance (Jetter et al., 2018). The technology’s immersive experiences are not only expected to enhance performance but also to serve as a benchmark for evaluating key performance indicators (KPIs) (Jetter et al., 2018).

Training models for AR learning in industrial contexts have evolved, with systems now providing non-expert users with crucial information about complex automated systems through interactive and intuitive interfaces (Wang and Dunston, 2007). These AR applications are designed to cater to the unique requirements of each industrial process, addressing challenges in design, commissioning, manufacturing, quality control, training, monitoring, and service maintenance (Navab, 2004). In the field of construction, AR has shown potential in operator training, offering an immersive environment where novice operators can interact with virtual materials and instructions in a real worksite setting (Vacchetti et al., 2004). Similarly, AR has been identified as a beneficial technology for assembly and maintenance training, linking location-dependent information directly to physical objects and enabling trainees to practice with real tools, thereby enhancing sensorimotor skills (Heinz et al., 2019).

On the other hand, traditional methods of operator training often involve classroom lectures, reading manuals and procedure documents, and supervised hands-on practice. These methods have limitations in terms of capturing the experiential and tacit knowledge of senior operators. Comparative studies have evaluated the effectiveness of AR against traditional training methods, revealing that AR can lead to fewer errors and potentially better final performance in industrial maintenance and assembly tasks (Nakanishi, 2010). The integration of AR with game-based learning approaches has also been explored, aiming to increase engagement and competence development in operators of industrial production processes (Santos et al., 2022).

The transformative impact of AR in industrial training bridges the gap between traditional methods and innovative learning experiences. AR's immersive and interactive nature, highlighted by Gavish et al. (2015) and Lee (2012), significantly enhances the learning process by simulating real-world scenarios. This approach not only makes complex procedures easier to grasp but also makes the acquisition of tacit knowledge more engaging. The efficacy of AR in training is further enhanced when combined with game-based learning strategies, which capitalize on the natural human propensity for engagement through play, thereby fostering a more profound and enduring competence in industrial tasks. Research by Heinz et al. (2019) and Vidal-Balea et al. (2020) supports AR's role in improving productivity and skill development, indicating its crucial position in the future of industrial training.

Human Factors play a significant role in the successful application of AR-based manuals in industry. Research has shown that AR manuals can lead to faster task completion, fewer errors, and reduced psychological stress compared to traditional paper-based manuals (Webel et al., 2011). In addition to these short-term benefits, the use of AR in industrial training also raises important questions about long-term skill retention and transfer. Studies such as those by Huang (2020) and Daling and Schlittmeier (2022) suggest that immersive, context-aware AR experiences can enhance memory consolidation and facilitate the internalization of procedural knowledge. In this regard, the SELFEX platform offers a novel contribution by combining AR guidance with real-time motion capture and performance evaluation. Unlike traditional video-based instruction, SELFEX enables a dynamic comparison between expert and novice movements, providing quantitative feedback on movement similarity and execution time. These features not only support immediate learning but also promote deeper learning processes, with potential implications for skill transfer in real operational contexts. However, the practical use of AR manuals necessitates a clear understanding of human factor requirements, including the characteristics of workers, the work environment, and the presentation of information through head-mounted displays (HMDs) (Webel et al., 2011).

2.1 Assessing training experience

The complexity of training evaluation extends beyond surface data, necessitating a deeper understanding of the program's structure and its impact on behavior rather than mere attitudes (Turnbull et al., 1998). In-training evaluation (ITE) further enriches this landscape by documenting the ongoing performance of learners in real-world settings, thereby providing a more nuanced view of competency development and essential practice behaviors (Brown, 2002).

The challenge of evaluating training programs is not only to monitor and assess but also to ensure that the impact of such programs is substantial and aligns with organizational goals (Ritzmann et al., 2014). This is where the Flow State Scale (FSS) and the Training Evaluation Inventory (TEI) come into play, offering structured approaches to measure the psychological state of flow experienced by participants and the outcomes of training design, respectively (Grohmann and Kauffeld, 2013; Light, 1979). The FSS, despite its complex factor structure, provides insight into the subjective experience of being 'in the zone' during physical activity, which can be extrapolated to other training contexts (Grohmann and Kauffeld, 2013). Meanwhile, the TEI serves as a comprehensive tool that evaluates training design and predicts training success, emphasizing the importance of demonstration, application, and integration in training programs (Light, 1979).

The literature also underscores the significance of user evaluations, such as the USE (User Satisfaction Evaluation) Questionnaire (USE Questionnaire, 2025), which, despite their subjective nature, can serve as surrogates for objective performance measures in information systems training (Huang, 2020). Job Training Satisfaction (JTS) is another critical subjective measure that influences job satisfaction and performance, highlighting the importance of well-designed training activities (Kaur et al., 2020). Moreover, the development of psychometrically sound evaluation measures, such as the Questionnaire for Professional Training Evaluation, ensures that training contributions to organizational success are examined reliably and efficiently (Vlachopoulos et al., 2000).

Training needs assessment is an indispensable precursor to effective training, ensuring that the program addresses the actual needs of the organization and its employees (Grohmann and Kauffeld, 2013). Without it, organizations risk misdirecting their training efforts, which can lead to suboptimal outcomes. Furthermore, the integration of objective competency-based assessment methods, such as written examinations and OSCEs, complements performance-based methods, offering a more comprehensive evaluation of training effectiveness (Brown, 2002).

The interplay between objective and subjective assessments is further complicated by the role of various observers, including peers, participants, and self-evaluations, each providing unique perspectives on the trainee's performance (Brown, 2002). The dynamic nature of training evaluation calls for a balance between these different methods to capture a holistic picture of training success, together with an understanding of Human Factors-related aspects that should not be underestimated.

2.2 Aim of the study and hypotheses

SELFEX is an innovative operator training platform designed to leverage the expertise of senior operators and the capabilities of AR technology to facilitate knowledge transfer to junior operators. The platform operates on a simple yet effective principle: senior operators, equipped with MAGOS movement-recording gloves (Quanta and Qualia Ltd., 2020), perform tasks in an optimal manner. This performance is captured by the platform's software, making it accessible for junior operators to train independently, without the need for direct supervision by their senior counterparts.

Upon engaging with SELFEX, junior operators utilize the same gloves to undergo training. This setup allows them to practice without the physical presence of senior operators, as the required procedural steps have already been documented on the platform. Training is offered in two distinct modes: through a conventional screen display or via HoloLens 2 (Microsoft Corporation, 2019) glasses. The former option presents the expert’s hand movements on a screen, while the latter overlays these movements onto the physical workbench environment, offering a more immersive training experience. The objective for junior operators is to replicate the senior operator’s movements with as much precision as possible.

To quantify training effectiveness, SELFEX incorporates an algorithm that evaluates the congruence between the movements of junior and senior operators, providing scores based on time efficiency and movement similarity, while also enabling senior operators to set a minimum performance threshold that junior operators must meet or exceed to be deemed proficient in the task at hand. These parameters have proven especially relevant in tasks where specific techniques are critical and have been refined by expert operators over years of practice, such as painting processes, where hand motion directly influences the evenness of paint distribution, or assembly sequences, where the order and position of the hands are essential for successful execution. Time efficiency, in turn, is also considered a key factor to ensure that learned techniques do not compromise the overall productivity of the process. This innovative approach to operator training prompts the exploration of two critical research questions. The first focuses on the impact of SELFEX on junior operators' perceptions of training satisfaction, perceived usefulness, and ease of use, especially when compared to conventional training methods. This line of inquiry seeks to uncover the extent to which the integration of extended reality wearable technology in platforms like SELFEX can affect subjective assessments of the training experience.
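
As a concrete illustration of how such a combined score and threshold check might work, the sketch below blends a movement-similarity value with a time-efficiency value and compares the result against a minimum threshold. The weighting, function names, and pass/fail rule are assumptions made for clarity; the paper does not disclose SELFEX's actual scoring formula.

```python
# Illustrative sketch only: SELFEX's real scoring formula is not published, so the
# weights, names, and pass/fail rule below are assumptions, not the platform's method.

def proficiency_check(similarity: float, trainee_time: float,
                      expert_time: float, threshold: float = 0.8,
                      w_similarity: float = 0.7, w_time: float = 0.3) -> dict:
    """Combine movement similarity and time efficiency into one training score."""
    # Time efficiency: 1.0 when the trainee matches the expert's time, lower when slower.
    time_efficiency = min(1.0, expert_time / max(trainee_time, 1e-6))
    score = w_similarity * similarity + w_time * time_efficiency
    return {
        "score": round(score, 3),
        "passed": score >= threshold,   # threshold set by the senior operator
    }

# Example: trainee reproduces 85% of the expert's motion but takes 90 s vs. 75 s.
print(proficiency_check(similarity=0.85, trainee_time=90.0, expert_time=75.0))
```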

The second research question delves into the efficacy and efficiency of SELFEX-assisted training versus conventional methods in enhancing the performance of junior operators. This encompasses an evaluation of both objective competency-based assessments and subjective measures of JTS. Through these inquiries, the study seeks to establish a comprehensive understanding of the potential benefits and limitations of integrating advanced training technologies such as SELFEX in manual assembly tasks, thereby contributing valuable insights to the field of user interaction and training technology.

3 Research materials and methods

This section outlines the methodology employed in the study, including the selection and preparation of research materials, participant information and the detailed procedural steps undertaken to assess the effectiveness of the SELFEX platform in operator training. It encompasses the initial briefing and consent process, the specific training procedures involving the use of advanced wearable technology, and the systematic collection and analysis of both objective performance data and subjective participant feedback.

3.1 Materials

The SELFEX cabin system is conceived as a mobile, all-inclusive environment furnished with critical components such as trackers, finger tracking gloves, cameras, and computing units. It integrates a software platform that offers functionalities for recording and testing operations, empowering factories toward complete operational independence. The innovation lies in enabling the creation of custom content through the straightforward capture of experienced operators’ movements, thus mitigating the need for expensive external content creation services. An illustration of this portable cabin concept is presented in Figure 1.

Figure 1. Selfex modular cabin for immersive operator training.

The cabin’s design ensures a plug-and-play system—that is, it can be transported with all elements in place so that, once the cabin is positioned, the system can be activated, and training can start within. However, in this particular study, the cabin was not used for various reasons, including the spatial limitations and the need for better visibility during testing and for providing assistance to participants, as the cabin walls hindered real-time observation of participants. It is important to note that the cabin itself represents a future development goal, intended to house the system in a modular and deployable format. Nonetheless, the core components of the SELFEX platform (namely the hand-tracking system and the software for performance capture and feedback) were fully operational and deployed in this study. As such, the experimental setup preserved the essential technological functionality of SELFEX.

Specifically, this use case pertains to a configuration for the company Whirlpool EMEA. The training workstation, envisioned as a cabinless solution, incorporated several essential components. Central to the setup was a workbench designed to accommodate the demonstration products, which included an oven and various gaskets for Whirlpool EMEA’s demonstrations, as well as containers for pump parts for on-site demonstrations.

The main activity is organized around the bench shown in Figure 2. This bench (1) was equipped with a robust metal structure designed to support a camera and sensors (4) for the glove antennas (3).

Figure 2. Components of the SELFEX training station.

Adjacent to the workbench, a rack (2) was installed to house an industrial PC, which managed the workstation’s processing and control tasks. Above this setup, a camera was positioned to capture task execution from an overhead perspective. The MAGOS sensorized gloves (5), used by participants, combine flex sensors to detect individual finger movements with an HTC Vive Tracker (HTC Corporation, 2018) mounted on the back of the hand to capture the global position and orientation of the hand. Two antennas ensured proper glove connectivity, while SteamVR base stations (HTC Corporation, 2019) enabled accurate spatial tracking throughout the environment. The system records the time-stamped movement trajectories of both the expert and the trainee, capturing hand and finger positions frame by frame. A dedicated algorithm processes these trajectories by computing relative distances between expert and trainee performances over time, generating a similarity score that reflects the accuracy and fidelity of the trainee’s replication. This metric provides immediate, objective feedback on performance quality. Participants received AR guidance via a HoloLens 2 headset (6), while two monitors (7) facilitated system interaction, real-time monitoring, and control of the SELFEX interface.
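
As a rough illustration of this kind of trajectory comparison, the following sketch computes a similarity score from frame-by-frame 3-D hand positions. The resampling strategy, the alignment to the shared start marker, and the distance-to-score mapping are illustrative assumptions rather than the platform's published algorithm.

```python
# Minimal sketch of a trajectory-similarity metric, assuming (frames x 3) arrays of
# hand positions in metres; the tolerance and mapping to [0, 1] are assumed values.
import numpy as np

def similarity_score(expert: np.ndarray, trainee: np.ndarray,
                     tolerance_m: float = 0.15) -> float:
    """Return a 0-1 similarity between two (frames x 3) position trajectories."""
    # Re-sample both recordings to a common number of frames so they can be
    # compared even when execution times differ.
    n = min(len(expert), len(trainee))
    idx_e = np.linspace(0, len(expert) - 1, n).astype(int)
    idx_t = np.linspace(0, len(trainee) - 1, n).astype(int)
    e, t = expert[idx_e], trainee[idx_t]

    # Align both trajectories to their first frame (the shared start marker on the bench).
    e = e - e[0]
    t = t - t[0]

    # Mean Euclidean distance per frame, mapped into [0, 1] via a tolerance in metres.
    mean_dist = np.linalg.norm(e - t, axis=1).mean()
    return float(max(0.0, 1.0 - mean_dist / tolerance_m))

# Example with synthetic data: a trainee trajectory with small random deviations.
rng = np.random.default_rng(0)
expert = np.cumsum(rng.normal(size=(300, 3)) * 0.01, axis=0)
trainee = expert[::2] + rng.normal(scale=0.01, size=(150, 3))
print(round(similarity_score(expert, trainee), 2))
```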

Finally, the setup included a large TV display (8) that was utilized for managing the HoloLens 2 (Microsoft Corporation, 2019), enhancing the AR visualization. This comprehensive arrangement of equipment was carefully designed to facilitate an intuitive and immersive training experience for the users.

3.2 Procedure

The procedure was as follows. Initially, participants were informed about the aim of the study, and their consent was requested for the use of their data in subsequent analysis. They were also advised that they could withdraw from the process at any time. Participants were also asked to fill in socio-demographic data (Apraiz et al., 2023). Before beginning the tasks, the study protocol was explained both orally and in writing, including the type of data to be collected, the structure of the activity, and the tools to be used. All doubts or questions raised by participants were clarified to ensure full understanding. Written informed consent was obtained from each participant, and the study was approved by the Research Ethics Committee of Mondragon Unibertsitatea.

First, the participants were fitted with the gloves and shown a video of virtual hands performing the task while it was explained to them, displayed either in the HoloLens 2 (Microsoft Corporation, 2019) or on the screen. Subsequently, they were instructed to carry out the task by following the hands, either by looking at the screen or, in the case of the HoloLens 2 (Microsoft Corporation, 2019), with the glasses on. For the algorithm comparison to be successful, participants were required to start by placing their hands on a marker on the table. This marker also served as the starting point for the senior operator, thus allowing for a more straightforward comparison of both trajectories later. To ensure training validity, the training setup was designed to closely replicate real-world industrial conditions. Participants interacted with actual components used in manufacturing, including the oven, the insulating rubber, and other tools. The workstation layout mimicked typical factory environments in terms of part positioning and body posture.

After each test, participants were asked to fill in the Flow State Scale (García Calvo et al., 2008) and the USE (User Satisfaction Evaluation) Questionnaire (USE Questionnaire, 2025), and finally, they were asked to answer four general questions about the integration and use of the gloves:

  1. The screen/HoloLens 2 (Microsoft Corporation, 2019) and the gloves fit and feel good.
  2. It was easy to interact with the AR solution using the sensorized gloves.
  3. The AR solution integrated well with the manufacturing environment.
  4. In my opinion, the AR solution has had a positive impact on my productivity and efficiency.

This methodology not only ensures the capture of objective data related to task execution but also subjective data on the participants’ experience and satisfaction. The use of these questionnaires is crucial for understanding the psychological and ergonomic impact of AR technology in operational settings. The Flow State Scale measures the participant’s level of immersion and engagement in the task, while the USE Questionnaire evaluates the user’s satisfaction with the technology. This dual approach provides a holistic view of both the efficacy and the user experience of the AR application, making it possible to assess not only the functional aspects of the technology but also its usability and acceptance among users. Personal opinions were also noted in the questionnaires.
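
For readers unfamiliar with how such instruments are typically processed, the sketch below shows one plausible way of aggregating Likert-scale items into the subscale scores analyzed later (flow, usefulness, ease of use, ease of learning, satisfaction). The item-to-subscale mapping and the data values are hypothetical, not the study's actual scoring keys.

```python
# Hedged sketch, assuming 1-7 Likert items: subscale score = mean of its items.
# The item-to-subscale mapping and the responses below are hypothetical.
import pandas as pd

responses = pd.DataFrame({            # one row per participant
    "flow_1": [6, 5, 7], "flow_2": [6, 6, 7],
    "usefulness_1": [5, 4, 6], "usefulness_2": [5, 5, 6],
    "ease_of_use_1": [6, 5, 7], "satisfaction_1": [6, 5, 7],
})

subscales = {
    "flow": ["flow_1", "flow_2"],
    "usefulness": ["usefulness_1", "usefulness_2"],
    "ease_of_use": ["ease_of_use_1"],
    "satisfaction": ["satisfaction_1"],
}

# Aggregate items into subscale scores, keeping the original 1-7 scale.
scores = pd.DataFrame({name: responses[items].mean(axis=1)
                       for name, items in subscales.items()})
print(scores.round(2))
```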

Regarding the tasks, two types were tested, both consisting of the assembly of manufacturing elements. The first is the assembly of the insulating rubber of an oven, while the second involves the assembly of an oil & gas valve composed of multiple components in a specific order. As a summary, Figure 3 shows the procedure followed.

Figure 3. Procedure followed during the experiment.

3.3 Participants

Regarding the participants for the comparison of screen versus Augmented Reality (AR) results, the sample consisted of 5 subjects, with a distribution of 40% men and 60% women. The evaluation of AR included these 5 subjects, as well as the results from an additional 12 individuals, totaling 17 participants (53% men and 47% women).

In terms of participant statistics, 24% of the participants used glasses in conjunction with the HoloLens 2 (Microsoft Corporation, 2019). With respect to task familiarity, 29% had previously engaged in assembly tasks at work, with 3 of these individuals being professional assemblers. Another 24% had performed an assembly task at least once, while 47% had never engaged in assembly tasks before.

As for familiarity with the technology, 59% of participants had interacted with AR glasses at least once. About 29% had never interacted with the glasses but were familiar with the technology. A minority of 12% were not familiar with the technology and had not interacted with it until this point.

The average age of the participants was 28.6 years. This demographic breakdown provides a glimpse into the diverse range of experiences and backgrounds that the participants brought to the study, potentially influencing their interaction with AR technology and their performance in assembly tasks. The blend of prior experience with technology and tasks can offer valuable insights into the adaptability and learning curve associated with the use of AR in industrial settings. The age average suggests a relatively young cohort, which may correlate with a higher propensity for technological adaptability and learning. However, the inclusion of participants with no prior exposure to AR glasses or assembly tasks also ensures a robust test of the technology’s intuitiveness and the efficacy of AR in training scenarios for novice users.

4 Results

In this section, we present the findings derived from the study investigating training platforms that leverage AR technology for manual assembly tasks. Through rigorous analysis of objective performance metrics and subjective user feedback, we uncover insights into SELFEX's efficacy and user experience among operators with little or no prior knowledge. The findings highlight its potential to enhance training effectiveness and user satisfaction in industrial settings. Furthermore, the diverse demographic profile of participants enriches our understanding of SELFEX's applicability across varied user backgrounds. Our results underscore the transformative impact of AR technology on training paradigms, while also illuminating the intricacies of human-technology interaction in operational contexts. Further information regarding the gathered data can be found in Escallada et al. (2024).

4.1 Comparison of training configurations: performance and overall experience

Statistical analysis comparing the AR and screen-based tools revealed no significant differences in flow state, integration, usefulness, ease of use, ease of learning, or satisfaction among users. To explore the relationships among user experience dimensions in the AR condition, Spearman rank correlation coefficients were calculated, as this approach is well-suited for ordinal data such as Likert-scale questionnaire responses and appropriate for small sample sizes. Although group comparisons (e.g., between AR and screen conditions, or across experience levels) are reported descriptively, they are grounded in observed patterns rather than formal hypothesis testing. This decision was made given the exploratory nature of this study and the limited sample size (n = 17), which constrains statistical power. As such, the reported trends offer meaningful insights into user perceptions while also highlighting areas for further investigation with larger participant groups.
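
The following minimal sketch reproduces the kind of Spearman analysis described here on hypothetical subscale scores; the column names mirror the dimensions reported in the paper, but the values are illustrative only and not the study's data.

```python
# Minimal sketch of the Spearman rank-correlation analysis on hypothetical 1-7 scores.
import pandas as pd
from scipy.stats import spearmanr

ar_scores = pd.DataFrame({
    "flow":             [5.8, 6.1, 4.9, 6.5, 5.2],
    "satisfaction":     [6.0, 6.3, 5.0, 6.7, 5.4],
    "ease_of_learning": [6.2, 6.0, 5.1, 6.8, 5.5],
})

# Pairwise correlation between two dimensions.
rho, p = spearmanr(ar_scores["flow"], ar_scores["satisfaction"])
print(f"flow vs. satisfaction: rho = {rho:.2f}, p = {p:.3f}")

# Full rank-correlation matrix (as visualized in Figure 6).
print(ar_scores.corr(method="spearman").round(2))
```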

From a descriptive perspective, differences in AR vs. screen preferences were observed depending on participants' task experience. Participants with no technical experience ('Never') appeared to find AR tools more intuitive or appealing, as reflected in higher scores on flow, usefulness, and satisfaction. Ease of learning also scored better, indicating that AR tools are perceived as easier to learn by those without specific technical experience. Participants with 'some assembly' technical experience generally provided lower ratings for AR tools on flow, usefulness, and satisfaction. Regarding ease of learning, participants with technical experience found AR tools slightly less straightforward to learn than their non-technical counterparts. Figure 4 shows the ratings on a scale out of 7.
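
A simple way to obtain the descriptive comparison summarized in Figure 4 is a group-wise aggregation of the subscale ratings, as sketched below; the values shown are illustrative and not the study's raw responses.

```python
# Sketch of the descriptive comparison behind Figure 4: mean ratings per experience
# group and tool, on a 1-7 scale. Data values are illustrative placeholders.
import pandas as pd

ratings = pd.DataFrame({
    "experience":   ["Never", "Never", "Some assembly", "Some assembly"],
    "tool":         ["AR", "Screen", "AR", "Screen"],
    "flow":         [6.2, 5.4, 5.1, 5.3],
    "satisfaction": [6.4, 5.5, 5.0, 5.2],
})

# Mean rating per (experience, tool) cell, rounded for reporting.
print(ratings.groupby(["experience", "tool"])[["flow", "satisfaction"]].mean().round(1))
```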

Figure 4. Technology tool ratings by technical experience.

The combined analysis reveals notable gender differences across both conditions. As Figure 5 shows, female participants rated the AR tool consistently higher than males, particularly in ease of learning and satisfaction, where the gap is most pronounced. Conversely, in the screen-based condition, male participants reported higher scores than females in usefulness, ease of use, and satisfaction. Interestingly, while females generally favored the AR experience, males appeared to find the screen interface more usable and satisfying, suggesting that gender may influence perceived effectiveness depending on the technological interface.

Figure 5. Technology tool ratings by gender.

4.2 AR training environments and experience-related factors

For this second data analysis, there were 17 participants (9 men and 8 women). These participants performed the task with the AR solution without testing the screen solution. In this section, we analyze the correlations among factors, as shown in the Spearman correlation matrix in Figure 6, which explores associations among variables such as satisfaction, ease of learning, and flow.

Figure 6. Spearman correlation matrix between subjective evaluation factors for AR participants (n = 17).

The dataset indicates a robust relationship between flow and all other assessed factors, with particularly strong connections to satisfaction and ease of learning. This suggests that when users experience a state of flow in the AR environment, characterized by deep immersion and engagement, it positively influences not only their overall satisfaction but also the ease with which they can master the technology. Integration is closely linked with perceptions of usefulness and ease of use. This implies that seamless integration of AR into users' tasks enhances the technology's perceived practicality and straightforwardness. Essentially, when users find AR well-integrated into their workflow, they are more likely to consider the technology beneficial and straightforward to operate.

These relationships are visualized in the Spearman correlation matrix presented in Figure 6. The strongest associations are observed between flow and both satisfaction and ease of learning, indicating that participants who reported being more immersed in the task also found the training experience more enjoyable and easier to learn. Ease of use also shows a high correlation with ease of learning and satisfaction, suggesting that when the AR system is perceived as intuitive, it facilitates a more satisfying and effective training process. Furthermore, integration and usefulness are closely linked, which highlights the importance of seamless incorporation of AR into the task environment: when the technology fits naturally into the workflow, it is also seen as more valuable. Overall, these interrelations emphasize the critical role of design simplicity, contextual relevance, and user engagement in shaping the perceived success of AR-based training systems. Notably, the close relationship between ease of learning and satisfaction reinforces the idea that intuitive design and user onboarding are not merely usability enhancements, but essential factors in promoting a positive and rewarding learning experience.

Beyond the global correlations, a closer look at the participant subgroups reveals that those with no technical experience ('Never') are generally more receptive to AR. Their lack of familiarity with technical systems may mean they approach AR without preconceived expectations, allowing them to evaluate the technology on its own merits. This group's high scores across various metrics, particularly in ease of use and learning, imply that AR systems are intuitively designed, making them accessible to users regardless of their technical background. The elevated levels of satisfaction observed in this cohort suggest that AR could be a particularly beneficial tool in enhancing the professional capabilities of those who may not have extensive experience with similar technologies.

On the other hand, individuals with some technical experience (‘Some Assembly’) exhibit more moderate views on AR’s integration into their work processes, with scores hovering around the midpoint of the scale. This could indicate either a cautious optimism about the technology or a clear recognition of areas where AR could be improved to better suit their workflows. Their lower ratings for AR’s usefulness might reflect a gap between their expectations, based on their technical experience, and the actual performance of the AR systems they used. The moderate scores for ease of use and learning might suggest these participants face a learning curve or encounter usability issues that aren’t fully in sync with their technical skills. As a result, their satisfaction levels are also moderate, which could indicate that while they find some value in AR, it may not fully align with or exceed their existing technical capabilities.

Participants categorized as having ‘Expert’ technical experience appear to have a favorable view of the integration of AR within their professional activities, which is consistent with their ability to integrate more complex technologies into their work. Interestingly, while they rate AR’s usefulness positively, it’s not the highest score, perhaps suggesting that while they see AR as beneficial, it does not substantially surpass the existing tools and methods with which they are already proficient. Their expertise likely contributes to the high scores in ease of use and learning, suggesting that their technical background enables them to quickly learn and effectively use AR systems. This group’s high satisfaction ratings may be due to their ability to leverage their technical knowledge to maximize the benefits offered by AR technology.

5 Discussion

The integration of AR technology into training platforms, as examined through the SELFEX platform in manual assembly tasks, unveils a multifaceted landscape of user experience research. This landscape is characterized by a complex interplay between user satisfaction, technology acceptance, and the nuanced outcomes of AR versus traditional training methodologies. Such complexity goes beyond mere performance metrics, delving into user preferences, cost implications, and situational applicability, thereby requiring a nuanced consideration of the contextual relevance of both AR and screen-based tools.

Empirical findings from our study and the broader domain of user experience research suggest that engagement with an AR environment, manifested through the psychological state of flow, serves as a critical predictor of the technology’s acceptance and integration into routine practices. This is not merely an isolated metric but a comprehensive indicator of a user’s inclination toward AR, often mediated by the system’s perceived usefulness in enhancing task efficiency and operational effectiveness. Consequently, this fosters user satisfaction and a deeper integration of AR into their workflow, affirming the first hypothesis that AR enhances subjective measures of training experience for novices.

The relationship between perceived usefulness and ease of use plays a pivotal role in the adoption and effectiveness of AR technology, highlighting the importance of creating AR systems that are both beneficial and user-friendly to foster further engagement. This principle is crucial for designing intuitive AR experiences that meet users’ needs and align with the technology’s ability to make complex manual tasks more accessible, particularly for those without prior technical experience. Additionally, the ease of learning emerges as a significant factor influencing users’ initial interactions with AR, suggesting that systems which can be quickly understood and mastered are more likely to achieve higher adoption rates and sustained usage. Together, these insights underscore the need for AR training tools to be accessible and straightforward, ensuring that all users, regardless of their technical background, can benefit from the enhanced learning experiences AR technology offers.

This study’s insights into technical expertise reveal a spectrum of AR perception across different user groups. Individuals with limited or no technical experience report significant benefits from AR training, reflecting the technology’s potential to make learning more engaging and intuitive for novices. On the other hand, participants with intermediate technical skills did not demonstrate a strong preference for AR over traditional methods, suggesting that the value of AR may be more pronounced for those at the initial stages of learning a new skill. Beyond short-term training outcomes, it is also important to consider AR’s potential for long-term skill retention and transfer in industrial contexts. Prior literature [e.g., Huang (2020) and Daling and Schlittmeier (2022)] suggests that immersive and feedback-driven learning environments (like those enabled by AR) can facilitate the internalization of procedural knowledge.

However, the reluctance to fully integrate AR systems into professional workflows highlights the challenges of adopting new technologies. This reluctance, reflected in the integration scores, underscores the need for targeted efforts in user education, system customization, and improving interoperability to bridge the gap between AR’s potential and its practical application.

In weaving together these observations, the discussion underscores the intricate dynamics at play in the adoption and impact of AR technology on training paradigms. The nuanced outcomes observed, particularly the differentiated impact based on users’ prior technical experience, suggest that AR’s benefits are most significant for novices, offering a promising avenue for making complex tasks more accessible.

5.1 Limitations

Despite extracting valuable insights during this testing, it is imperative to note that certain limitations have been identified, which should be considered when interpreting the results.

Firstly, the solution is at an early stage of development. Both hardware and software are still maturing, which led to occasional technical malfunctions during testing sessions. These included glitches in glove calibration, intermittent connection losses with the HMD and delays in the visualization of virtual hands. Some participants reported frustration or confusion due to these issues, which likely influenced their evaluations of the system’s ease of use and integration. Although external to the conceptual design, these instabilities must be addressed through enhanced robustness and improved system feedback to ensure reliability in future iterations.

Secondly, there are limitations related to the testing environment. Since the sessions were conducted outside of the intended enclosed cabin setup to allow for better observation, lighting and spatial constraints may have introduced variability in user perception and interaction. Additionally, the novelty of the setup and the artificial nature of the testing context may have affected participant behavior, potentially increasing cognitive load or altering their engagement compared to a real industrial setting.

Thirdly, participant familiarity with technology introduces a potential source of bias. As the results suggest, participants with no prior experience using AR tools tended to rate the system more positively, possibly reflecting a novelty effect or lower expectations. Conversely, more experienced users may have applied stricter usability standards or evaluated the system against prior benchmarks, resulting in more critical assessments.

Beyond familiarity, the novelty of wearing sensorized gloves and using immersive AR glasses for the first time may have influenced both performance and subjective responses (either positively, due to initial excitement and engagement or negatively, due to discomfort, disorientation or cognitive overload). These novelty-related effects, although difficult to quantify, should be considered when interpreting early-stage user feedback. Future studies could mitigate this by incorporating familiarization sessions or by comparing responses across multiple exposures to distinguish first-impression bias from sustained usability perceptions.

To mitigate the impact of novelty effects and technical limitations in future studies, several strategies are recommended. First, integrating familiarization or acclimation sessions before formal testing could help participants adjust to the AR interface and sensorized gloves, reducing the influence of initial surprise or discomfort on their responses. Second, conducting repeated-use evaluations over time would allow for differentiation between first-impression reactions and stable user experience assessments. On the technical side, implementing real-time logging of system malfunctions, response delays, or calibration drift would make it possible to correlate these disruptions with subjective feedback or performance metrics. Finally, iterative hardware and software refinements should be guided by both technical feedback and ergonomic observations gathered during early trials, ensuring that usability improves in parallel with functionality.
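
As one possible realization of the logging recommendation above, the sketch below appends timestamped system events to a session log that could later be joined with questionnaire scores or performance metrics. The field names and CSV format are assumptions for illustration, not part of the current SELFEX implementation.

```python
# Illustrative session-event logger; event names, fields, and file format are assumed.
import csv
import os
from datetime import datetime, timezone

LOG_PATH = "selfex_session_events.csv"
FIELDS = ["timestamp", "participant_id", "event", "detail"]

def log_event(participant_id: str, event: str, detail: str = "") -> None:
    """Append one timestamped system event (e.g., glove calibration drift) to the log."""
    new_file = not os.path.exists(LOG_PATH) or os.path.getsize(LOG_PATH) == 0
    with open(LOG_PATH, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:                      # write the header only once
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "participant_id": participant_id,
            "event": event,
            "detail": detail,
        })

log_event("P07", "hmd_connection_lost", "reconnected after 4 s")
log_event("P07", "glove_recalibration", "right-hand drift above tolerance")
```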

Studies have shown that a small number of participants can uncover a large share of usability issues, with some suggesting that five users may identify up to 80% of problems (Virzi, 1992; Lewis, 1994), but the limited sample size in this study (n = 17) constrains the generalizability of the findings. The diversity of user profiles helped identify trends across experience levels, but in complex systems like AR, outcomes are shaped by intertwined factors such as system design, task structure, and user background. As an exploratory case study, this research provides meaningful insights into the feasibility and perceived usability of the SELFEX platform. However, its results should be interpreted as indicative rather than conclusive. Larger and more stratified participant samples will be needed to validate the observed patterns and enable stronger statistical inferences.

Notably, while participants reported high levels of satisfaction and engagement with the AR-based training (especially those with less technical experience) these positive subjective impressions were not accompanied by statistically significant performance gains compared to screen-based training. This discrepancy suggests that the primary value of AR in this context may reside in enhancing motivation and perceived ease, rather than immediate task efficiency. The immersive qualities of AR may promote confidence and affective involvement without yet translating into measurable improvements. However, it is plausible that increased satisfaction and motivation could contribute to performance improvements over time, particularly in real-world industrial scenarios where training is progressive and iterative. These delayed effects may not be captured within the scope of a single-session experimental design and should be examined in longitudinal studies.

Moreover, the short exposure time and technical constraints of an early-stage prototype may have limited the full realization of AR’s potential. These findings highlight the importance of combining subjective and objective metrics in future assessments and point to the need for longer-term studies to evaluate whether increased engagement contributes to sustained or transferable skill development.

6 Conclusion

In synthesizing the insights derived from this research, it becomes evident that the study significantly contributes to our understanding of AR’s role in enhancing training experiences, particularly within the context of manual assembly tasks. The investigation’s rigorous examination of AR’s impact on user satisfaction, perceived usefulness, ease of use, and learning efficiency offers a comprehensive understanding of how such emerging technologies can be leveraged to improve educational and operational outcomes.

While limited in scale, the study’s findings underscore the critical interplay between perceived usefulness and ease of use in determining the success of AR training platforms. By demonstrating that AR systems are particularly beneficial for users lacking prior technical experience, the research highlights the potential of AR to democratize access to complex tasks, making them more approachable and engaging for novices. This observation is pivotal, reinforcing the necessity for AR training solutions to be intuitively designed and user-centric, ensuring they cater to the diverse needs and capabilities of their intended audiences.

Although AR systems were rated more favorably in terms of user experience, objective performance differences compared to screen-based training were not statistically significant in this study. Furthermore, the emphasis on the ease of learning as a determinant of AR technology adoption is a critical insight, suggesting that the ability of users to quickly grasp and utilize these systems is paramount for their widespread acceptance and use. This finding not only validates the importance of user-friendly design in the development of AR training tools but also challenges the field to prioritize the creation of accessible, efficient learning environments that can cater to a broad spectrum of users.

However, the research also prompts a critical reflection on the limitations and future directions of AR in training. While the study showcases the subjective benefits of AR, such as increased satisfaction and engagement, it also reveals the need for a deeper investigation into the long-term impacts of AR on performance and skill acquisition. The lack of significant differences in objective performance outcomes between AR and traditional training methods suggests a complex relationship between technology use and learning effectiveness, warranting further exploration. In this regard, future studies should also consider conducting cost–benefit analyses comparing AR-based training with traditional instructional approaches, which would be particularly relevant for industrial decision-makers assessing implementation feasibility. Moreover, it is important to evaluate whether this type of technology is appropriate for the specific characteristics of the training task itself. Depending on the nature of the activity, traditional methods may still prove more effective or efficient, while in other cases, AR may offer distinct advantages.

Additionally, while this study focused primarily on novice operators, the potential of AR platforms like SELFEX extends beyond initial training. Given AR’s capacity to deliver up-to-date, contextualized guidance, such systems could also serve as effective tools for retraining and upskilling experienced personnel, especially in dynamic industrial environments where processes and tools evolve rapidly.

This research reinforces the value of AR technology as a transformative tool for training and education, particularly for users new to technical domains. The findings point to the importance of designing AR systems that prioritize ease of use, learning efficiency, and the democratization of access to technical knowledge. However, these benefits must be considered in light of the platform's current developmental stage. As the SELFEX system remains an early-stage prototype, the results primarily reflect user impressions of a proof-of-concept rather than a finalized industrial solution. Conclusions regarding usability, satisfaction, and effectiveness should therefore be interpreted as formative insights to inform ongoing refinement and future testing.

Notably, participants (especially those with limited technical experience) reported higher satisfaction and lower learning effort when using the AR-based training system. Yet, these subjective benefits were not matched by statistically significant improvements in objective performance when compared to screen-based training. This discrepancy suggests that AR's added value may lie more in promoting engagement and perceived mastery than in immediate efficiency gains. Further longitudinal studies with larger and more demographically diverse samples will be essential to determine whether increased motivation and immersion ultimately translate into better learning transfer or operational outcomes. Expanding the sample size will not only increase statistical power but also help isolate the influence of user characteristics such as prior technical experience, age, or gender on training outcomes.

Overall, this study contributes valuable empirical and design-oriented insights into the deployment of AR in industrial learning contexts. By combining subjective user experience data with objective performance indicators and highlighting differences across experience levels, it lays the groundwork for future research aimed at developing robust, human-centered AR solutions that support evolving educational and operational needs.

Data availability statement

The datasets presented in this study can be found in online repositories. The names of the repository/repositories and accession number(s) can be found at: https://data.mendeley.com/datasets/tvcxfhxpnz/1.

Ethics statement

The studies involving humans were approved by Mondragon Unibertsitatea Research Ethics Committee. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study. Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.

Author contributions

OE: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Visualization, Writing – original draft, Writing – review & editing, Validation. GL: Conceptualization, Formal analysis, Funding acquisition, Methodology, Supervision, Validation, Writing – original draft, Writing – review & editing, Investigation. MM: Conceptualization, Formal analysis, Investigation, Methodology, Validation, Writing – original draft, Writing – review & editing. DC: Data curation, Investigation, Resources, Validation, Writing – original draft, Writing – review & editing. EB: Data curation, Investigation, Resources, Validation, Writing – original draft, Writing – review & editing. AD-N: Funding acquisition, Resources, Supervision, Validation, Writing – original draft, Writing – review & editing. MG: Conceptualization, Data curation, Funding acquisition, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing.

Funding

The author(s) declare that financial support was received for the research and/or publication of this article. This work was funded by EIT Manufacturing in the 2023 Innovation call (Project ID: 23134, Activity Leader: MG). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Acknowledgments

The project SELFEX was developed by a consortium formed by CTAG – Automotive Technology Center of Galicia, Oesía, Quanta&Qualia, MADE-CC and Whirlpool EMEA. The authors want to acknowledge the collaboration of the whole SELFEX team for their support during the development of this work. Additionally, the authors want to acknowledge the co-funding received by the project SELFEX from the European Institute of Technology (EIT Manufacturing) during the BP2023, under the Horizon 2020 research and innovation program and also the EIT Manufacturing Doctoral School.

Conflict of interest

EB was employed by Beko Europe Management SRL. DC was employed by MADE S.C.A.R.L.

The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The authors declare that Gen AI was used in the creation of this manuscript. The authors acknowledge the use of ChatGPT version 4.0 by OpenAI for writing assistance. The authors have fact-checked the outputs and are responsible for the content and any errors.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Apraiz, A., Lasa, G., Montagna, F., Blandino, G., Triviño-Tonato, E., and Dacal-Nieto, A. (2023). An experimental protocol for human stress investigation in manufacturing contexts: its application in the NO-STRESS project. Systems 11:448. doi: 10.3390/systems11090448

Brown, J. (2002). Training needs assessment: a must for developing an effective training program. Public Pers. Manag. 31, 569–578. doi: 10.1177/009102600203100412

Buckingham, G. (2021). Hand tracking for immersive virtual reality: opportunities and challenges. Front. Virtual Real. 2:8461. doi: 10.3389/frvir.2021.728461

Byvaltsev, S. V. (2020). “Review of the features of augmented reality application in the training of operators and maintenance staff,” in IOP Conference Series: Materials Science and Engineering. Bristol: IOP Publishing Ltd.

Daling, L. M., and Schlittmeier, S. J. (2022). Effects of augmented reality-, virtual reality-, and mixed reality-based training on objective performance measures and subjective evaluations in manual assembly tasks: a scoping review. Hum. Factors 66, 589–626. doi: 10.1177/00187208221105135

Doolani, S., Owens, L., Wessels, C., and Makedon, F. (2020). Vis: an immersive virtual storytelling system for vocational training. Appl. Sci. 10, 1–15. doi: 10.3390/app10228143

Escallada, O., Lasa, G., Mazmela, M., Bosani, E., La Carrubba, D., Dacal-Nieto, A., et al. (2024). “Datasets on flow state evaluation, USE questionnaire, and motion-tracking glove integration in SELFEX: an AR and screen-guided training solution.” Available at: https://data.mendeley.com/datasets/tvcxfhxpnz/1 (Accessed September 15, 2024).

García Calvo, T., Jiménez Castuera, R., Santos-Rosa Ruano, F. J., Reina Vaíllo, R., and Cervelló Gimeno, E. (2008). Psychometric properties of the Spanish version of the flow state scale. Span. J. Psychol. 11, 660–669. doi: 10.1017/S1138741600004662

Garcia Fracaro, S., Glassey, J., Bernaerts, K., and Wilk, M. (2022). Immersive technologies for the training of operators in the process industry: a systematic literature review. Netherlands: Elsevier.

Gavish, N., Gutiérrez, T., Webel, S., Rodríguez, J., Peveri, M., Bockholt, U., et al. (2015). Evaluating virtual reality and augmented reality training for industrial maintenance and assembly tasks. Interact. Learn. Environ. 23, 778–798. doi: 10.1080/10494820.2013.815221

Grohmann, A., and Kauffeld, S. (2013). Evaluating training programs: development and correlates of the questionnaire for professional training evaluation. Int. J. Train. Dev. 17, 135–155. doi: 10.1111/ijtd.12005

Heinz, M., Büttner, S., and Röcker, C. (2019). “Exploring training modes for industrial augmented reality learning,” in ACM International Conference Proceeding Series. New York: Association for Computing Machinery, 398–401.

HTC Corporation (2018). VIVE Tracker (3.0). Taoyuan, China: HTC Corporation.

HTC Corporation (2019). SteamVR Base Station 2.0. Taoyuan, China: HTC Corporation.

Huang, W.-R. (2020). “Job training satisfaction, job satisfaction, and job performance” in Career development and job satisfaction. ed. W.-R. Huang (London: IntechOpen).

Ismael, M., McCall, R., McGee, F., Belkacem, I., Stefas, M., Baixauli, J., et al. (2024). Acceptance of augmented reality for laboratory safety training: methodology and an evaluation study. Front. Virtual Real. 5:2543. doi: 10.3389/frvir.2024.1322543

Jetter, J., Eimecke, J., and Rese, A. (2018). Augmented reality tools for industrial applications: what are potential key performance indicators and who benefits? Comput. Hum. Behav. 87, 18–33. doi: 10.1016/j.chb.2018.04.054

Johnson, T. L., Fletcher, S. R., Baker, W., and Charles, R. L. (2019). How and why we need to capture tacit knowledge in manufacturing: case studies of visual inspection. Appl. Ergon. 74, 1–9. doi: 10.1016/j.apergo.2018.07.016

Kaur, M., Mann, S., and Kaur, K. (2020). Evaluation and impact assessment of training programmes. Agri. Update 15, 258–264. doi: 10.15740/HAS/AU/15.3/258-264

Lee, K. (2012). Augmented reality in education and training. TechTrends 56, 13–21. doi: 10.1007/s11528-012-0559-3

Lewis, J. R. (1994). Sample sizes for usability studies: additional considerations. Hum. Factors 36, 368–378. doi: 10.1177/001872089403600215

Light, D. (1979). Surface data and deep structure: observing the organization of professional training. Adm. Sci. Q. 24, 551–559. doi: 10.2307/2392361

Microsoft Corporation (2019). HoloLens 2. Redmond, WA, USA: Microsoft Corporation.

Nakanishi, M. (2010). “Human factor guideline for applying AR-based manuals in industry” in Augmented reality. ed. S. Maad (Rijeka: IntechOpen).

Navab, N. (2004). Developing killer apps for industrial augmented reality. IEEE Comput. Graph. Appl. 24, 16–20. doi: 10.1109/MCG.2004.1297006

Ong, S. K., Yuan, M. L., and Nee, A. Y. C. (2008). Augmented reality applications in manufacturing: a survey. Int. J. Prod. Res. 46, 2707–2742. doi: 10.1080/00207540601064773

Owen, A., Wells, A., Harper, L., and Owen, A. (2024). Augmented reality in manufacturing: enhancing efficiency and safety through IoT solutions. Available at: https://www.researchgate.net/publication/389055805 (Accessed September 1, 2024).

Quanta and Qualia Ltd. (2020). “MAGOS Gloves.” Available online at: https://www.themagos.com/ (Accessed December 3, 2025).

Ritzmann, S., Hagemann, V., and Kluge, A. (2014). The training evaluation inventory (TEI) - evaluation of training design and measurement of training outcomes for predicting training success. Vocat. Learn. 7, 41–73. doi: 10.1007/s12186-013-9106-4

Santos, J. E., Nunes, M., Pires, M., and Rocha, J. C. (2022). “Generic XR game-based approach for industrial training,” in 2022 International Conference on Graphics and Interaction (ICGI). Aveiro, Portugal: IEEE, 1–8.

Turnbull, J., Gray, J., and Macfadyen, J. (1998). Improving in-training evaluation programs. J. Gen. Intern. Med. 13, 317–323. doi: 10.1046/j.1525-1497.1998.00097.x

USE Questionnaire (2025). USE questionnaire: usefulness, satisfaction, and ease of use. Available online at: https://garyperlman.com/quest/quest.cgi?form=USE (Accessed December 3, 2025).

Vacchetti, L., Lepetit, V., Ponder, M., Papagiannakis, G., Fua, P., Thalmann, D., et al. (2004). “A stable real-time AR framework for training and planning in industrial environments” in Virtual and augmented reality applications in manufacturing. eds. S. K. Ong and A. Y. C. Nee (London: Springer).

Vidal-Balea, A., Blanco-Novoa, O., Fraga-Lamas, P., Vilar-Montesinos, M., and Fernández-Caramés, T. M. (2020). Creating collaborative augmented reality experiences for industry 4.0 training and assistance applications: performance evaluation in the shipyard of the future. Appl. Sci. 10, 1–23. doi: 10.3390/app10249073

Virzi, R. A. (1992). Refining the test phase of usability evaluation: how many subjects is enough? Hum. Factors 34, 457–468. doi: 10.1177/001872089203400407

Vlachopoulos, S. P., Karageorghis, C. I., and Terry, P. C. (2000). Hierarchical confirmatory factor analysis of the flow state scale in an exercise setting. J. Sports Sci. 18, 815–823. doi: 10.1080/026404100419874

Wang, Z., Bai, X., Zhang, S., Billinghurst, M., He, W., Wang, P., et al. (2022). A comprehensive review of augmented reality-based instruction in manual assembly, training and repair. Robot. Comput. Integr. Manuf. 78:102407. doi: 10.1016/j.rcim.2022.102407

Wang, X., and Dunston, P. S. (2007). Design, strategies, and issues towards an augmented reality-based construction training platform. Available online at: https://www.itcon.org/paper/2007/25 (Accessed September 5, 2024).

Webel, S., Bockholt, U., Engelke, T., Peveri, M., Olbrich, M., and Preusche, C. (2011). Augmented reality training for assembly and maintenance skills. BIO Web Conf. 1:97. doi: 10.1051/bioconf/20110100097

Keywords: industrial training, augmented reality, human-computer interaction, manufacturing, human factor evaluation

Citation: Escallada O, Lasa G, Mazmela M, La Carrubba D, Bosani E, Dacal-Nieto A and García MV (2025) Exploring operator responses to augmented reality training: insights from the SELFEX platform case study. Front. Comput. Sci. 7:1507439. doi: 10.3389/fcomp.2025.1507439

Received: 10 October 2024; Accepted: 30 July 2025;
Published: 03 September 2025.

Edited by:

Fernando Moreu, University of New Mexico, United States

Reviewed by:

Ricardo José Vieira Baptista, Polytechnic Institute of Maia, Portugal
Didier Arl, Luxembourg Institute of Science and Technology (LIST), Luxembourg
Marlon Aguero, The University of New Mexico, United States

Copyright © 2025 Escallada, Lasa, Mazmela, La Carrubba, Bosani, Dacal-Nieto and García. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Oscar Escallada, oescallada@mondragon.edu
