
SYSTEMATIC REVIEW article

Front. Robot. AI, 13 October 2025

Sec. Human-Robot Interaction

Volume 12 - 2025 | https://doi.org/10.3389/frobt.2025.1626471

This article is part of the Research Topic: The Impact of Robotic Technologies on Customer Experience and Adoption.

Exploring fear in human-robot interaction: a scoping review of older adults’ experiences with social robots

  • 1College of Science and Engineering, Hamad Bin Khalifa University, Doha, Qatar
  • 2Mada Center, Doha, Qatar

Background: As global populations age, healthcare and social systems face mounting pressure to provide effective support for older adults. Social robots have emerged as promising tools to enhance companionship, cognitive engagement, and daily assistance. However, fear of robots among older adults remains a critical barrier to adoption.

Objective: This scoping review examined how fear manifests in human-robot interaction (HRI), what factors contribute to these reactions, and how they influence technology acceptance.

Methods: A systematic search of six major databases (PubMed, Scopus, IEEE Xplore, ACM Digital Library, PsycINFO, and Web of Science) identified studies published between January 2014 and March 2025. Following PRISMA-ScR guidelines, 49 studies were included, encompassing 6,670 older participants across 16 countries.

Results: Thematic synthesis revealed seven main fear categories: privacy and autonomy concerns, trust and reliability issues, emotional and ethical discomfort, usability challenges, fear of dependence, unfamiliarity with technology, and the Uncanny Valley effect. Fear levels were shaped by robot design, cultural background, prior technology experience, and contextual factors such as care settings. Mitigation strategies, including co-design with older adults, gradual exposure, transparent system behavior, and emotionally congruent interaction, were associated with improved acceptance.

Conclusions: This review uniquely maps fear typologies to robot functions and intervention strategies, offering a framework to guide emotionally adaptive and culturally sensitive robot design. Addressing emotional barriers is essential for the ethical and effective integration of social robots into eldercare. Future research should prioritize longitudinal, cross-cultural studies and standardized fear measurement tools to advance evidence-based HRI implementation.

1 Introduction

1.1 Background

Picture an elderly resident meeting a humanoid robot for the first time. The mixture of fascination and wariness in their reaction captures a fundamental dilemma confronting societies as they introduce robotic technologies into eldercare settings. With global demographics shifting dramatically, estimates indicate that by 2050, one-sixth of the world’s population will exceed 65 years of age (World Health Organization, 2022). Healthcare systems worldwide grapple with shrinking caregiver workforces and stretched resources. Social robots have gained recognition as valuable tools that can provide companionship, enhance cognitive functioning, and support daily activities (Chen and Song, 2019; Papadopoulos et al., 2020; Zafrani et al., 2023). These robotic solutions span from therapeutic animal-inspired designs, such as PARO, to advanced humanoid platforms created specifically for elderly care environments (Bemelmans et al., 2012; Broadbent et al., 2009). Technology offers reliable care delivery, individualized interaction, and lighter caregiver burdens, potentially addressing widespread social isolation among aging populations (Abdi et al., 2018; Robinson et al., 2014). Nevertheless, emotional barriers, fear being foremost among them, frequently obstruct widespread adoption and effective use. This fear reaches beyond simple technological unfamiliarity, touching on profound psychological, technological, and cultural concerns (Pu et al., 2019; Whelan et al., 2018). Fear can emerge as discomfort during robot-human exchanges, skepticism about robotic competence, or worries about personal autonomy and data protection (Imtiaz et al., 2024; Tobis et al., 2022). Considering the significant financial commitments being made in eldercare robotics, recognizing and addressing these fear-based obstacles becomes crucial for optimizing their impact and securing widespread acceptance among older populations (Chen, S. et al., 2018; Sawik et al., 2023).

1.2 Complexity of robot-related fear

Elderly individuals’ fearful reactions to eldercare robots involve intricate, overlapping factors. Seniors frequently experience anxiety that stems not merely from encountering unfamiliar technology, but from fundamental concerns about maintaining independence, protecting privacy, and preserving the human elements of care (Heerink et al., 2010; Naneva et al., 2020). Fear intensity varies considerably, spanning from subtle uneasiness to pronounced anxiety that leads to complete rejection of robotic interaction (Berns and Ashok, 2024; Olatunji et al., 2025). While younger people might regard robotic malfunctions as minor annoyances, elderly users view such errors, whether involving medication mistakes or inadequate emergency assistance, as serious threats to their safety (Nomura et al., 2005; Sharkey and Sharkey, 2012). Furthermore, Mori’s “Uncanny Valley” theory provides valuable insight into these fears of robots, explaining the discomfort that occurs when robots appear almost human but lack complete authenticity in appearance or behavior (MacDorman and Ishiguro, 2006; Miklósi et al., 2017; Mori, 1970). Consequently, developing emotionally appealing robots requires careful attention to human-like features to prevent triggering revulsion instead of promoting acceptance. Privacy anxieties add another layer of complexity to these fears. Elderly users often express concern about information misuse, constant monitoring, and diminished personal control when robots track health data or observe daily routines (Coco et al., 2018; Rantanen et al., 2018). Additionally, fears about becoming overly dependent on robotic support reflect broader concerns about aging processes, declining independence, and reduced human contact in caregiving (Baisch et al., 2018; Moyle et al., 2019).

1.3 Cultural and individual variations

Responses to social robots vary markedly across different personal backgrounds and cultural settings, underscoring the influence of social factors on technology interactions (Stafford et al., 2014; Torta et al., 2014). Previous technology experience reliably diminishes fear; elderly individuals who have used digital devices extensively show reduced anxiety and greater willingness to work with robotic caregivers (Nomura, T. et al., 2005; Strutz et al., 2024). Cultural background also plays a major role in shaping the fear of robots. Research reveals substantial differences between Eastern and Western perspectives, with East Asian populations, particularly in Japan and South Korea, typically showing less fear and greater acceptance than Western groups, mirroring broader societal views on automation and care practices (Backonja et al., 2018; Zhao et al., 2023). Western participants often focus more heavily on autonomy and privacy issues, aligning with cultural traditions that emphasize individual choice and the irreplaceable nature of human caregiving relationships (Carros et al., 2020; Zsiga et al., 2018). In addition, age-related differences within the elderly population also prove meaningful. Those in advanced age brackets (85+ years) may demonstrate different fear characteristics compared to younger seniors (65–74 years), possibly reflecting variations in technology exposure, cognitive adaptability, and health requirements (Conde et al., 2024; Yam et al., 2023). Gender distinctions have surfaced as well, with women generally focusing on emotional and interpersonal aspects, while men tend to emphasize practical and technical considerations (Jung et al., 2017; Leung et al., 2022). These patterns depend heavily on context and represent broader social influences rather than fundamental gender-based differences.

1.4 Evolving technological landscape

Recent developments in artificial intelligence, machine learning, and human-computer interaction have dramatically reshaped social robotics capabilities (Tay et al., 2014; Walters et al., 2008). Modern robots now incorporate sophisticated natural language processing, emotion detection, and behavioral adaptation, enabling more tailored and sensitive user interactions (Cavallo et al., 2018; Deutsch et al., 2019; Walters et al., 2008). Research methodologies have similarly progressed beyond simple self-reporting to include physiological measurements, behavioral analysis, and unconscious psychological evaluation techniques (Gasteiger et al., 2025; Koceski and Koceska, 2016). These approaches demonstrate that fear operates through both conscious and unconscious pathways, informing how interventions might better address these reactions (Takayanagi et al., 2014; Thunberg et al., 2022). Contemporary robot development emphasizes user-focused design principles, concentrating on emotional security, trust establishment, and gradual relationship building while maintaining functional excellence (Park et al., 2021; Sun and Ye, 2024). Empirical studies confirm that features such as motion naturalness, expressive interaction, and adaptive dialogue strongly influence user trust, acceptance, and fear responses (Fraune et al., 2020; Huang et al., 2024; Lubold et al., 2016; Yuan et al., 2024).

1.5 Rationale for this study

Although substantial resources have been invested in eldercare robotics, fear of robots remains poorly understood and inconsistently measured across the research literature, hampering practical application. The scoping review approach provides a framework for thoroughly examining this varied and rapidly developing field. Unlike systematic reviews, scoping studies can incorporate diverse methodological approaches, theoretical frameworks, and research questions, effectively surveying broad knowledge bases in emerging areas such as human-robot interaction (Broadbent et al., 2009; Złotowski et al., 2015). This methodology enables the integration of quantitative, qualitative, and mixed-method investigations, building a comprehensive picture of current knowledge, pinpointing significant research limitations, and guiding future research priorities and implementation approaches (Laue et al., 2017; Tschöpe et al., 2017).

Moreover, although several systematic and scoping reviews, such as those by Antona et al. (2019), Baisch et al. (2018), and Tobis et al. (2022), have examined social robot use in eldercare, none provide a comprehensive synthesis focused on fear as a central emotional factor in technology acceptance and robot integration. Existing studies often mention fear indirectly or as part of broader acceptance measures, leaving its specific triggers and categories poorly defined. This scoping review addresses that gap by systematically examining empirical evidence on older adults’ fear of robots, including near-human (Uncanny Valley) discomfort, privacy and autonomy concerns, and dependence-related anxieties, across diverse interaction contexts and robot types. A key contribution of this review lies in its structured classification of fear types and their relationship to robot design features and user diversity, using systematic coding in NVivo to extract consistent thematic patterns. By also highlighting cross-cultural differences in emotional responses, the review underscores the need for localized and culturally sensitive design approaches. These contributions have direct practical value: they provide designers and engineers with evidence-based cues to improve user comfort, offer policymakers and health planners guidance for gradual and ethical deployment, and help care practitioners and families better prepare older adults for first encounters with social robots. By linking emotional barriers with actionable design and implementation strategies, this review bridges the gap between research insights and real-world application in eldercare robotics.

The remainder of this paper is organized as follows: Section 2 outlines the study objectives and research questions. Section 3 describes the methodological approach following PRISMA-ScR standards. Section 4 reports synthesized findings and thematic categorizations. Section 5 relates results to existing scholarship, highlights knowledge deficits, and suggests future research pathways. Section 6 concludes with practical guidance for robot design and deployment, emphasizing psychological and emotional factors essential for acceptance among older adults.

1.5.1 Research questions

This investigation centers on three interconnected questions that emerged from our preliminary exploration of the literature and conversations with eldercare practitioners:

RQ1: What types of fear do older adults experience when interacting with social robots?

Instead of assuming fear is a uniform response, this question aimed to understand the different ways fear and discomfort manifest when older adults interact with robots. It covers both the obvious fears older adults readily describe, such as worries over physical safety or privacy, and the more subtle, sometimes unconscious reactions revealed through behavioral cues or physiological measures. The review was especially interested in whether different kinds of fear tend to cluster together or occur independently across different older adults and situations.

RQ2: What factors contribute to fear in older adults’ interactions with social robots?

Fear does not emerge spontaneously. This question examined the complex constellation of variables that shape older adults’ emotional responses to robotic systems. The investigation encompassed both observable characteristics, including robotic appearance and functional capabilities, and less apparent influences such as cultural contexts, prior technological encounters, and the social environments within which human-robot interactions occur. The study sought to identify the determinants of fear of robots and to understand how these diverse variables interconnect, potentially reinforcing or diminishing one another’s impact. Clarifying distinct fear categories provides caregivers and technology implementers with insights into customized interventions that can mitigate older adults’ fear.

RQ3: How does fear influence older adults’ acceptance and utilization of social robots?

The primary concern extends beyond the types of fear manifestation to encompass their implications for technological adoption and sustained usage patterns. This investigation analyzed the mechanisms through which emotional responses translate into behavioral outcomes, specifically whether older adults completely avoid robotic systems, engage with them reluctantly, or develop strategies to surmount initial fears. The research focused particularly on determining whether fear reduction interventions could substantially enhance acceptance outcomes and on identifying the underlying mechanisms that facilitate effective fear management.

2 Objectives

The principal objective was to construct a comprehensive synthesis of current knowledge regarding older adults’ fear responses to social robots. This synthesis extended beyond mere cataloguing of existing studies to encompass understanding the landscape of research methodologies, participant demographics, and outcome measures employed in investigating fear of robots within human-robot interaction contexts. Through examination of this methodological diversity, the analysis aimed to identify both strengths and lacunae in contemporary research approaches. A secondary objective concentrated on elucidating how robotic characteristics influence emotional responses. Rather than conceptualizing robots as a homogeneous category, the investigation examined how specific design parameters encompassing appearance, movement patterns, interaction modalities, and intended functions shape fear of robots. This analysis provides evidence-based guidance for robot developers and implementers regarding design choices that may exacerbate or mitigate fear reactions. The third objective investigated the relationship between fear of robots and technology acceptance outcomes, exploring how emotional barriers influence older adults’ willingness to engage with robots and their patterns of actual usage. This analysis examined the potential for fear reduction interventions to improve acceptance and sustained engagement with social robots, while considering broader implications for eldercare technology implementation. Finally, the research aimed to present a conceptual framework that integrates findings across studies to illustrate relationships between fear types, contributing factors, mitigation strategies, and acceptance outcomes. This framework is designed to guide future research, inform intervention development, and support evidence-based decision-making in robot design and implementation. Rather than proposing a rigid theoretical model, the framework accommodates the complexity and variability observed in human-robot interaction while providing practical guidance for researchers and practitioners.

3 Methodology

3.1 Methodological framework and protocol development

The investigation was built on the methodological foundation established by Arksey and O’Malley (2005), refined through Levac and colleagues’ subsequent improvements (Levac et al., 2010), and reported according to PRISMA-ScR guidelines (Tricco et al., 2018). This framework appealed to us because it provides structure for mapping complex, multidisciplinary topics while maintaining the flexibility essential for exploring emerging fields such as human-robot interaction in eldercare. In addition, the framework reflected a pragmatic stance, recognizing that understanding fear of robots in human-robot interaction requires drawing insights from gerontology, psychology, human-computer interaction, engineering, and healthcare (Johnson and Onwuegbuzie, 2004; Plano Clark, 2017). Rather than privileging any single disciplinary perspective, the study aimed to capture the full breadth of relevant knowledge while maintaining methodological rigor. A comprehensive review protocol was developed prior to initiating the search process. Although the protocol was not formally registered, this decision reflects the current limitations of platforms like PROSPERO, which accept only systematic reviews and meta-analyses. To maintain transparency and ensure reproducibility, the full protocol has been included as Supplementary Material 1 (PRISMA-ScR Checklist Item). Its development involved collaborative discussions among the research team to define eligibility criteria, refine search strategies, and select appropriate synthesis methods. A preliminary pilot test using a small subset of studies (n = 5) allowed for practical adjustments, helping the team identify and address potential issues ahead of the full review process.

3.2 Eligibility criteria

Inclusion criteria balanced alignment with research objectives while maintaining feasible scope boundaries. The focus centered on adults aged 65 and older, consistent with established gerontological conventions, while recognizing the considerable diversity within this demographic. Studies examining mixed-age samples qualified for inclusion only when they offered distinct analyses for elderly participants or concentrated specifically on this population. Defining fear-related phenomena presented unexpected challenges. An expansive approach captured the complete spectrum of emotional responses, incorporating studies that examined fear, anxiety, discomfort, or other negative reactions that older adults displayed during interactions with humanoid or social robots. The inconsistent terminology found across research necessitated this broad conceptual framework to prevent overlooking valuable evidence.

The review incorporated multiple environmental contexts (residential settings, care institutions, research laboratories, and community spaces), reflecting the varied circumstances where elderly individuals encounter robotic technologies. Empirical investigations across all methodological approaches received consideration, encompassing quantitative, qualitative, mixed methods designs, and individual case studies. Case studies were included because they provide rich details about personal experiences, offering value when investigating emotional and psychological responses like fear within emerging technological domains. The temporal scope encompassed publications from January 2014 through March 2025 to maintain contemporary relevance. This timeframe encompasses recent developments in social robotics while excluding obsolete technologies. Resource constraints limited the search to English-language publications, potentially omitting relevant research published in other languages. Table 1 presents a comprehensive overview of inclusion and exclusion criteria.


Table 1. Eligibility criteria used to determine study selection for this scoping review.

3.3 Search strategy

Developing an effective search strategy requires balancing sensitivity with specificity to capture all relevant studies while avoiding irrelevant results. Database selection aimed at comprehensive coverage across relevant disciplines. PubMed provided access to biomedical and medical literature, while IEEE Xplore captured engineering and technology perspectives. The ACM Digital Library covers virtually every aspect of computing and information technology, including Human-Robot Interaction (HRI); PsycINFO offers psychological research; Web of Science provides multidisciplinary coverage; and Scopus was selected for its broad multidisciplinary scope, enabling the retrieval of peer-reviewed articles, conference papers, and gray literature across health, engineering, and social sciences. Additional studies were identified by hand-searching the reference lists of included studies and relevant review articles. The last search was conducted on 15 March 2025. The search term development process involved identifying three primary concept domains: aging population terminology, robotic technology descriptors, and psychological response indicators. Within each domain, multiple synonyms and related terms were identified through preliminary searches, consultation with subject matter experts, and examination of key papers in the field. For the aging population domain, terms included “older adult,” “elderly,” “senior,” “aged,” “geriatric,” and “aging population.” The robotics domain encompassed “robot,” “humanoid,” “social robot,” “assistive robot,” “companion robot,” “socially assistive robot,” and “human-robot interaction.” The psychological response domain included “fear,” “anxiety,” “discomfort,” “negative emotion,” “acceptance,” “rejection,” “attitude,” “perception,” “uncanny valley,” and “technophobia.” The full list of search terms, including British and American spelling variations, is provided in Table 2.


Table 2. Keywords used for the database search.

Boolean operators (AND, OR) were employed to combine terms within and across concept domains, adapting the strategy for each database while maintaining conceptual consistency. Truncation and wildcard symbols facilitated the capture of variations in terminology. Beyond database searching, the investigation included hand-searching the reference lists of included studies and relevant review articles, as well as conducting citation tracking for key papers to identify more recent work. The detailed search strategy, including full Boolean strings and field specifications for each electronic database, is documented in Supplementary Appendix A.
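To make the query construction concrete, the following minimal sketch assembles a generic Boolean string from abridged versions of the three concept domains; the term lists and the plain-text syntax are illustrative simplifications, and the authoritative database-specific strings (with field tags and truncation symbols) remain those in Supplementary Appendix A.

```python
# Illustrative assembly of a Boolean search string from the three concept
# domains. Term lists are abridged; the database-specific strings with field
# tags and truncation are documented in Supplementary Appendix A.

aging_terms = ['"older adult*"', "elderly", "senior*", "aged", "geriatric*",
               '"aging population"']
robot_terms = ["robot*", "humanoid*", '"social robot*"', '"assistive robot*"',
               '"companion robot*"', '"socially assistive robot*"',
               '"human-robot interaction"']
response_terms = ["fear*", "anxiet*", "discomfort", '"negative emotion*"',
                  "acceptance", "rejection", "attitude*", "perception*",
                  '"uncanny valley"', "technophobia"]

def or_block(terms):
    """Join synonyms within one concept domain with OR."""
    return "(" + " OR ".join(terms) + ")"

# Domains are combined with AND so that every record must match all three concepts.
query = " AND ".join(or_block(domain) for domain in (aging_terms, robot_terms, response_terms))
print(query)
```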

3.4 Selection process

Studies published from January 2014 to March 2025 were considered, aligning with the period of significant advancement in social and humanoid robotics relevant to eldercare (Goeldner et al., 2015). The review adhered to PRISMA-ScR guidelines (Tricco et al., 2018) to ensure methodological rigor and transparency. After duplicate removal in RefWorks, two reviewers independently screened titles and abstracts in Rayyan (Ouzzani et al., 2016) using predefined inclusion/exclusion criteria. Full texts of potentially eligible articles were then reviewed by both reviewers. Discrepancies were resolved by discussion or, when necessary, by a third reviewer. Cohen’s kappa coefficients indicated substantial agreement at both screening stages (title/abstract κ = 0.78; full text κ = 0.85). The database searches yielded a total of 4,083 records: PubMed (n = 123), IEEE Xplore (n = 418), ACM Digital Library (n = 620), PsycINFO (n = 4), Scopus (n = 2,346), and Web of Science (n = 572). An additional 12 studies were identified by hand-searching the reference lists of included articles, ensuring comprehensive coverage. Following duplicate removal and application of eligibility criteria, 49 studies were included in the final synthesis. The full selection process is illustrated in Figure 1 (PRISMA-ScR flow diagram).
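As an illustration of the agreement statistics reported above, the minimal sketch below computes Cohen’s kappa for two hypothetical reviewer decision vectors using scikit-learn; the data shown are placeholders rather than the review’s actual screening records.

```python
# Illustrative computation of Cohen's kappa for two reviewers' screening
# decisions (1 = include, 0 = exclude). The vectors are hypothetical
# placeholders, not the review's actual screening data.
from sklearn.metrics import cohen_kappa_score

reviewer_a = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
reviewer_b = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]

kappa = cohen_kappa_score(reviewer_a, reviewer_b)
print(f"Cohen's kappa: {kappa:.2f}")  # values of roughly 0.61-0.80 are conventionally read as substantial agreement
```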


Figure 1. PRISMA-ScR (Tricco et al., 2018) flow diagram showing the study selection process with detailed exclusion reasons at each stage.

While this time frame ensures contemporary relevance, the exclusion of studies published before 2014 may omit foundational theoretical work in human-robot interaction. This limitation is further addressed in the Discussion section.

3.5 Data charting process

A standardized data extraction form was developed and piloted with five randomly selected studies: (Baisch et al., 2018; Chen and Song, 2019; Nault et al., 2024; Robinson et al., 2014; Ostrowski et al., 2019). These studies were selected to represent diversity in study design, robot type, and outcome measures. The pilot process revealed the need for additional fields related to fear assessment methods and mitigation strategies, leading to the refinement of the extraction form. Further, data extraction was then performed comprehensively across all included studies using the finalized form to ensure methodological rigor, reliability, and clarity. For each study, two reviewers independently extracted data, focusing on all elements critical to subsequent analysis and synthesis. Discrepancies were resolved by discussion and consensus.

The core elements of the extracted data are summarized in Table 3, and the complete extraction form is provided as Supplementary Material 2. Consistent application of the standardized extraction process was maintained throughout the review.


Table 3. Data extraction elements and descriptions.

The full synthesis of extracted studies, including methodological approaches, key findings, and fear mitigation interventions, is provided in Supplementary Appendix Table 6. This comprehensive overview supports the evidence base on fear of robots among older adults interacting with humanoid robots.

3.6 Quality assessment: evaluating diverse study designs

The assessment of research rigor across incorporated investigations encountered distinctive obstacles stemming from varied experimental approaches. The Mixed Methods Appraisal Tool (MMAT), 2018 edition (Hong et al., 2018) served as the evaluation instrument, offering systematic examination capabilities for quantitative, qualitative, and combined methodological frameworks through a cohesive structure. Two independent assessors examined each investigation against design-appropriate MMAT standards, with consensus achieved through collaborative discussion for any initial disparities. The MMAT generates proportional scores (0%–100%) that reflect adherence to established quality benchmarks, allowing for cross-design comparisons. Among the 49 incorporated investigations, 11 (22.4%) were rated as high quality (80%–100%), 34 (69.4%) were assessed as moderate quality (60%–79%), and 4 (8.2%) were found to have lower methodological rigor (below 60%). These results are visualized in Figure 2, highlighting the predominance of moderate-quality studies, a reflection of both the emerging nature of the field and the practical difficulties associated with conducting rigorous empirical research in human-robot interaction involving older adults. The comprehensive methodological evaluation outcomes for all investigations are presented in Supplementary Material 3. Moreover, while all studies were retained in the synthesis regardless of quality rating, greater interpretative weight was assigned to findings from higher-quality studies. This approach ensures that conclusions are grounded in robust evidence while preserving a comprehensive view of the available literature.


Figure 2. Distribution of the included studies by methodological quality, assessed using the Mixed Methods Appraisal Tool (MMAT). Most studies were rated as moderate quality, with fewer achieving high or low ratings.
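For readers who want a concrete view of the scoring scheme, the short sketch below bins hypothetical MMAT percentage scores into the three quality tiers used in this review; the scores are placeholders, and the actual per-study ratings are reported in Supplementary Material 3.

```python
# Illustrative binning of MMAT scores (0-100%) into the quality tiers used in
# this review. Scores below are hypothetical placeholders, not study ratings.
from collections import Counter

def quality_tier(score: float) -> str:
    if score >= 80:
        return "high (80-100%)"
    if score >= 60:
        return "moderate (60-79%)"
    return "low (<60%)"

mmat_scores = [85, 70, 65, 90, 55, 75, 80, 60, 72, 68]  # hypothetical percentages
tier_counts = Counter(quality_tier(s) for s in mmat_scores)
for tier, count in tier_counts.items():
    print(f"{tier}: {count} studies ({100 * count / len(mmat_scores):.1f}%)")
```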

3.7 Synthesis of results

The methodological heterogeneity across incorporated investigations rendered meta-analytical approaches impractical. Consequently, descriptive synthesis integrated with thematic examination was implemented. Data management and analysis utilized NVivo 14.0, incorporating both inductive and deductive coding methodologies. Deductive themes were drawn from established theoretical foundations, including the Uncanny Valley hypothesis (Mori, 1970), technology acceptance frameworks (Davis et al., 1989; Venkatesh et al., 2003), and psychological models of trust and anxiety in human-machine interaction (Nomura et al., 2006). Simultaneously, the coding protocol accommodated emergent, data-derived themes. Two investigators independently reviewed and coded the studies, convening regularly to address discrepancies and refine the developing thematic architecture. This cyclical process ensured analytical consistency while capturing both anticipated and unanticipated insights.

The final synthesis matrix (Figure 3) was developed through iterative thematic coding and frequency analysis, capturing the association between specific fear types and mitigation strategies across the included studies. This heatmap visually conveys the strength of these associations, highlighting patterns of co-occurrence within the dataset. Dark cells indicate a higher number of studies reporting the linkage between a given fear and the corresponding mitigation approach. The matrix emerged through successive analytical phases, commencing with a theory-informed structure and evolving in response to observed data patterns. To ensure transparency, each numerical value shown in Figure 3 is mapped to the exact study references in Supplementary Appendix C. This model illustrates the interconnections among fear categories, influencing variables, mitigation approaches, and outcomes pertinent to the research objectives. To enhance synthesis credibility, the methodological quality of each study was assessed using the Mixed Methods Appraisal Tool (MMAT, version 2018) (Hong et al., 2018); interpretation of findings prioritized studies rated as high or adequate quality, strengthening the reliability of the synthesized themes.


Figure 3. Heatmap showing how different fear types align with mitigation strategies across 49 studies. Darker shades indicate stronger evidence of association. Numbers indicate the count of studies (see Supplementary Appendix C for full mapping of each cell to specific studies). Example: The “6” in the Uncanny Valley × Personalization cell corresponds to Appel et al. (2019), Berns and Ashok (2024), Mishra et al. (2022), Strutz et al. (2024), Dosso et al. (2023), and Yam et al. (2023).

The synthesis process was guided by research questions, with particular attention to ensuring that findings related to fear factors (RQ2) and acceptance relationships (RQ3) were analyzed and presented with the same depth and rigor as those related to fear types (RQ1). Multiple analytical approaches were employed to address each research question comprehensively.
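To illustrate how a matrix of the kind shown in Figure 3 can be tallied from coded studies, the minimal sketch below builds a small fear-type by mitigation-strategy count table; the study names, codes, and category labels are illustrative assumptions, and the authoritative cell-by-cell mapping remains Supplementary Appendix C.

```python
# Illustrative tally of a fear-type x mitigation-strategy co-occurrence matrix
# of the kind visualized in Figure 3. Study names and codes are placeholders;
# the authoritative cell-by-cell mapping is Supplementary Appendix C.
import pandas as pd

coded_studies = {
    "Study A": {"fears": ["Uncanny Valley"], "mitigations": ["Personalization"]},
    "Study B": {"fears": ["Trust issues", "Privacy"],
                "mitigations": ["User education", "Transparency"]},
    "Study C": {"fears": ["Uncanny Valley", "Privacy"],
                "mitigations": ["Personalization", "Transparency"]},
}

fear_types = ["Uncanny Valley", "Privacy", "Trust issues"]
strategies = ["Personalization", "User education", "Transparency"]

matrix = pd.DataFrame(0, index=fear_types, columns=strategies)
for codes in coded_studies.values():
    for fear in codes["fears"]:
        for strategy in codes["mitigations"]:
            matrix.loc[fear, strategy] += 1  # number of studies linking this pair

print(matrix)  # this count table is what a heatmap such as Figure 3 renders
```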

4 Results

4.1 Study selection and characteristics

The systematic search yielded 4,095 records across six databases and supplementary sources. During pre-screening, 852 records were excluded: 444 as duplicates, 379 via automated filters, and 29 for incomplete metadata or non-research formats. The remaining 3,243 records underwent title and abstract screening, leading to the exclusion of 2,391 articles based on the following criteria: non-technological focus (n = 530), irrelevant populations (n = 260), unsuitable study types (n = 370), outcomes unrelated to fear or acceptance (n = 210), language or access limitations (n = 142), out-of-range publication years (n = 77), non-peer-reviewed content (e.g., editorials, abstracts; n = 510), and insufficient methodological information (n = 292). Of the 852 full texts sought, 51 were unavailable or deemed out of scope, leaving 801 for detailed review. A further 752 were excluded for reasons including: absence of fear, anxiety, or acceptance focus (n = 305), lack of emphasis on social or humanoid robots (n = 126), exclusion of older adults (65+) as a study population (n = 94), insufficient methodological clarity (n = 68), ineligible design (n = 56), non-empirical format (n = 40), language/inaccessibility issues (n = 32), duplication (n = 15), and withdrawal or retraction (n = 16). In total, 49 studies met all inclusion criteria and were included in the final synthesis (Figure 1).
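The stepwise counts above can be reconciled with simple arithmetic; the short sketch below reproduces the flow using the numbers exactly as stated in the text.

```python
# Arithmetic check of the selection flow, using the counts as stated in the text.
identified = 4095
pre_screen_excluded = 444 + 379 + 29                 # duplicates + automated filters + incomplete records
screened = identified - pre_screen_excluded          # 3,243 records for title/abstract screening

title_abstract_excluded = 530 + 260 + 370 + 210 + 142 + 77 + 510 + 292   # 2,391 exclusions
full_texts_sought = screened - title_abstract_excluded                   # 852 full texts
assessed_in_detail = full_texts_sought - 51                              # 801 (51 unavailable or out of scope)
included = assessed_in_detail - 752                                      # 49 studies in the final synthesis

print(screened, full_texts_sought, assessed_in_detail, included)         # 3243 852 801 49
```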

4.2 Publication years and trends

The temporal distribution of included studies (Figure 4) highlights a progressive increase in research activity addressing fear-related responses among older adults toward social robots between January 2014 and March 2025. This upward trajectory indicates sustained academic interest in understanding the emotional and psychological dimensions of human-robot interaction in aging populations. Notably, the years 2021 and 2024 recorded the highest volume of publications, with nine studies each, underscoring a surge in empirical focus during these periods. The trend reflects growing recognition within the research community that fear represents a substantive barrier to the adoption of robotic technologies in eldercare contexts. As such, the data point toward an urgent need for age-sensitive design approaches and targeted intervention strategies that address emotional safety and user trust.


Figure 4. Temporal distribution of the 49 included studies (2014–March 2025), with publication peaks observed in 2021 and 2024, indicating rising scholarly attention to fear of robots in older adults during human-robot interaction.

4.3 Sample characteristics

The 49 included studies involved a total of 6,670 older adult participants, with individual study sizes ranging from 12 to 384 (mean = 136; median = 67). Figure 5 shows the age distribution of participants. The largest proportion was 85 years and older (24.5%), followed by 65–69 years (20.8%), 70–74 years (18.9%), 75–79 years (18.6%), and 80–84 years (17.1%). This pattern reflects the field’s growing focus on very old adults, who represent the population most likely to interact with assistive and socially engaging robotic technologies. By including a substantial number of participants aged 85 years and above, recent studies capture the experiences of individuals who are both most in need of and most sensitive to the design, emotional safety, and usability of social robots.

Figure 5 (bar chart): participants per age bin: 65–69 years, 493 (20.8%); 70–74 years, 447 (18.9%); 75–79 years, 441 (18.6%); 80–84 years, 405 (17.1%); 85+ years, 579 (24.5%).

Figure 5. Age Distribution of Participants in 49 Included Studies (n = 6,670). The largest proportion of participants were 85+ years (24.5%), reflecting a focus on very old adults in studies of human-robot interaction.

4.4 Methodological approaches

Reviewed studies demonstrated considerable methodological variation. As illustrated in Figure 6, mixed-method designs dominated the literature, comprising 51.0% (25 studies). Qualitative approaches ranked second at 18.4% (9 studies), while exclusively quantitative methods constituted 8.2% (4 studies). Experimental designs accounted for 6.1% (3 studies), with additional methodologies including comparative analyses, cross-sectional surveys, and narrative reviews, each representing roughly 2.0%. This methodological heterogeneity reflects the intricate challenges involved in examining emotional responses among older adults during robotic interactions.


Figure 6. Distribution of methodological approaches across the included studies, showing the frequency of quantitative, qualitative, mixed-methods, and other designs used to explore fear in older adults’ interactions with robots.

4.5 Robotic platforms and assessment tools

The robotic systems examined in our analysis demonstrated considerable heterogeneity. The most frequently investigated platforms were Pepper (n = 2, 4.1%), Jibo (n = 2, 4.1%), PARO (n = 2, 4.1%), NAO (n = 2, 4.1%), and Kompaï (n = 2, 4.1%). Such diversity reflects both the dynamic development within social robotics research and an absence of consistent methodological standards across contemporary investigations. Prior work highlights that differences in platform type can also shape perceived competence, engagement, and comfort in human-robot interaction (Görer et al., 2017; Harrison, 2015; Spatola et al., 2021). Furthermore, researchers employed varied approaches when assessing fear and anxiety responses, incorporating established psychometric instruments alongside purpose-built questionnaires and behavioral observation techniques.

4.6 Functional classification of robotic systems

The robotic technologies examined across the literature were categorized into five functional groups according to their predominant roles in geriatric care environments. Robots designed for assistance (7 studies, 14.3%), such as RAMCIP and Robot-Era platforms, were developed to help elderly individuals with routine activities, including movement support, medication scheduling, and personal hygiene tasks. Those serving therapeutic purposes (5 studies, 10.2%), notably Paro and Telenoid systems, concentrate on delivering emotional support and cognitive enhancement or facilitating physical recovery through purposefully designed interactive experiences. Platforms focused on social engagement (16 studies, 32.7%), including Pepper and NAO units, were primarily created to offer companionship while mitigating isolation and encouraging interpersonal connections among aging populations. Communication and surveillance systems (8 studies, 16.3%), exemplified by the Giraff platform, enable distant correspondence, medical assistance, and environmental observation, establishing essential links between seniors and both healthcare providers and relatives across geographic distances. Finally, integrated systems (13 studies, 26.5%) combine multiple functions, incorporating companion services, practical assistance, and supervisory capabilities for deployment in private residences or institutional care facilities. This category encompasses devices such as RobuLAB 10, LOVOT, and Ubtech Alpha Mini. The distribution patterns shown in Figure 7 reveal that contemporary research demonstrates marked interest in robots capable of social engagement and those offering combined functionalities.


Figure 7. Functional classification of robots used in the 49 included studies, with socially interactive and multi-purpose robots being the most frequently investigated types.

4.7 Fear assessment tools

There was notable variation in how the included studies assessed fear and emotional responses during human-robot interaction. The most frequently employed tool was the Likert scale (Likert, 1932), which was used in eight studies to quantify attitudes, comfort, and perceptions of robotic systems. Both the Negative Attitudes Toward Robots Scale (NARS) (Nomura, T. et al., 2005) and behavioral observation methods, such as video-based coding or documented reactions, were utilized in five studies. The Almere Model for technology acceptance (Heerink et al., 2010) was reported in two studies. Other approaches, including open-ended interviews and thematic analysis, contributed further qualitative insight. However, no standardized anxiety scale, such as the State-Trait Anxiety Inventory (Spielberger and Gorsuch, 1983), was identified in this sample. This diversity in assessment strategies highlights the field’s reliance on both structured quantitative instruments and behavioral or narrative methods to capture the range of fear and acceptance responses among older adults. Figure 8 summarizes the distribution of the main assessment tools used across all studies.


Figure 8. Frequency of fear assessment tools used across the 49 included studies, showing the predominance of standardized scales such as NARS and the Almere Model.

4.8 Thematic analysis: origins and types of fear

The thematic analysis in this scoping review aimed to identify, categorize, and contextualize fear-related responses of older adults interacting with robots. A detailed thematic analysis was conducted across the full texts of the 49 included studies to systematically investigate the origins and expressions of fear experienced by older adults during interactions with social robots. Each study was independently coded by two researchers, with discrepancies resolved through consensus discussions. NVivo tools were used to conduct matrix coding queries and to visualize frequency distributions and the co-occurrence of fear types, robot categories, and participant variables. Employing NVivo 14.0, a rigorous, multi-stage approach was implemented that blended inductive and deductive logic, allowing both emergent and theory-driven themes to be captured and analyzed systematically. This dual approach allowed overt and subtle indicators of fear to be identified, ranging from explicit anxiety or avoidance to less immediately visible concerns, such as privacy, ethical discomfort, or feelings of dependence.

Thematic coding began with a comprehensive word frequency analysis, focusing on qualitative data from all studies (Supplementary Material 4: Word Cloud illustrations). After standard preprocessing (stop word removal, stemming, and phrase grouping), common terms such as “fear,” “privacy,” “trust,” and “robot” emerged as highly salient. However, to move beyond mere frequency counts, themes were organized into seven principal categories based on both coding cycles and co-occurrence across robot types and user age groups. These emergent themes, summarized in Table 4, encompassed the Uncanny Valley phenomenon, privacy and autonomy concerns, trust and reliability issues, dependence-related fears, emotional and ethical discomfort, usability obstacles, and insufficient prior technological exposure.

The prevalence of these themes fluctuated according to both robotic morphology and participant demographics. For instance, participants aged 76–85 most often reported Uncanny Valley phenomena: discomfort or aversion elicited by lifelike yet subtly artificial humanoid platforms, such as NAO and Pepper. Conversely, privacy and autonomy concerns predominated among participants aged 81 and above, particularly during interactions with surveillance and remote presence systems, where users frequently expressed anxiety regarding observation and diminished personal control. Significantly, the youngest cohort (65–70 years) demonstrated a greater likelihood of experiencing fear due to technological unfamiliarity, though this was often ameliorated through structured exposure and supportive introduction protocols.
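As a simplified, NVivo-independent illustration of the preprocessing step described above, the sketch below counts stemmed, stop-word-filtered terms in a short passage; the sample text, stop-word list, and crude suffix-stripping rule are illustrative assumptions rather than the review’s actual pipeline.

```python
# Illustrative word-frequency pass with simple preprocessing (stop-word removal
# and crude suffix stripping), analogous to the NVivo step described above.
# The text, stop-word list, and stemming rule are placeholders.
import re
from collections import Counter

text = ("Participants reported fears about privacy and trust; several feared "
        "the robots would monitor their private routines and erode their trust.")

stop_words = {"the", "and", "about", "their", "would", "several"}

def stem(word: str) -> str:
    """Very crude stemmer: strip a few common English suffixes."""
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

tokens = re.findall(r"[a-z']+", text.lower())
counts = Counter(stem(t) for t in tokens if t not in stop_words)
print(counts.most_common(5))
```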


Table 4. Themes of fear identified across 49 studies. Percentages are relative to the total number of included studies. Representative robot types, cohorts, and key fear characteristics are shown.

Eight different approaches to reducing fear were identified across the reviewed studies. Some of these were documented repeatedly, while others appeared only once or twice. The strategies most often described were participatory or co-design methods (n = 8), the use of emotional expressions such as affective speech or gestures (n = 6), and features that emphasized transparency and privacy (n = 6). Other techniques were far less common: gradual exposure protocols (n = 2), adaptive interface adjustments (n = 2), cultural tailoring (n = 1), personalization (n = 1), and context-responsive interactions (n = 3). The uneven distribution of these practices shows that current work is still exploratory, with little replication and limited consensus on best practice. In addition, the distribution of strategies was closely tied to the function of the robots themselves and the situations in which they were introduced. Social engagement robots were most often linked with participatory design and emotionally supportive interactions, which fit their role in companionship and social contact. Assistance-oriented and therapeutic robots showed more moderate use of emotional regulation and gradual exposure, reflecting their deployment in supportive or rehabilitative contexts. In contrast, remote presence and integrated multi-function robots were largely associated with transparency and privacy-related measures, along with some participatory elements. Taken together, the evidence suggests that mitigation strategies are applied across all categories of robots, but with considerable variation. For transparency, Figures 9 and 10 show only the counts, while the detailed study-by-study mapping is provided in Supplementary Appendices D and E.


Figure 9. Heatmap showing the number of studies linking robot categories to fear-reduction strategy domains. Numbers indicate the count of studies; see Supplementary Appendix D for the full mapping. Example: The “8” for Social Engagement Robots × User-Centered Design corresponds to Carros et al. (2020), Søraa et al. (2022), Ostrowski et al. (2019), Ostrowski et al. (2024), Strutz et al. (2024), Nault et al. (2024), Zafrani et al. (2023), and Yam et al. (2023).


Figure 10. Heatmap summarizing how mitigation strategies are distributed across different functional categories of robots. The numbers indicate how many studies reported each link. Full reference lists for the studies represented in each cell are provided in Supplementary Appendix E. For example, the value “7” in the cell for Integrated Multi-function Robots × User-Centered Design corresponds to Rigaud et al. (2024), Zsiga et al. (2018), Carros et al. (2020), Ostrowski et al. (2024), Nault et al. (2024), Olatunji et al. (2025), and Gasteiger et al. (2025).

Figure 10 provides an overview of how mitigation strategies are distributed across different categories of robots used in eldercare. Five main groups are represented—assistance-oriented, therapeutic, socially interactive, remote presence, and integrated multi-function platforms—set against four domains of intervention: emotional regulation, participatory or user-centered design, privacy and autonomy safeguards, and context-sensitive interaction. Patterns varied across robot types. Social and therapeutic robots were often associated with discomfort linked to human-like appearance and emotional unease. In these cases, design choices that emphasized user involvement and emotionally supportive interaction were the most frequently reported strategies. By contrast, concerns over surveillance, loss of control, and data handling were more often raised in relation to remote presence and multifunctional systems, where transparency and explicit user control measures were seen as central. Assistance-oriented devices drew on a combination of participatory design, simplified interfaces, and privacy safeguards to address similar issues. In addition, the distribution of strategies also differed by user group. Older participants expressed stronger reactions to uncanny valley effects and emotional discomfort, whereas younger and more technologically familiar cohorts showed lower levels of fear and engaged more readily with the devices. As the heatmap indicates, socially interactive and therapeutic robots were more frequently linked with user-centered and emotional regulation approaches, while remote and assistive systems tended to emphasize privacy protections and usability. These differences underline the importance of tailoring fear-reduction measures not only for the functional purpose of the robot but also to the characteristics and expectations of the people interacting with it. Full details of the study mappings that underpin these patterns are available in Supplementary Appendix E.

5 Discussion

This scoping review highlights the complex and multidimensional nature of fear of robots among older adults interacting with social robotic systems. As populations age globally, understanding and mitigating these emotional responses is critical to the responsible integration of robotic technologies in geriatric care. The discussion situates the findings within key theoretical frameworks, including the Uncanny Valley Hypothesis (Mori et al., 2012) and the Technology Acceptance Model (TAM), and examines emerging patterns across demographic, cultural, and robot design factors. Table 5 summarizes the study’s three guiding research questions (RQs), the main thematic findings, illustrative insights drawn from each theme, and remaining gaps identified in the literature.


Table 5. Central research questions (RQs) and corresponding thematic results.

5.1 Types of fear in human-robot interaction (RQ1)

Older adults’ fear of robots during interactions with social robots typically falls into four primary categories: anticipatory anxiety, uncanny valley effects, perceived loss of autonomy, and functional distrust. These categories collectively shape both emotional and behavioral reactions in human-robot encounters. Anticipatory fear stems from uncertainty about the robot’s intentions or next actions. For example, Lima et al. (2022) reported that older users expressed anxiety when robots acted unpredictably or failed to communicate with a clear intent. Uncanny valley reactions, based on the well-established framework by Mori (1970) and later expanded by MacDorman and Minato (1970) and Mori et al. (2012), describe discomfort caused by humanoid robots that appear nearly, but not fully, human. Studies such as Mishra et al. (2022) and Tulsulkar et al. (2021) noted that elderly participants reacted negatively to robots exhibiting near-human traits like blinking, gesturing, or artificial voice, which reduced willingness to engage. Comparable findings demonstrate that robot appearance, movement quality, and social presence cues are central to triggering or alleviating fear in older adults (Fraune et al., 2020; Görer et al., 2017; Huang et al., 2024; Spatola et al., 2021; Yuan et al., 2024). Concerns around autonomy and privacy were particularly salient in healthcare contexts: Søraa et al. (2022) found that anxiety increased when robots collected sensitive information or operated independently. Similarly, Dosso et al. (2023) observed that dependency on robots for essential tasks like medication reminders or mobility support raised fears of emotional distancing and reduced human oversight.

Functional skepticism, or doubts about the robot’s reliability, was another key theme. Ostrowski et al. (2019) highlighted that older adults feared malfunctions or inappropriate responses from robotic caregivers, potentially endangering safety or diminishing human involvement. Despite these consistent observations, a significant methodological limitation persists: while some investigations explicitly measured fear using structured instruments (Appel et al., 2019; MacDorman and Minato, 1970; Mori et al., 2012; Pino et al., 2015), most inferred fear indirectly, utilizing behavioral withdrawal, qualitative indicators, or broader attitude scales such as NARS (Nomura et al., 2006) and the Almere Model (Heerink et al., 2010). Consequently, the absence of a standardized framework for categorizing and measuring fear types in human-machine interaction with elderly populations constrains the capacity to conduct comparative analyses or develop targeted interventions.

5.2 Origins of fear: internal and external influences (RQ2)

Fear of social robotic platforms is influenced not solely by the platforms’ physical appearance or behavior but also by deeper psychological and socio-cultural elements. Four primary origins were identified: media influence and fictional narratives, previous adverse technology experiences, social and peer influence, and the generational digital divide (see Table 5). Media narratives and fictional portrayals exert substantial influence on elderly individuals’ perceptions of robotic platforms. Investigations by Bevilacqua et al. (2021) and Liu et al. (2023) determined that many elderly participants referenced dystopian science fiction scenarios, including robotic rebellion, enhanced surveillance, or diminished human connection, demonstrating that these cultural narratives had been internalized. Even when engaging with basic assistive platforms, some participants expressed concerns about monitoring or replacement, obscuring distinctions between imagination and reality. Previous adverse technology experiences also contributed to skepticism and distrust. Fraune et al. (2022) observed that frustration with digital health applications, automated teller machines, or voice assistants fostered general reluctance to trust emerging technologies. Elderly individuals with prior negative experiences using smartphones or similar devices demonstrated a greater likelihood of perceiving robotic platforms as unreliable or emotionally detached, a distrust that often developed before any direct platform interaction.

Social and peer influences also proved important in shaping acceptance or fear. Shih et al. (2023) revealed that elderly participants were more receptive to robotic platforms when friends or caregivers demonstrated positive engagement, while negative social cues could intensify anxiety (Robinson et al., 2013). These findings suggest that robotic fear is often socially constructed rather than merely an individual response. The generational digital divide further intensified apprehensive responses. Investigations (Bevilacqua et al., 2021; Destephe et al., 2015; Gomez-Hernandez, 2024; Shih et al., 2023) indicated that elderly individuals with limited digital literacy found robotic platforms more foreign and intimidating. Conversely, those comfortable with smartphones or tablets demonstrated reduced fear and greater acceptance of robotic platforms, showing that technological familiarity generally diminishes concern. Cultural and demographic elements, along with robotic appearance and interaction style, also influence apprehensive responses (Bevilacqua et al., 2021; Destephe et al., 2015; Gomez-Hernandez, 2024; Shih et al., 2023), but these should be understood as contextual amplifiers rather than fundamental causes. Despite recognition of these elements, most investigations do not distinguish between immediate triggers and deeper sources of fear. A robust conceptual framework is needed to separate proximal (contextual) triggers from underlying (internalized) origins, enabling the development of emotionally intelligent and culturally sensitive robotic platforms for elderly populations.

5.3 Influence of fear on acceptance and utilization (RQ3)

While perceived usefulness and ease of use are foundational to the Technology Acceptance Model (TAM) (Silva, 2015), this review confirms that fear is a primary emotional barrier to both the acceptance and sustained use of social robots by older adults. Unusual robot appearance, anthropomorphic traits, and privacy concerns frequently lead to discomfort, withdrawal, or outright rejection of robotic systems, even when users acknowledge their potential benefits (Patel and Rughani, 2022). Emotional authenticity and perceived surveillance are particularly important for companionship and social interaction robots, with many older adults expressing resistance due to a lack of genuine affect or concerns about being monitored (Pu et al., 2019). Digital literacy further moderates these outcomes: older adults with lower digital confidence are more likely to avoid robot interaction when they feel intimidated by or unfamiliar with the technology (Fraune et al., 2022). In contrast, interventions featuring adaptive robot behaviors, such as friendlier communication, slower movement, or personalized language, have been shown to enhance trust and acceptance (Shih et al., 2023). Søraa et al. (2022) highlight the value of personalizing user interfaces and interaction parameters, especially in healthcare, to reduce anxiety and foster a sense of control. A noteworthy methodological gap remains: few studies measure baseline fear before interaction or track changes over time, leaving the trajectory of fear (whether it diminishes or intensifies with exposure) largely unknown. Although some intervention studies have measured subtle emotional shifts longitudinally (Bradwell, 2021; Dosso et al., 2023), most focus primarily on usability rather than addressing fear as a psychological construct. Nonetheless, consistent evidence shows that familiarization sessions, peer modeling, and pre-exposure orientation can mitigate fear, even for initially reluctant users (Shih et al., 2023). These findings underscore the need to explicitly integrate affective variables, including fear, trust, and emotional safety, into future iterations of the Technology Acceptance Model. Transparent data usage, user control, and emotionally congruent robot behaviors are all essential for fostering acceptance. Design features such as clear privacy policies, manual overrides, and predictable, slow movements can help alleviate concerns about autonomy and surveillance, ultimately supporting both therapeutic engagement and emotional wellbeing during technology adoption.
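
One way to fold such affective variables into a TAM-style analysis, offered here purely as an illustration and not as a model proposed by any of the included studies, is to enter fear and trust alongside the classic constructs in a regression on acceptance. The sketch below uses synthetic data and hypothetical variable names.

```python
# Hedged sketch: extending a TAM-style acceptance analysis with affective
# predictors (fear, trust). All data and effect sizes are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "usefulness": rng.normal(0, 1, n),   # perceived usefulness (standardised)
    "ease_of_use": rng.normal(0, 1, n),  # perceived ease of use
    "fear": rng.normal(0, 1, n),         # self-reported fear of the robot
    "trust": rng.normal(0, 1, n),        # trust in the robot
})

# Simulated acceptance: benefits raise the odds of adoption, fear lowers them.
linear = (0.8 * df["usefulness"] + 0.6 * df["ease_of_use"]
          - 1.0 * df["fear"] + 0.5 * df["trust"])
df["accept"] = rng.binomial(1, 1 / (1 + np.exp(-linear)))

# A significantly negative fear coefficient would flag an emotional barrier
# over and above the classic TAM constructs.
model = smf.logit("accept ~ usefulness + ease_of_use + fear + trust", data=df).fit()
print(model.params)
```

In empirical work the synthetic variables would of course be replaced by validated measures (e.g., NARS subscales for fear), but the structure shows how an affective extension of TAM could be tested rather than merely proposed.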

6 Gaps in literature and future directions

Although research on Human–Robot Interaction (HRI) with older adults has expanded considerably, several unresolved gaps continue to limit progress in understanding fear and its implications for robot acceptance. These gaps can be grouped into three broad areas: longitudinal inquiry, cultural sensitivity, and multimodal methodologies.

Longitudinal needs: Much of the current work on fear in HRI with older adults is based on short trials or one-off encounters. These designs capture immediate impressions but cannot tell us how fear unfolds with repeated exposure. It remains unclear whether initial anxiety fades with familiarity, persists as avoidance, or develops into more complex emotional responses. Reviews of the field consistently note that longitudinal evidence is scarce and that most studies rely on brief, controlled interventions (Bradwell, 2021; Broadbent et al., 2009). To move beyond these snapshots, large-scale projects that follow participants over months or years are needed. Long-term studies in real-world care environments such as nursing homes, assisted living facilities, and private households would help clarify whether and how older adults adapt to robots in everyday life. Without this evidence, our picture of how fear develops or recedes over time remains incomplete.
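
As a purely illustrative sketch of how such repeated-measures data could be analysed, assuming weekly self-reports of fear on a 1-5 scale and hypothetical variable names, a random-intercept mixed model can estimate whether fear, on average, fades or persists across encounters.

```python
# Hedged sketch: average trajectory of fear across repeated robot encounters,
# estimated with a random-intercept mixed model (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for participant in range(40):
    baseline = rng.normal(3.5, 0.6)    # individual starting fear (1-5 scale)
    slope = rng.normal(-0.08, 0.04)    # most participants habituate slowly
    for week in range(8):
        rows.append({
            "participant": participant,
            "week": week,
            "fear": baseline + slope * week + rng.normal(0, 0.2),
        })
df = pd.DataFrame(rows)

# Random intercept per participant; the fixed 'week' effect estimates whether
# fear, on average, diminishes or intensifies with exposure.
model = smf.mixedlm("fear ~ week", df, groups=df["participant"]).fit()
print(model.summary())
```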

Cultural dimensions: Fear of robots is not uniform across cultural contexts. While studies from East Asia often report relatively positive responses and fewer concerns about autonomy (Yam et al., 2023; Zafrani and Nimrod, 2018), work from Europe and North America highlights anxieties about privacy, surveillance, and reduced personal agency (Coco et al., 2018; Rantanen et al., 2018). Yet systematic cross-cultural comparisons remain rare. Rather than assuming a universal emotional trajectory, future research should investigate how values, norms, and expectations shape fear-related reactions. This raises the question of whether robots should be designed with culturally specific features or whether universal design frameworks can be adapted through modular personalization. Linking Table 4 with cultural contexts would help clarify which fear categories are more salient in different regions, thereby guiding culturally responsive robot design.

Methodological and multimodal considerations: Another weakness in the current work is methodological. Heavy reliance on cross-sectional surveys and self-report scales risks underestimating implicit or nuanced forms of fear, especially in populations with cognitive decline. Few studies include older adults with moderate-to-severe dementia, despite the frequent use of robots in dementia care (Baisch et al., 2017; Dosso et al., 2023). Multimodal approaches that integrate physiological markers (such as galvanic skin response and heart rate variability) with eye-tracking, behavioral observation, and interviews would capture both overt reactions and subtle affective states. Mixed-methods designs combining these measures with qualitative accounts can uncover how fear is experienced, narrated, and expressed in different settings (Zafrani et al., 2023). Importantly, most existing studies have been conducted in controlled laboratory environments. Longitudinal ethnographic research in naturalistic care settings would provide richer insights into the ways fear manifests in everyday interactions.
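
To illustrate the kind of physiological indices such multimodal designs might combine, the sketch below computes RMSSD (a standard heart rate variability measure) and a baseline-corrected skin conductance level from synthetic signals; the data, sampling assumptions, and function names are hypothetical.

```python
# Hedged sketch: two simple physiological indices often used alongside
# self-report in multimodal affect research. All signals here are synthetic.
import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals (ms)."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

def scl_change(skin_conductance_us, baseline_samples):
    """Mean skin conductance during interaction minus the resting baseline."""
    sc = np.asarray(skin_conductance_us, dtype=float)
    return float(sc[baseline_samples:].mean() - sc[:baseline_samples].mean())

rng = np.random.default_rng(2)
rr = 800 + rng.normal(0, 25, 120)                  # RR intervals (ms)
sc = np.concatenate([rng.normal(2.0, 0.05, 50),    # resting baseline
                     rng.normal(2.6, 0.05, 150)])  # robot encounter
print(f"RMSSD: {rmssd(rr):.1f} ms, SCL change: {scl_change(sc, 50):.2f} microsiemens")
```

Indices of this kind would complement, not replace, the qualitative and observational accounts emphasized above, particularly for participants who cannot complete standard questionnaires.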

Emerging tools: Virtual reality (VR) offers a promising avenue for advancing fear research in HRI. Controlled simulations allow researchers to vary robot appearance, behaviors, and potential malfunctions without exposing participants to physical risks. This is especially useful for investigating phenomena such as the Uncanny Valley or responses to unexpected breakdowns. VR can also support iterative prototyping before robots are physically deployed. However, its use in older populations requires caution, as VR headsets may induce discomfort or fail to replicate the complexity of real-world interaction.

Stratification and diversity: Fear in HRI is not monolithic; it varies across age brackets, cognitive status, and prior experience. Early evidence suggests that younger cohorts of older adults (65–74) often express anxiety linked to unfamiliarity, whereas those over 75 are more likely to highlight privacy or autonomy concerns (Baisch et al., 2017; Bradwell, 2021). Stratified analyses by age, cognitive condition, and cultural background are essential to develop context-aware, emotionally adaptive robots that address diverse needs.

7 Conclusion

This scoping review examined 49 studies published between 2014 and 2025 on older adults’ experiences of fear when interacting with robots. The findings suggest that fear is expressed in multiple ways, including worries about privacy, trust, dependence, emotional unease, and the Uncanny Valley effect. These responses were shaped by factors such as prior technology use, age, cognitive condition, and cultural context. For instance, participants with greater digital experience tended to report less fear, while studies from Western settings often emphasized privacy and surveillance concerns. Taken together, the evidence provides a broad map of how fear manifests in HRI and where future work should focus.

Limitations: The review has several limitations. Some relevant research may not have been captured, especially studies reported in non-English outlets. The included studies were highly diverse in design and outcome measures, which limited systematic comparison. Few papers involved participants with significant cognitive impairment, leaving questions about this group unanswered. Finally, as a scoping review, no formal grading of study quality was conducted, meaning that the strength of evidence cannot be ranked.

Implications and future directions: Despite these limitations, this review makes three key contributions. It consolidates evidence on the forms and triggers of fear in HRI, it highlights major gaps such as the scarcity of longitudinal and culturally comparative work, and it provides a framework for integrating multimodal methods into future studies. For designers, the results point to the value of transparent, user-informed design that avoids deceptive human-like cues. For care providers, gradual exposure and supportive introduction can help reduce initial anxiety. For policymakers, the findings underscore the need for culturally sensitive guidelines that balance innovation with the emotional wellbeing of older adults. Rather than viewing fear only as an obstacle, it should be treated as a design signal that can inform the development of robots that are transparent, trustworthy, and responsive to the needs of older adults. Confronting these fears directly is essential if robots are to be integrated into eldercare in ways that are both safe and genuinely supportive.

Data availability statement

The original contributions presented in the study are included in the article/Supplementary Material; further inquiries can be directed to the corresponding author.

Author contributions

AE: Conceptualization, Formal Analysis, Investigation, Methodology, Project administration, Visualization, Writing – review and editing, Validation, Writing – original draft. DA-T: Funding acquisition, Resources, Supervision, Writing – review and editing. AO: Resources, Supervision, Validation, Writing – review and editing.

Funding

The author(s) declare that no financial support was received for the research and/or publication of this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declare that no Generative AI was used in the creation of this manuscript.

Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/frobt.2025.1626471/full#supplementary-material

References

Abdi, J., Al-Hindawi, A., Ng, T., and Vizcaychipi, M. P. (2018). Scoping review on the use of socially assistive robot technology in elderly care. BMJ Open 8 (2). doi:10.1136/bmjopen-2017-018815

Antona, M., Ioannidi, D., Foukarakis, M., Gerlowska, J., Rejdak, K., Abdelnour, C., et al. (2019). My robot is happy today, 416–424. doi:10.1145/3316782.3322777

Appel, M., Izydorczyk, D., Weber, S., Mara, M., and Lischetzke, T. (2019). The uncanny of mind in a machine: humanoid robots as tools, agents, and experiencers. Comput. Hum. Behav. 102, 274–286. doi:10.1016/j.chb.2019.07.031

Arksey, H., and O'malley, L. (2005). Scoping studies: towards a methodological framework. Int. J. Soc. Res. Methodol. 8 (1), 19–32. doi:10.1080/1364557032000119616

Backonja, U., Hall, A. K., Painter, I., Kneale, L., Lazar, A., Cakmak, M., et al. (2018). Comfort and attitudes towards robots among young, middle-aged, and older adults: a cross-sectional study. Wiley. doi:10.1111/jnu.12430

Baisch, S., Kolling, T., Schall, A., Rühl, S., Selic, S., Kim, Z., et al. (2017). Acceptance of social robots by elder people: does psychosocial functioning matter? Int. J. Soc. Robotics 9 (2), 293–307. doi:10.1007/s12369-016-0392-5

Baisch, S., Kolling, T., Klein, B., Pantel, J., Oswald, F., and Knopf, M. (2018). Dynamic interplay between general experience and robot-specific expertise at older adults‘ first encounter with a robot: role for robot acceptance and implications for robot design. Gerontechnology 17 (4), 215–231. doi:10.4017/gt.2018.17.4.003.00

Bemelmans, R., Gelderblom, G. J., Jonker, P., and De Witte, L. (2012). Socially assistive robots in elderly care: a systematic review into effects and effectiveness. J. Am. Med. Dir. Assoc. 13 (2), 114–120.e1. doi:10.1016/j.jamda.2010.10.002

Berns, K., and Ashok, A. (2024). You scare me: the effects of humanoid robot appearance, emotion, and interaction skills on uncanny valley phenomenon. Paper presented at the actuators. Actuators 13 (10), 419. doi:10.3390/act13100419

Bevilacqua, R., Felici, E., Cavallo, F., Amabili, G., and Maranesi, E. (2021). Designing acceptable robots for assisting older adults: a pilot study on the willingness to interact. Int. J. Environ. Res. Public Health 18 (20), 10686. doi:10.3390/ijerph182010686

Bradwell, H. (2021). Exploring the design, use and impact of companion pet robots and automata for older adults and people with dementia.

Bradwell, H. L., Winnington, R., Thill, S., and Jones, R. B. (2021). Morphology of socially assistive robots for health and social care: a reflection on 24 months of research with anthropomorphic, zoomorphic and mechanomorphic devices, 376–383. doi:10.1109/ro-man50785.2021.9515446

Broadbent, E., Stafford, R., and MacDonald, B. (2009). Acceptance of healthcare robots for the older population: review and future directions. Int. J. Soc. Robotics 1 (4), 319–330. doi:10.1007/s12369-009-0030-6

Carros, F., Meurer, J., Löffler, D., Unbehaun, D., Matthies, S., Koch, I., et al. (2020). Exploring human-robot interaction with the elderly. Paper presented at the doi:10.1145/3313831.3376402

Cavallo, F., Esposito, R., Limosani, R., Manzi, A., Bevilacqua, R., Felici, E., et al. (2018). Robotic services acceptance in smart environments with older adults: user satisfaction and acceptability study. J. Med. Internet Res. 20 (9), e264. doi:10.2196/jmir.9460

Chen, S., Jones, C., and Moyle, W. (2018). Social robots for depression in older adults: a systematic review. J. Nurs. Scholarsh. 50 (6), 612–622. doi:10.1111/jnu.12423

Chen, L., B., Song, J., and Li, B. (2019). Providing aging adults social robots’ companionship in home-based elder care. J. Healthc. Eng. 2019 (2019), 1–7. doi:10.1155/2019/2726837

Coco, K., Kangasniemi, M., and Rantanen, T. (2018). Care personnel's attitudes and fears toward care robots in elderly care: a comparison of data from the care personnel in Finland and Japan. J. Nurs. Scholarsh. 50 (6), 634–644. doi:10.1111/jnu.12435

Conde, M., Mikhailova, V., and Döring, N. (2024). “I have the Feeling that the Person is Here”: older Adults’ attitudes, usage intentions, and requirements for a telepresence robot. Int. J. Soc. Robotics 16 (7), 1619–1639. doi:10.1007/s12369-024-01143-z

Davis, F. D. (1989). Perceived usefulness, perceived ease of use and user acceptance of information technology. MIS Q. 13, 319. doi:10.2307/249008

Destephe, M., Brandao, M., Kishi, T., Zecca, M., Hashimoto, K., and Takanishi, A. (2015). Walking in the uncanny valley: importance of the attractiveness on the acceptance of a robot as a working partner. Front. Psychol. 6, 204. doi:10.3389/fpsyg.2015.00204

Deutsch, I., Erel, H., Paz, M., Hoffman, G., and Zuckerman, O. (2019). Home robotic devices for older adults: opportunities and concerns. Comput. Hum. Behav. 98, 122–133. doi:10.1016/j.chb.2019.04.002

Dosso, J. A., Kailley, J. N., Guerra, G. K., and Robillard, J. M. (2023). Older adult perspectives on emotion and stigma in social robots. Front. Psychiatry 13, 1051750. doi:10.3389/fpsyt.2022.1051750

Fraune, M. R., Oisted, B. C., Sembrowski, C. E., Gates, K. A., Krupp, M. M., and Šabanović, S. (2020). Effects of robot-human versus robot-robot behavior and entitativity on anthropomorphism and willingness to interact. Comput. Hum. Behav. 105, 106220. doi:10.1016/j.chb.2019.106220

Fraune, M. R., Komatsu, T., Preusse, H. R., Langlois, D. K., Au, R. H., Ling, K., et al. (2022). Socially facilitative robots for older adults to alleviate social isolation: a participatory design workshop approach in the US and Japan. Front. Psychol. 13, 904019. doi:10.3389/fpsyg.2022.904019

Gasteiger, N., Ahn, H. S., Lee, C., Lim, J. Y., Macdonald, B. A., Kim, G. H., et al. (2025). Participatory design, development, and testing of assistive health robots with older adults: an international four-year project. ACM Trans. Human-Robot Interact. 11 (4), 1–19. doi:10.1145/3533726

Giorgi, I., Tirotto, F. A., Hagen, O., Aider, F., Gianni, M., Palomino, M., et al. (2022). Friendly but faulty: a pilot study on the perceived trust of older adults in a social robot. IEEE Access 10, 92084–92096. doi:10.1109/ACCESS.2022.3202942

Goeldner, M., Herstatt, C., and Tietze, F. (2015). The emergence of care robotics—A patent and publication analysis. Technol. Forecast. Soc. Change 92, 115–131. doi:10.1016/j.techfore.2014.09.005

Gomez-Hernandez, M. (2024). Industry visions of technology for older adults: a futures anthropology perspective. J. Aging Stud. 70, 101248. doi:10.1016/j.jaging.2024.101248

Görer, B., Salah, A. A., and Akın, H. L. (2017). An autonomous robotic exercise tutor for elderly people. Aut. Robots 41, 657–678. doi:10.1007/s10514-016-9598-5

Harrison, N. (2015). Caredroids in health care. Lancet 386 (9990), 235–236. doi:10.1016/s0140-6736(15)61267-3

Heerink, M., Kröse, B., Evers, V., and Wielinga, B. (2010). Assessing acceptance of assistive social agent technology by older adults: the almere model. Assess. Accept. Assistive Soc. Agent Technol. by Older Adults Almere Model 2, 361–375. doi:10.1007/s12369-010-0068-5

Hong, Q. N., Fàbregues, S., Bartlett, G., Boardman, F., Cargo, M., Dagenais, P., et al. (2018). The mixed methods appraisal tool (MMAT) version 2018 for information professionals and researchers. Educ. Inf. 34 (4), 285–291. doi:10.3233/efi-180221

Huang, P., Hu, Y., Nechyporenko, N., Kim, D., Talbott, W., and Zhang, J. (2024). EMOTION: expressive motion sequence generation for humanoid robots with in-context learning. doi:10.48550/arxiv.2410.23234

Imtiaz, R., and Khan, A. (2024). Perceptions of humanoid robots in caregiving: a study of skilled nursing home and long term care administrators, 731–737. doi:10.5220/0012465400003657

Johnson, R. B., and Onwuegbuzie, A. J. (2004). Mixed methods research: a research paradigm whose time has come. Educ. Res. 33 (7), 14–26. doi:10.3102/0013189x033007014

Jung, M. M., Van der Leij, L., and Kelders, S. M. (2017). An exploration of the benefits of an animallike robot companion with more advanced touch interaction capabilities for dementia care. Front. ICT 4, 16. doi:10.3389/fict.2017.00016

Koceski, S., and Koceska, N. (2016). Evaluation of an assistive telepresence robot for elderly healthcare. J. Med. Syst. 40 (5), 121. doi:10.1007/s10916-016-0481-x

Laue, C., Cheok, A. D., Portalés, C., and Edirisinghe, C. (2017). Familiar and strange: gender, sex, and love in the uncanny valley. Multimodal Technol. Interact. 1 (1), 2. doi:10.3390/mti1010002

Leung, A. Y., Zhao, I. Y., Lin, S., and Lau, T. K. (2022). Exploring the presence of humanoid social robots at home and capturing human-robot interactions with older adults: experiences from four case studies. Healthcare 11 (1), 39. doi:10.3390/healthcare11010039

Levac, D., Colquhoun, H., and O'brien, K. K. (2010). Scoping studies: advancing the methodology. Implement. Sci. 5, 69. doi:10.1186/1748-5908-5-69

Likert, R. (1932). A technique for the measurement of attitudes. U.M.I. New York, United States: Columbia University Press.

Lima, M. W., Gupta, M., Rodriguez y Baena, F., Barnaghi, P., Sharp, D. J., Vaidyanathan, R., et al. (2022). Conversational affective social robots for ageing and dementia support. IEEE Trans. Cognitive Dev. Syst. 14 (4), 1378–1397. doi:10.1109/TCDS.2021.3115228

Liu, M., Wang, C., and Hu, J. (2023). Older adults’ intention to use voice assistants: usability and emotional needs. Heliyon 9 (11), e21932. doi:10.1016/j.heliyon.2023.e21932

Lubold, N., Walker, E., and Pon-Barry, H. (2016). Effects of voice-adaptation and social dialogue on perceptions of a robotic learning companion. ACM/IEEE Int. Conf. Human-Robot Interact. (HRI), 255–262. Paper presented at the 2016 11th. doi:10.1109/hri.2016.7451760

MacDorman, K. F., and Ishiguro, H. (2006). The uncanny advantage of using androids in cognitive and social science research. Interact. Studies.Social Behav. Commun. Biol. Artif. Syst. 7 (3), 297–337. doi:10.1075/is.7.3.03mac

Macdorman, K. F., and Minato, T. (1970). The uncanny valley. Energy, 7 (4), 33–35.

Miklósi, Á., Korondi, P., Matellán, V., and Gácsi, M. (2017). Ethorobotics: a new approach to human-robot relationship. Front. Psychol. 8, 958. doi:10.3389/fpsyg.2017.00958

Mishra, N., Ramanathan, M., Tulsulkar, G., and Thalmann, N. M. (2022). Uncanny valley for interactive social agents: an experimental study. Virtual Real. and Intelligent Hardw. 4 (5), 393–405. doi:10.1016/j.vrih.2022.08.003

Mori, M. (1970). Bukimi no Tani (the uncanny valley). Energy 7 (4), 33–35.

Mori, M., MacDorman, K. F., and Kageki, N. (2012). The uncanny valley [from the field]. IEEE Robotics and Automation Mag. 19 (2), 98–100. doi:10.1109/mra.2012.2192811

Moyle, W., Jones, C., Murfield, J., Thalib, L., Beattie, E., Shum, D., et al. (2019). Using a therapeutic companion robot for dementia symptoms in long-term care: reflections from a cluster-RCT. Aging and Ment. Health 23 (3), 329–336. doi:10.1080/13607863.2017.1421617

Naneva, S., Sarda Gou, M., Webb, T. L., and Prescott, T. J. (2020). A systematic review of attitudes, anxiety, acceptance, and trust towards social robots. Int. J. Soc. Robotics 12 (6), 1179–1201. doi:10.1007/s12369-020-00659-4

Nault, E., Baillie, L., and Broz, F. (2024). Socially assistive robots and sensory feedback for engaging older adults in cognitive activities. ACM Trans. Human-Robot Interact. 14 (1), 1–26. doi:10.1145/3698241

Nomura, T., Kanda, T., Suzuki, T., and Kato, K. (2005). “Psychology in human-robot communication: an attempt through investigation of negative attitudes and anxiety toward robots,” in Paper presented at the, 35–40. doi:10.1109/roman.2004.1374726

Nomura, T., Suzuki, T., Kanda, T., and Kato, K. (2006). Altered attitudes of people toward robots: investigation through the negative attitudes toward robots scale. Paper presented at the Proceedings of the AAAI-06 workshop on human implications of human-robot interaction, 29–35.

Olatunji, S. A., Shim, J. S., Syed, A., Tsai, Y., Pereira, A. E., Mahajan, H. P., et al. (2025). Robotic support for older adults with cognitive and mobility impairments. Front. Robot. AI 12, 1545733. doi:10.3389/frobt.2025.1545733

Ostrowski, A. K., Dipaola, D., Partridge, E., Park, H. W., and Breazeal, C. (2019). Older adults living with social robots: promoting social connectedness in long-term communities. IEEE Robotics and Automation Mag. 26 (2), 59–70. doi:10.1109/mra.2019.2905234

Ostrowski, A. K., Zhang, J., Breazeal, C., and Park, H. W. (2024). Promising directions for human-robot interactions defined by older adults. Front. Robot. 11, 1289414. doi:10.3389/frobt.2024.1289414

Ouzzani, M., Hammady, H., Fedorowicz, Z., and Elmagarmid, A. (2016). Rayyan—A web and Mobile app for systematic reviews. Syst. Rev. 5 (1), 210. doi:10.1186/s13643-016-0384-4

Papadopoulos, I., Koulouglioti, C., Lazzarino, R., and Ali, S. (2020). Enablers and barriers to the implementation of socially assistive humanoid robots in health and social care: a systematic review. BMJ 10, e033096. doi:10.1136/bmjopen-2019-033096

Park, E., Jung, A., and Lee, K. (2021). The humanoid robot sil-bot in a cognitive training program for community-dwelling elderly people with mild cognitive impairment during the COVID-19 pandemic: a randomized controlled trial. Int. J. Environ. Res. Public Health 18 (15), 8198. doi:10.3390/ijerph18158198

Patel, Y., and Rughani, P. H. (2022). “Can robots harm? A technical survey on robots and their challenges,” in Paper presented at the, 233–238. doi:10.1109/ICRAE56463.2022.10056185

Pino, M., Boulay, M., Jouen, F., and Rigaud, A. S. (2015). “Are we ready for robots that care for us?” Attitudes and opinions of older adults toward socially assistive robots. Front. Aging Neurosci. 7, 141. doi:10.3389/fnagi.2015.00141

Plano Clark, V. L. (2017). Mixed methods research. J. Posit. Psychol. 12 (3), 305–306. doi:10.1080/17439760.2016.1262619

Pu, L., Moyle, W., Jones, C., and Todorovic, M. (2019). The effectiveness of social robots for older adults: a systematic review and meta-analysis of randomized controlled studies. Gerontologist 59 (1), e37–e51. doi:10.1093/geront/gny046

Rantanen, T., Lehto, P., Vuorinen, P., and Coco, K. (2018). The adoption of care robots in home care—A survey on the attitudes of Finnish home care personnel. J. Clin. Nurs. 27 (9-10), 1846–1859. doi:10.1111/jocn.14355

Rigaud, A., Dacunha, S., Harzo, C., Lenoir, H., Sfeir, I., Piccoli, M., et al. (2024). Implementation of socially assistive robots in geriatric care institutions: Healthcare professionals’ perspectives and identification of facilitating factors and barriers. J. Rehabilitation Assistive Technol. Eng. 11, 20556683241284765. doi:10.1177/20556683241284765

Robinson, H., MacDonald, B., Kerse, N., and Broadbent, E. (2013). The psychosocial effects of a companion robot: a randomized controlled trial. J. Am. Med. Dir. Assoc. 14 (9), 661–667. doi:10.1016/j.jamda.2013.02.007

Robinson, H., Macdonald, B., and Broadbent, E. (2014). The role of healthcare robots for older people at home: a review. Int. J. Soc. Robotics 6 (4), 575–591. doi:10.1007/s12369-014-0242-2

Sawik, B., Tobis, S., Baum, E., Suwalska, A., Kropińska, S., Stachnik, K., et al. (2023). Robots for elderly care: review, multi-criteria optimization model and qualitative case study. Healthcare 11 (9), 1286. doi:10.3390/healthcare11091286

Sharkey, A., and Sharkey, N. (2012). Granny and the robots: ethical issues in robot care for the elderly. Ethics Inf. Technol. 14, 27–40. doi:10.1007/s10676-010-9234-6

Shih, M. T., Lee, Y., Huang, C., and Chan, L. (2023). A feeling of déjà vu: the effects of avatar appearance-similarity on persuasiveness in social virtual reality. Proc. ACM Human-Computer Interact. 7, 1–31. doi:10.1145/3610167

Silva, P. (2015). “Davis' technology acceptance model (TAM)(1989),” in Information seeking behavior and technology adoption: theories and trends, 205–219.

Søraa, R. A., Tøndel, G., Kharas, M. W., and Serrano, J. A. (2022). What do older adults want from social robots? A qualitative research approach to human-robot interaction (HRI) studies. Int. J. Soc. Robotics 15 (3), 411–424. doi:10.1007/s12369-022-00914-w

Spatola, N., Kühnlenz, B., and Cheng, G. (2021). Perception and evaluation in human–robot interaction: the human–robot interaction evaluation scale (HRIES)—A multicomponent approach of anthropomorphism. Int. J. Soc. Robotics 13 (7), 1517–1539. doi:10.1007/s12369-020-00667-4

Spielberger, C. D., and Gorsuch, R. L. (1983). Manual for the state-trait anxiety inventory (form Y). Palo Alto, CA: Consulting Psychologists Press.

Stafford, R. Q., Macdonald, B. A., Li, X., and Broadbent, E. (2014). Older people’s prior robot attitudes influence evaluations of a conversational robot. Int. J. Soc. Robotics 6 (2), 281–297. doi:10.1007/s12369-013-0224-9

Strutz, N., Perotti, L., Heimann-Steinert, A., and Klebbe, R. (2024). Older adults’ communication with an interactive humanoid robot. Z. Für Gerontol. Und Geriatr. 57 (5), 371–375. doi:10.1007/s00391-023-02268-y

Sun, E., and Ye, X. (2024). Older and fearing new technologies? The relationship between older adults’ technophobia and subjective age. Aging and Ment. Health 28 (4), 569–576. doi:10.1080/13607863.2023.2241017

Takayanagi, K., Kirita, T., and Shibata, T. (2014). Comparison of verbal and emotional responses of elderly people with mild/moderate dementia and those with severe dementia in responses to seal robot, PARO. Front. Aging Neurosci. 6, 257. doi:10.3389/fnagi.2014.00257

Tay, B., Jung, Y., and Park, T. (2014). When stereotypes meet robots: the double-edge sword of robot gender and personality in human–robot interaction. Comput. Hum. Behav. 38, 75–84. doi:10.1016/j.chb.2014.05.014

Thunberg, S., Arnelid, M., and Ziemke, T. (2022). Older adults’ perception of the furhat robot, 4–12. doi:10.1145/3527188.3561924

Tobis, S., Piasek, J., Cylkowska-Nowak, M., and Suwalska, A. (2022). Robots in eldercare: how does a real-world interaction with the machine influence the perceptions of older people? Sensors 22 (5), 1717. doi:10.3390/s22051717

Torta, E., Werner, F., Johnson, D. O., Juola, J. F., Cuijpers, R. H., Bazzani, M., et al. (2014). Evaluation of a small socially-assistive humanoid robot in intelligent homes for the care of the elderly. J. Intelligent and Robotic Syst. 76, 57–71. doi:10.1007/s10846-013-0019-0

Tricco, A. C., Lillie, E., Zarin, W., O'Brien, K. K., Colquhoun, H., Levac, D., et al. (2018). PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann. Intern. Med. 169 (7), 467–473. doi:10.7326/m18-0850

Tschöpe, N., Reiser, J. E., and Oehl, M. (2017). “Exploring the uncanny valley effect in social robotics,” in Paper presented at the proceedings of the companion of the 2017 ACM/IEEE international conference on human-robot interaction, 307–308.

Tulsulkar, G., Mishra, N., Thalmann, N. M., Lim, H. E., Lee, M. P., and Cheng, S. K. (2021). Can a humanoid social robot stimulate the interactivity of cognitively impaired elderly? A thorough study based on computer vision methods. Vis. Comput. 37 (12), 3019–3038. doi:10.1007/s00371-021-02242-y

Vandemeulebroucke, T., Dierckx De Casterlé, B., Welbergen, L., Massart, M., and Gastmans, C. (2019). The ethics of socially assistive robots in aged care. A focus group study with older adults in flanders, Belgium. Journals Gerontology Ser. B 75 (9), 1996–2007. doi:10.1093/geronb/gbz070

Venkatesh, V., Morris, M. G., Davis, G. B., and Davis, F. D. (2003). User acceptance of information technology: toward a unified view. MIS Q. 27, 425–478. doi:10.2307/30036540

Vozna, A., and Costantini, S. (2025). Ethical, legal, and societal dimensions of AI-Driven social robots in elderly healthcare. Intelligenza Artif. Int. J. AIxIA, 17248035241310192. doi:10.1177/17248035241310192

Walters, M. L., Syrdal, D. S., Dautenhahn, K., Te Boekhorst, R., and Koay, K. L. (2008). Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion. Aut. Robots 24, 159–178. doi:10.1007/s10514-007-9058-3

Whelan, S., Murphy, K., Barrett, E., Krusche, C., Santorelli, A., and Casey, D. (2018). Factors affecting the acceptability of social robots by older adults including people with dementia or cognitive impairment: a literature review. Int. J. Soc. Robotics 10 (5), 643–668. doi:10.1007/s12369-018-0471-x

World Health Organization (2022). WHO guideline on self-care interventions for health and well-being, 2022 revision. Geneva, Switzerland: World Health Organization.

Wu, Y., Wrobel, J., Cornuet, M., Kerhervé, H., Damnée, S., and Rigaud, A. (2014). Acceptance of an assistive robot in older adults: a mixed-method study of human–robot interaction over a 1-month period in the living lab setting. Clin. Interventions Aging 9, 801–811. doi:10.2147/cia.s56435

Yam, K. C., Tan, T., Jackson, J. C., Shariff, A., and Gray, K. (2023). Cultural differences in people's reactions and applications of robots, algorithms, and artificial intelligence. Manag. Organ. Rev. 19 (5), 859–875. doi:10.1017/mor.2023.21

Yamaguchi, M. (2025). Item-level implicit affective measures reveal the uncanny valley of robot faces. Int. J. Human-Computer Stud. 196, 103443. doi:10.1016/j.ijhcs.2024.103443

Yuan, Y., Wu, C., Niu, J., and Mao, L. (2024). The effects of human-robot interactions and the human-robot relationship on robot competence, trust, and acceptance. Sage Open 14 (2). doi:10.1177/21582440241248230

Zafrani, O., and Nimrod, G. (2018). Towards a holistic approach to studying human–robot interaction in later life. Gerontologist 59 (1), e26–e36. doi:10.1093/geront/gny077

Zafrani, O. (2022). Between fear and trust: factors influencing older adults' evaluation of socially assistive robots.

Zafrani, O., Nimrod, G., and Edan, Y. (2023). Between fear and trust: older adults’ evaluation of socially assistive robots. Int. J. Human-Computer Stud. 171, 102981. doi:10.1016/j.ijhcs.2022.102981

Zhao, D., Sun, X., Shan, B., Yang, Z., Yang, J., Liu, H., et al. (2023). Research status of elderly-care robots and safe human-robot interaction methods. Front. Neurosci. 17, 1291682. doi:10.3389/fnins.2023.1291682

Złotowski, J. A., Sumioka, H., Nishio, S., Glas, D. F., Bartneck, C., and Ishiguro, H. (2015). Persistence of the uncanny valley: the influence of repeated interactions and a robot's attitude on its perception. Front. Psychol. 6, 883. doi:10.3389/fpsyg.2015.00883

Zsiga, K., Tóth, A., Pilissy, T., Péter, O., Dénes, Z., and Fazekas, G. (2018). Evaluation of a companion robot based on field tests with single older adults in their homes. Assist. Technol. 30 (5), 259–266. doi:10.1080/10400435.2017.1322158

Keywords: fear of robots, older adults, human-robot interaction, uncanny valley, trust, privacy, anxiety, social robotics

Citation: Elsheikh A, Al-Thani D and Othman A (2025) Exploring fear in human-robot interaction: a scoping review of older adults’ experiences with social robots. Front. Robot. AI 12:1626471. doi: 10.3389/frobt.2025.1626471

Received: 10 May 2025; Accepted: 10 September 2025;
Published: 13 October 2025.

Edited by:

Amit Arora, University of the District of Columbia, United States

Reviewed by:

Deepak Sahoo, Swansea University, United Kingdom
Ritam Ghosh, Vanderbilt University, United States

Copyright © 2025 Elsheikh, Al-Thani and Othman. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Ahmed Elsheikh, ahel43564@hbku.edu.qa
