ORIGINAL RESEARCH article

Front. Sociol., 16 September 2022
Sec. Sociological Theory
Volume 7 - 2022 | https://doi.org/10.3389/fsoc.2022.957246

Algorithmic harms and digital ageism in the use of surveillance technologies in nursing homes

  • 1School of Social Work, University of Washington, Seattle, WA, United States
  • 2Recreation and Leisure Studies, Brock University, St. Catharines, ON, Canada

Ageism has not been centered in scholarship on AI or algorithmic harms despite the ways in which older adults are both digitally marginalized and positioned as targets for surveillance technology and risk mitigation. In this translation paper, we put gerontology into conversation with scholarship on information and data technologies within critical disability, race, and feminist studies and explore algorithmic harms of surveillance technologies on older adults and care workers within nursing homes in the United States and Canada. We start by identifying the limitations of emerging scholarship and public discourse on “digital ageism” that is occupied with the inclusion and representation of older adults in AI or machine learning at the expense of more pressing questions. Focusing on the investment in these technologies in the context of COVID-19 in nursing homes, we draw from critical scholarship on information and data technologies to deeply understand how ageism is implicated in the systemic harms experienced by residents and workers when surveillance technologies are positioned as solutions. We then suggest generative pathways and point to various possible research agendas that could illuminate emergent algorithmic harms and their animating force within nursing homes. In the tradition of critical gerontology, ours is a project of bringing insights from gerontology and age studies to bear on broader work on automation and algorithmic decision-making systems for marginalized groups, and to bring that work to bear on gerontology. This paper illustrates specific ways in which important insights from critical race, disability, and feminist studies help us draw out the power of ageism as a rhetorical and analytical tool. We demonstrate why such engagement is necessary to realize gerontology's capacity to contribute to timely discourse on algorithmic harms and to elevate the issue of ageism for serious engagement across fields concerned with social and economic justice.
We begin with nursing homes because they are an understudied, yet socially significant and timely setting in which to understand algorithmic harms. We hope this will contribute to broader efforts to understand and redress harms across sectors and marginalized collectives.

Introduction

Surveillance technologies are adopted within nursing homes to remotely monitor older adults and workers with an interest in improving quality of care, health, and safety (Niemeijer et al., 2015; Vermeer et al., 2019; Mitchell et al., 2020; Lukkien et al., 2021; World Health Organization, 2022). Vermeer et al. (2019) define surveillance technology as comprising “monitoring systems that can allow for 24-h supervision by caregivers” (p. 2). In the context of COVID-19, there has been a push to adopt or repurpose these technologies to assist with digital contact tracing or to support infection prevention and control practices (NAS, 2022). Examples include Real Time Locating Systems (RTLS), fall detection systems, activity sensors, and tracking apps (Grigorovich and Kontos, 2020; Chandonnet, 2021; Fan et al., 2021; Grigorovich et al., 2021). Increasingly, these devices employ algorithms to automate recognition of certain deviations in activities or movements and to trigger responses to pre-programmed parameters (e.g., an alert triggered if a resident spends too long in one spot as compared to their routine, or a resident approaches a virtual fence). Such technologies may also involve the application of natural language processing (chatbots and social robots) or machine learning techniques to process continuously collected information about the movements, location, activities, and physiological state of older adults in their environment (Chandonnet, 2021; Orlov, 2021). There is also increasing interest in expanding the role of artificial intelligence (AI) in nursing homes to process information collected by a full range of technologies, including CCTV cameras, virtual assistants, and electronic health records, and to develop predictive algorithms and automated decision-making systems about care (Wojtusiak et al., 2021; Khan et al., 2022; Zhu et al., 2022).
The push to adopt surveillance technologies in nursing homes is thus largely driven by their imagined future benefit for prevention and quality improvement through more timely identification of adverse events and personalization of care in the context of widespread staffing shortages. There has been limited attention to the potential harms of reliance on these technologies to older adults, workers, and others, or on the ways in which they may contribute to ageism.

In this paper, we describe the need for greater attention by gerontology and age studies scholars to bodies of work on algorithmic harms within critical scholarship on information and data technologies where ageism is not yet addressed. The aim of this paper is to bridge this disciplinary gap. Our guiding question in this exploration is, how would engagement with critical data and information scholarship direct critical gerontology research concerned with algorithmically-mediated decision making and AI in nursing homes?

We focus on surveillance technologies in the nursing home context because it has been made politically urgent by COVID-19 and is a generative inroad to deepening analyses of what has been termed “digital ageism” (Manor and Herscovici, 2021; Chu et al., 2022). We use the terms AI and algorithmic as shorthand to reflect the terms popularly used, which for the purpose of this paper reference an assemblage of technologies that support algorithmically mediated decision making, including technologies like sensors that are not properly AI but that will create the data for machine learning and automated decision making in the future. We discuss harms as systemic discrimination resulting from the use of surveillance technologies to enhance the collection and production of “actionable intelligence or guidance…that result in the reinforcement or exacerbation of inequality within society” (Gandy, 2010, p. 3). Such decisions are based on value-laden classifications that can entrench unequal relationships. Examples of algorithmic harms include wrongful arrests, employment exclusion, unfair allocation of publicly funded care, reputational harm, misrecognition, and financial, emotional, and psychological harms (Redden and Brand, 2020; Moss et al., 2021; Malik et al., 2022). Harms have been categorized by the level at which they occur, including individual harm, community harm, and social harm (Smuha, 2021). Identification of algorithmic harms is ongoing (Redden and Brand, 2020), and taxonomies are expanding as they include “an ever-widening circle of consequences” to individuals, their families, and other connected individuals (Moss et al., 2021, p. 6). Algorithmic harms can be intentional or unintentional (Redden and Brand, 2020).

We do not offer a taxonomy of technologies or algorithmic harms in nursing homes because more research is needed to document the specific applications of AI in nursing homes and their implications. While there are a significant number of surveillance technologies marketed to nursing homes, we still lack a clear picture of how these are incorporated at the organizational level. This inattention to what is currently used in facilities is part of the problem. To begin to map future directions for research on algorithmic harms, we draw insights from the body of research on surveillance technologies in elder care that are incorporated in algorithmically mediated decision making and the limited work on their use within nursing homes.

Nursing home context

The nursing home is a central, challenging site for gerontology and age studies. Widely considered an undesirable place to reside, nursing homes are the pivotal event horizon in conceptualizations of the fourth age, existing in the social imaginary as a form of social death and casting a foreboding shadow on older adults striving to achieve third age functionality (Higgs and Gilleard, 2014). Yet a sizable minority of people residing in nursing homes are “low-care residents” who are considered able to live in a less restrictive environment (Mor et al., 2007; Thomas and Mor, 2013). About two-thirds of U.S. nursing home residents are covered by Medicaid, and states are required to provide nursing home care but not home and community-based services under Medicaid. In Canada, nursing and personal care services in nursing homes are publicly funded, but residents typically pay means-tested monthly accommodation payments, and other types of supported living alternatives like retirement homes or memory homes for people living with dementia are paid for entirely out of pocket. This institutional bias contributes to the long waiting lists for state-funded in-home long-term services and supports (Grabowski, 2021). Less institutional options like assisted living and continuing care retirement communities remain financially and locationally inaccessible to many, particularly poor and Black older adults (Jenkins Morales and Robert, 2020; Sloane et al., 2021).

In the U.S., nursing homes that can admit private-pay residents do so to avoid Medicaid's lower reimbursement rates (Sloane et al., 2021), and they remain highly racially segregated despite recent reductions in the proportion of white residents and significant increases in Black and Hispanic residents (Feng et al., 2011). Both Black and Hispanic nursing home residents are more likely than white residents to live in facilities with low staffing ratios, high rates of inspection problems, and lower revenue (% Medicaid & occupancy), and in those that may be slated to be closed (Mor et al., 2004; Smith et al., 2008; Fennell et al., 2010; Li et al., 2015). Largely due to their segregation into lower quality facilities, Black nursing home residents, as compared with their white counterparts, report lower quality of life (Shippee et al., 2020), are more often physically restrained (Cassie and Cassie, 2013), and receive pain treatment less often (Mack et al., 2018). While race, ethnicity, and Indigeneity information is not available for Canadian nursing homes, there is some evidence of similar trends; for example, people wait longer for placement for a basic bed (vs. a private one), and those for whom neither English nor French is their first language experience longer wait times (Flanagan et al., 2021).

Most of the direct care of nursing home residents in the U.S. and Canada is provided by immigrant and racially marginalized women, specifically Black, Latina, and Asian American and Pacific Islander women (Wagner et al., 2021). These workers are often poorly paid (Estabrooks et al., 2020; Scales, 2022), insecurely employed, and not unionized (Sojourner et al., 2010; Zagrodney and Saks, 2017). Many leave this work, unable to envision life-long careers there (Brannon et al., 2007). Nursing home residents tend to be very old (more than half in Canada are over 85 years old), disabled, and living with chronic conditions; however, a sizable minority—17% and 7% in the U.S. and Canada, respectively—are younger than 65 (U. S. Census Bureau, 2018; OLTCA, 2019). But unlike other movements to deinstitutionalize disabled people, no large-scale abolition movement for nursing homes has developed, nor have gerontologists or resident advocates forcefully called for one (Herron et al., 2021). The nursing home plays an outsized role in the social imaginary and in ideas of what old age entails, and yet it is a site that seems to impoverish our imagination for how it could be otherwise.

“Digital ageism” and the digital push response to COVID-19

COVID-19 is said to have created a “perfect storm” in the crisis of nursing home infection spread and deaths (Konetzka et al., 2021). Just prior to the pandemic, U.S. domain experts identified a range of specific ongoing problems in the industry and concluded that “the current widespread violations of international covenants and conventions and domestic laws and regulations should be considered a national emergency, and government plans should be put in place to address the urgent human rights crisis in nursing home care” (Harrington et al., 2019, p. 68). The COVID-19 crisis drew public attention to myriad problems within nursing homes, while at the same time further entrenching them as a site of failure, social abandonment, and isolation. About one third of COVID-19 fatalities in the U.S. have occurred in long-term care facilities—the vast majority nursing homes—while less than 0.5% of the U.S. population live in nursing homes (The New York Times, 2021). In Canada, nursing home residents account for less than 1% of the population but 43% of COVID-19 deaths (CIHI, 2021a). Numerous op-eds and policy research have highlighted inadequate staffing, regulatory and infection control problems, and vulnerabilities inherent in congregate living (Armstrong et al., 2020; Werner et al., 2020). But public attention quickly shifted away from demands for structural change, and permanent policy responses are still forthcoming.

A response that has not faded is the “digital push” in nursing homes (Gallistl et al., 2021). The ways in which COVID-19 laid bare inadequate connective technology infrastructure in this setting have triggered a renewed call to address the digital divide. For example, incorporation of Electronic Health Records in U.S. nursing homes lagged considerably behind meaningful use in other sites of healthcare, leaving some residents with incomplete and inaccessible paper records, with negative implications for person-centered and quality care (NAS, 2022). Certainly, the digitally disconnected residential life behind closed nursing home doors, where internet access and connective devices were inadequate, exposed the harmful, isolating reality of the ageism and ableism embedded in this digital divide.

While the digital divide has always been central to the subfield of gerontechnology (Czaja et al., 2001; Charness and Boot, 2009; Rogers and Fisk, 2010), a concept that has emerged and is gaining traction as an outgrowth of it is “digital ageism” (Chu et al., 2022). Chu et al. (2022) define digital ageism as “age bias in technology such as AI” (para 2). Ageism is often conceptualized as negative attitudes and discrimination based on prejudices about age (Butler, 1969; FrameWorks Institute, 2017). The analytics and dominant discourses of ageism have been critiqued for centering whiteness (Herron et al., 2021), a problem that digital ageism scholarship risks reinforcing. Where AI is concerned, ageism is presented as bias and exclusion (from tech design considerations, for example) based on stereotypes (Manor and Herscovici, 2021). Whether the focus is on lack of access or other forms of exclusion, the predominant concern has been the extent to which digital ageism, such as that embedded within the youth-focused tech and design industries, negatively affects the use of technologies by older adults (see Manor and Herscovici, 2021). Ageist exclusion has also been identified in limited or stereotypical representation of old age or older adults themselves within data training sets (Diaz et al., 2018; Taati et al., 2019; Rosales and Fernández-Ardèvol, 2020; Chu et al., 2022). Researchers are thus making the case that concerns of ageism belong in the AI bias literature alongside racism, ableism, and sexism (Rosales and Fernández-Ardèvol, 2020; Stypińska, 2021; Chu et al., 2022). Stypińska (2021) has offered the broadest discussion of how ageism appears in AI, while noting the dearth of theoretical reflection and research on algorithmic harms for older people as a group, such as discriminatory decision making and stereotypically negative representation that entrenches ageism. 
We agree that the explicit application of the idea of ageism to digital technology, and specifically AI, is overdue. We too are interested in the rhetorical power of ageism, in addition to its analytical use, because to name a phenomenon is to be able to unveil, critique, and change it. Age should be examined alongside other socio-political categories in the AI bias literature, but we argue that the more pressing analysis takes us in a divergent direction from that literature.

We contend that a hard look at the algorithmic harms of surveillance technologies will deepen our understanding of ageism and reframe a research agenda for gerontology in the multigenerational service of older adults and direct care workers. We propose that fields of study and practice concerned with digital ageism (i.e., gerontechnology, age studies, gerontology) would benefit enormously from engaging with the critical scholarship on the disparate effects of automation and algorithmic decision-making systems that has been advancing, particularly in critical race, feminist, and disability studies and activism. In turn, incorporating analysis of ageism could prove generative for these fields. Nascent work on digital ageism and ageism in AI has focused on principles that align with other mainstream critiques of AI, namely resolving the problem that is largely defined as bias (in this case against older adults) by implementing fairness and inclusion (Hoffmann, 2021). That work seeks to address other biases like racial and gender bias in datasets that are linked to problems like unfair hiring, advertising, credit scoring, and risk assessing (Miceli et al., 2022). As Miceli et al. (2022) have argued, bias studies keep the problem within the realm of data and technology, which “obscures its root causes” and power asymmetries (p. 2). Inclusion promises to resolve the bias problem through and within information and data technologies. That is, data harms are said to be caused and resolved by the same technologies (Hoffmann, 2020b). In this way, expertise is allocated to the tech sphere, and the norms, social hierarchies, and asymmetrical vulnerabilities entrenched by the use of surveillance technologies, and AI specifically, are left unquestioned (Hoffmann, 2020b).
For example, in her analysis of the ways in which poverty is managed through surveillance technologies, Eubanks (2018) has shown how inequality is automated through algorithmic decision making regarding public housing, Medicaid, and child welfare, with inordinate surveillance of those living in poverty. She thus argues that where “systems engineering approaches to social problems” are at play, impact must be front and center, regardless of intention.

Much of the research and current effort to get aging on the radar of AI bias research is uninformed by critiques of ethics washing or of the gold standard of fairness and inclusion that have been central to critiques of AI within the aforementioned fields (Greene et al., 2019; Hoffmann, 2019; Green, 2021). Like the larger bias literature, the digital ageism concept is shaping up to offer critiques of bias and exclusion in AI and to imply that the answer can be found in some combination of adding less biased representations of old age and older adults to data training sets and, essentially, tweaking algorithms. Desire for inclusion is taken as a given and paired tightly with calls to bridge the digital divide. The scope and nature of the problems introduced by AI are confined to AI and/or articulated on AI's terms, rather than understood in terms of how the introduction of AI technologies reorganizes subjects and relationships regardless of how nominally fair or inclusive those technologies are. We argue that there are more elements of ageism at play that are obscured by calls for inclusion.

Critical scholarship on information and data technologies understands data technologies as a structuring force, which digital ageism discourse has not thus far articulated. Because data are never neutral or without ethical salience, digital ageism cannot be reduced to a consequence of “good” or “bad” design, training sets, or intended use. Our focus on the nursing home reveals how digital ageism discourse is not attending to the ways in which AI can retrench power hierarchies that are rooted in racism, ableism, sexism, and classism, foregrounding a promise of imagined future benefit that forestalls critical engagement with structural, racist logics. These limitations are elucidated through engagement with critical scholarship on data technologies in fields by which gerontology could be better informed. We begin with an overview of insights from critical disability studies, critical race studies, and feminist studies. After presenting a limited overview of each, we discuss some of the many potential paths of future study that are illuminated by this critical scholarship. We conclude by discussing the need to understand and articulate the social harms of AI in relation to the increasing adoption of surveillance technologies within nursing homes.

Where can we look for critical insights on algorithmic harms?

Critical disability studies

Disability studies scholars and activists have demonstrated how the introduction of automated algorithmic decision making in education, employment, and insurance promotes discrimination against embodied difference in ways of communicating, interacting, and being. For example, automated online proctoring has unfairly penalized people with disabilities through algorithms that flag some gestures and eye movements as “abnormal” or “unacceptable” exam behavior (Coghlan et al., 2021). Similarly, the use of HireVue and similar facial recognition technologies for automation of interview selection has been shown to discriminate against autistic and neurodivergent people in recommendations for job interviews (Whittaker et al., 2019). Large corporations employ automated decision making in hiring at least in part because it offers an efficient means for screening out those “deemed inefficient, non-productive, and likely to require extra help and support” (Whittaker et al., 2019). Such harms cannot be overcome through better or more data, both because of the diversity of disability and because the basic logic of AI is to find patterns and form groups to create simple models in which outliers from the average are treated as “noise” and disregarded (Trewin, 2018; Whittaker et al., 2019).

Disability studies scholars have thus argued that efforts to reduce algorithmic harms must extend beyond the data collected and the technological design to interrogate the “techno-ableism” that drives the development of AI and other technologies for detecting, curing, normalizing or modifying disability (Shew, 2020, p. 4). Connecting this to earlier critiques of medical and technological interventions on disabled lives (Davis, 2003; Garland-Thompson, 2006; Hamraie and Fritsch, 2019), these scholars demonstrate how AI technologies similarly rely on the assumption that we need to “overcome” disability through technology rather than address the structural causes of inaccessibility and discrimination experienced by disabled people. While AI and surveillance technologies are promoted as empowering or enhancing access for disabled people, they can further pathologize disabled people and retrench their unequal access to material goods. Moreover, they often amplify disabled peoples' exposure to invasive surveillance that requires intimate data sharing with for-profit companies. As an example, the growing interest in surveillance technologies and AI within health and rehabilitation sectors to “normalize, reprogram, or extinguish autistic traits” is a form of “tech-saviorism” that reflects eugenic logics (Williams and Gilbert, 2020, p. 4). These technologies are used to classify and predict behavior for the purpose of coding, analysis, and interventions aimed at modifying the behaviors of autistic individuals (Williams, 2019; Williams and Gilbert, 2020). Rather than addressing the structural causes of “challenging behaviors,” including emotional or sensory distress from being forced to conform to neurotypical ideals, these technologies reproduce ableism by centering ability as the main goal of disability design and prioritizing the interests of caregivers and corporations (Shew, 2020).

Critical race studies

Critical race studies scholars have demonstrated that the deployment of automated algorithmic decision-making in social media, policing, education, social welfare, and employment re(produces) racial disparities in accuracy and impact for Black, Indigenous, and People of Color (BIPOC). They counter the assumption that these forms of racial bias are accidental, or merely a reflection of the dataset, by demonstrating that both the data and the technologies are built on assumptions of racial difference (that is, assumptions about race, risk, and value) that amplify racial hierarchies because racism is both desirable and profitable. For example, Noble (2018) has identified racial profiling by for-profit search engines that privilege whiteness and discriminate against women of color, who are represented in erroneous, stereotypical, and pornographic ways because this reflects the commercial interests that drive how information is categorized and analyzed. Similarly, a study by Obermeyer et al. (2019) showed that a widely used algorithm for selecting patients for access to healthcare programs was less likely to identify Black patients because it relied on the predictive utility of an individual's previous health expenses. This algorithm reproduced racism by using a metric that was insensitive to the structural reasons why Black patients had lower health care costs, such as unequal access to health care. Writing about this study, Benjamin (2019a) has further argued that while Obermeyer's analysis identified a harmful bias, it also points to the limits of solely targeting bias in analyses of individual algorithmic harms rather than focusing on the broader structural inequity that is produced by the institutions within which such technologies are created and valued.

Using the concept of “coded exposure,” Benjamin (2019b) has similarly suggested that facial recognition technologies produce distinct experiences of visibility that offer white people the privilege of privacy and reduced unwanted exposure to carceral surveilling institutions, such as in the context of facial recognition systems used for predictive policing and for asylum and immigration. Benjamin problematizes efforts to develop more inclusive, representative algorithms that are better able to recognize and distinguish non-white faces by arguing that as long as these technologies are deployed for carceral purposes, doing so will further forcibly expose BIPOC to government-sanctioned containment and exploitation. This is especially so given the carceral expansion of the use of such technologies far beyond their original locations of prisons and borders into public spaces and private homes. Even technologies that are developed to redress or overcome racial biases in datasets and analytical processes—what Benjamin terms “technological beneficence”—end up reproducing or deepening discrimination because they are based on a narrowly defined idea of fairness as a technical problem that can be resolved by individual intentions to do good. The aim to create race-neutral technologies assumes that racism exists only in the meaning and tasks given to data and technologies by humans and can thus be prevented through better data and design (e.g., less racist engineers/companies). Such approaches are doomed to fail as they ignore the neoliberal political economy and capitalist relations that underpin the development and deployment of AI-based technologies and sustain relations of white supremacy (Benjamin, 2019b).
Benjamin and other critical race science, technology, and society (STS) scholars thus advocate for a refusal of technological solutionism that is rooted in abolitionist and solidarity politics with a focus on understanding and exposing intersecting algorithmic harms in order to imagine more liberating and equitable alternatives (Benjamin et al., 2019).

Critical feminist studies

The implications of feminist analyses for understanding ageism in relation to the use of surveillance technology in nursing homes extend beyond comparative insights about people with different gender identities. Comparative insights reveal, for example, that women compared with men experience more data privacy risks and concerns, which are explained by differential experiences with and perceptions of sexual data leakage (Lageson et al., 2019), exploitation, and other online abuses (i.e., sexual harassment, doxing, stalking) (Bartel Sheehan, 1999; Messing et al., 2020). These explanations based on differential vulnerabilities to violence are worthwhile pursuits. They may be particularly worthwhile for aging studies considering gendered and racialized care labor and the fact that the ratio of men to women sharply declines at older ages, leaving far more women in the 85+ age category, often short on care resources. But, as Hoffmann (2021) explains, “recognizing, visualizing, and caring for difference is ultimately insufficient if it leaves dominant logics and structures intact…” (p. 2). She recommends refusal as a universal feminist value—one of “commitment to declining the dominant social, political, or economic terms on offer” that here includes rejection of “the current terms of inclusion,” refusal of “empty calls for fairness and inclusion,” and of hollow, unmoved nods to difference and resistance (Hoffmann, 2021, p. 2–3). As Garcia et al. (2020) elaborate, “critical refusal is a generative concept for challenging harmful data practices, while simultaneously negotiating and developing alternative actions” (p. 2). This includes a consideration of “when to stop collecting data that does not support the rights of communities and when to stop building systems that introduce disproportionate risk and harm” (p. 2).
Such a stance is uneasy with solutions of inclusive design as it is concerned with the ways in which enhanced visibility through data also intensifies vulnerability (Hoffmann, 2021). Feminist critical scholarship thus unveils the power concealed within data science practices and discourses. Creators of the Feminist Data Manifest-No articulate the value of practicing refusal (Cifor et al., 2019) with a bold and clear-eyed view that data science's diversity and inclusion are, in fact, roadblocks to liberation from the harms of social hierarchy.

How would engagement with this critical scholarship direct our research on nursing homes?

This critical work has in common a refusal to accept technocratic solutions that lack evidence of efficacy and assessment of intersecting structural harms imposed on marginalized groups. What can thinking with insights from these fields addressing algorithmic harms do for gerontology? The concept of an interventional approach from disability studies (Davis, 2003; Hamraie and Fritsch, 2019; Williams, 2019; Shew, 2020) is already partially embedded in the STS-oriented subfield of socio-gerontechnology where a critique of an “interventionist logic” that dominates gerontechnology has been developed (Neven and Peine, 2017; Peine and Neven, 2018). By naming the interventionist logic, Neven and Peine (2017) and Peine and Neven (2018) make explicit some theoretical assumptions that have long been implicit in the design and research of technologies for older adults. They explain how gerontechnology has been guided by this logic in which aging is conceptualized as a target for biomedical and technological interventions—a set of problems to be solved. In a linear manner, a problem is identified, a user imagined, a technology designed to address it, which is then implemented with real older adults who must match the imagined user profile and use the technology in the way prescribed or scripted. Its impact is then evaluated using pre-defined outcomes. They argue that in fact socio-technical relations and practices are circular: aging, old age, older adults and technology are co-constituted (Peine and Neven, 2021). This STS-aligned intervention on the interventionist logic represents a welcome lens because, with some exceptions, gerontechnology has been active or complicit in framing older adults as in need of technological intervention in the form of datification for risk assessment. 
Research drawing on critical gerontological analysis of the tenacious normativity in models of aging (Katz and Marshall, 2004; Katz and Calasanti, 2015) has interrogated the interventionist logic and its implications. It has outlined dimensions of power relations involved in various types of data practices (Dalmer et al., 2022), their normalizing function in the service of biomedicine (Joyce and Loe, 2010; Katz and Marshall, 2018), and the meaning of the datafication of care in the context of austerity (Joyce and Loe, 2010; Mort et al., 2013; Sousa, 2013). Below we outline four broad moves for generative engagement with critical disability, race and feminist scholarship on data technologies to further expand the interdisciplinary lens of critical gerontology and enrich our analyses of surveillance technology, AI, and algorithmically mediated decision making in elder care. We provide some of many possible responses to the question, how would engagement with this critical scholarship direct our research on nursing homes?

We would take surveillance and control seriously

Higgs and Gilleard (2021) describe an asymmetrical bifurcation in which technologies for older adults are often directed “toward those least able to exercise control but who are most susceptible to being controlled by others” (p. 1). Surveillance is particularly common in nursing homes and assisted living given that these are “total institutions” where the daily life and activities of residents are scheduled and monitored by care workers (Frik et al., 2019). While there is limited research on the impact of surveillance technologies for COVID-19 on residents, research pre-COVID-19 on these technologies suggests that these are most often used to monitor residents' behavior in ways that reinforce differential power relations between them and care workers and administration. For example, it is assumed that surveillance technologies used to locate and track the whereabouts of people living with dementia enhance their autonomy, independence, and safety by enabling a greater freedom of movement and the power to do what one wants (Wigg, 2010; Niemeijer et al., 2015; Ienca et al., 2018). Yet, there is little evidence of this benefit being achieved, and research suggests that these technologies may sometimes be used to supplement physical and environmental restraints (e.g., locking doors at night, bed restraints) (Wigg, 2010; Zwijsen et al., 2012; Timmons et al., 2019). The surveillance and confinement of residents through the physical environment is defended as necessary to protect the physical safety of “wandering” or “exit seeking” residents who are deemed as being so mobile that they interfere with provision of care or are in danger of getting lost (Wigg, 2010; Tufford et al., 2017; Steele et al., 2020; Graham, 2021). This is despite the fact that some people living with dementia perceive their “wandering” as meaningful and enjoyable (Adekoya and Guse, 2019) and would prefer to have freedom of movement and access to the outdoors (Steele et al., 2020). 
There is also ample research demonstrating the negative impact of confinement on older adults, workers, and others, including confinement related to COVID-19 (Borges-Machado et al., 2020; Steele et al., 2020; Graham, 2021). While surveillance technology can be used on an as-needed basis to locate an individual who becomes lost or to enable supervised movement (e.g., escorting the individual; Wigg, 2010), a more common finding is that surveillance is more widespread and used to supplement, rather than replace, confinement and the use of restraints (Grigorovich et al., 2021).

Assessment of risk through automated algorithmic decision making in resource-limited care environments is particularly valued as a strategy to reduce reliance on in-person monitoring, or the time spent on it (Grigorovich and Kontos, 2020; Ho, 2020). In this context, risk is imagined as a neutral and objective entity waiting to be found, rather than understanding the data that are collected and determined to signify risk as themselves a structuring force. Identification of risk in this context can be based on ageist, ableist, and other stigmatizing norms. Consider the meaning and impact of aggressive actions and movements by people living with dementia that are assumed to be disease-driven behavioral symptoms (Grigorovich et al., 2019; Grigorovich and Kontos, 2020). These symptoms are described as a major threat to quality of life and delivery of care in nursing homes, and there is growing interest in the development of predictive algorithms and surveillance technologies "to more immediately and accurately diagnose and manage" them (Husebo et al., 2020, p. 8). While such technologies may be accurate in distinguishing between individuals who are or are not aggressive, they cannot identify those who will actually exhibit aggression in the future. Moreover, such approaches are driven by the assumption that aggression is primarily the result of abnormal physiological or biochemical processes in the brain, and that identifying individual predictors or "triggers" of it can enhance providers' abilities to promptly defuse and redirect it. What is left unaddressed in such technological solutions are the structural drivers of aggression (e.g., overcrowding, boredom, low provider-to-resident ratios, limited provider autonomy, and heavy workloads), which cannot be measured or addressed using surveillance technology or automated algorithmic decision-making.
As we discuss further in our discussion of opportunity costs, the use of surveillance technologies in this way could distract from these structural problems and retrench professional and policy strategies that focus on restricting the freedom of individuals using behavioral modification, drugs, or other restraints with the intent to protect others from harm.

Surveillance technologies thus pose a disproportionate risk to residents' self-determination by characterizing them as threatening to others and by extending the power of care workers and administration to control their lives (Kenner, 2008; Shore, 2021). This is especially likely when algorithmic predictions counter the preference of a resident or conflict with their own assessment of risk associated with a given behavior (Ho, 2020). In this situation, both automation bias and the reinforcement of recommendations enabled through surveillance technologies could intensify unequal power dynamics between older adults and familial and professional caregivers (Ho, 2020).

For example, it is possible that the use of surveillance technologies for digital contact tracing in the context of the COVID-19 pandemic could contribute to greater use of physical or environmental restraints in an effort to restrict the movements of infected residents who are deemed unable to remember to isolate and/or use personal protective equipment (PPE). This is particularly concerning given the severe COVID-19-related restrictions that have already been imposed on residents, including confinement to their rooms, chronic lockdowns, and various other restrictions (Borges-Machado et al., 2020; Heckman et al., 2021). There has been a documented rise in the already high use of antipsychotics in nursing homes in the context of COVID-19 (Campitelli et al., 2021), some of which may be related to the justification of using chemical restraints for infection control purposes (Keng et al., 2020). It is reasonable to expect a similar rise in restraint use as a consequence of digital contact tracing. Understaffing and the lack of human resources in this sector drive interest in the development of predictive algorithms for automated decision support with regard to risk of falls or functional decline (Wojtusiak et al., 2021; von Gerich et al., 2022). Harms may be exacerbated if these technologies are also used to inform care planning, including decisions regarding the amount of care received or selection for intervention. Yet such algorithmic harms are largely unrecognized in gerontechnology research, and in tech research more broadly, which tends to focus reductively on informational threats to individual privacy resulting from data shared from these technologies with stakeholders outside the circle of care (for example, see: Oude Weernink et al., 2018; Bourbonnais et al., 2019).

Nursing home residents' right to privacy is recognized within both U.S. and Canadian law and considered important for residents' quality of life (Koren, 2010; Burack et al., 2012). Resident privacy has been promoted as part of the nursing home culture change movement, which, since the 1980s, has motivated models of nursing home reform in both countries and beyond; culture change movements promote the creation of more home-like living environments with care worker and resident empowerment at their core (Koren, 2010; Dupuis et al., 2016). This includes supporting both decisional and physical forms of privacy through environmental design (e.g., private rooms) and residents' and workers' negotiation of schedules, activities, and care tasks (Grabowski, 2021). Privacy is also consistently reported as being important to older adults across a variety of living environments, including nursing homes, in relation to the use of surveillance technologies (Garg et al., 2014; Berridge, 2016; Moyle et al., 2020). The potential threat to privacy posed by surveillance technologies may be even more salient for marginalized older adults, including racialized, poor, disabled, or LGBTQ+ older adults. While research in this area is very limited, a recent Australian study of social robots to combat loneliness found that older LGBTIQ+ adults living in the community identified fears of discrimination based on gender identity and sexual orientation and prioritized privacy in relation to surveillance technologies (Poulsen et al., 2020). Engagement with critical scholarship on information and data technologies would lead us to call out the meaning of privacy in older adults' lives and to further conversations regarding whether and how different forms of privacy can be upheld with the use of surveillance technologies.
It would move us to take seriously the implications of privacy for non-conformity, freedom of association with others, and self-determination, with attentiveness also to marginalized older adults who may have unique experiences and concerns.

Privacy implications are particularly significant in the context of nursing homes, as there is ample research to suggest that residents' rights to different types of privacy (Koops et al., 2016) are already rarely upheld in practice. For example, the panopticon-like open-concept and congregate living design of many nursing homes offers little opportunity for retreat from other people beyond one's room, which is most often shared with at least one other resident. Even when residents have private rooms, these rarely have door locks, and care workers at times open closed doors without knocking and bar residents from entering their own rooms during the daytime (Tufford et al., 2017). Violating privacy in these ways is considered acceptable and consistent with care workers' duty to protect residents from harm. Nursing home culture change tenets would position this as the exception to the rule; however, research has found great variability in the adoption of culture change practices. Higher rates of adoption are associated with lower proportions of Medicaid residents (Chisholm et al., 2018). The benefits of life in a nursing home that has adopted culture change practices, including respect for privacy, are thus largely restricted to more affluent and white residents. Within this privacy-depleted context where privacy still matters, it is especially important to prevent additional threats to privacy. Even if older adults do not express a desire for privacy or lack awareness of the impact of surveillance technologies on privacy, its protection is nonetheless important to avoid stigmatization and unnecessary restrictions on freedom.

This includes surveillance and control over workers

The expansion of surveillance technologies for contact tracing in nursing homes also has implications for the ways care workers are directed, evaluated, and disciplined. As with residents, interest in these technologies for monitoring workers in nursing homes pre-dates the pandemic and is part of a broader phenomenon of algorithmic management across sectors such as retail, manufacturing, banking, hotels, and call centers (Grigorovich and Kontos, 2020; Wood, 2021). Pre-COVID-19, care workers expressed worry that technologies used to monitor residents could also be turned to monitoring their own performance and sanctioning them (Bowen et al., 2013; Hall et al., 2019), and that this surveillance could undermine care relationships (Berridge et al., 2019b). Currently, surveillance technologies enhance the classification and identification of patterns of movement and interaction with residents, which are interpreted by human managers. However, given the use of similar systems to automate performance evaluation in other sectors, it is likely that these technologies will be further developed to enable similar forms of automated decision-making.

These technologies are imagined as benefiting nursing home workers by enhancing their efficiency and optimizing care processes, including alleviating the burden of administrative tasks and increasing the time they can spend on care and social interaction (Oude Weernink et al., 2018). Much of the research on the impact of these types of technologies on workers has focused on the introduction of digital hailing and platform labor applications for services provided in private homes (Glaser, 2021; McDonald et al., 2021; Williams et al., 2021), including Electronic Visit Verification (EVV) for Medicaid home-based personal assistance services and home health care programs in the U.S. (Gallopyn and Iezzoni, 2020; Mateescu, 2021). Extensively critiqued by disabled activists who formed the Stop EVV campaign in 2017, this surveillance technology involves oversight of care workers (e.g., real-time tracking of arrival and departure times, locations, and activities) and, by extension, of low-income disabled beneficiaries living in the community. This requirement, intended to gain efficiency and accountability, has now been identified as another state practice that specifically penalizes a significant proportion of the working poor (Eubanks and Mateescu, 2021). Both beneficiaries and workers lack protections that would restrict the scope of collected and inferred data and ensure that they are not at increased lifetime risk of punitive interventions from agencies making future use of the data produced about them, which may be used to train predictive or other evaluative systems (Hopkins, 2018; Metcalf, 2018; Scalia, 2019). This is an example of how surveillance technologies create vulnerabilities that are not remedied by creating less biased algorithms.
Moreover, given the racial dynamics of this workforce (Sloane et al., 2021), such technologies may reinforce racist labor inequalities by extending the power of employers to control low-wage BIPOC workers by undermining their labor rights and protections (van Doorn, 2017; Glaser, 2021; Mateescu, 2021; Williams et al., 2021).

We would incorporate refusal into our analytical lens

Within the bias and fairness discourse in data science, data inclusion is positioned as an imperative, and so it is within emerging work on digital ageism. Inclusion promises to resolve the problem, defined in this case as age discrimination, through and within surveillance technologies. Under this approach, expertise cannot fall to older adults or care workers who, along with long-term care systems, become dependent on technological interventions. The value of expertise is also applied inconsistently. Ageism is readily called out in projections of older adults as technological laggards or as technologically incompetent, and the failure to position older adults as experts in their own technological experiences is an acknowledged problem. At the same time, certain reasons that older adults provide through surveys for not adopting technologies, like lack of interest or privacy concerns (Baig, 2021), tend to be downplayed (Berridge et al., 2022). Older adults' acts of resistance and rejection, along with reports of intentional non-adoption for reasons other than access or digital literacy barriers, are frequently overlooked in the framing of the problem (see Chu et al., 2022). For example, older adults may, in resisting a given technology, be taking a stance against the devaluation of cultural values like privacy (Barros Pena et al., 2021), but studies that take seriously and engage the meanings of older adults' preferences and choices that are not tech-positive are quite rare. A contributing factor may be the successful rhetorical power of what Neven and Peine (2017) term the "aging-and-innovation discourse," which positions aging as a grave problem and legitimizes technology investment to solve it. The performative power of this discourse, in which society, governments, and older people are all winners when technology is applied to the "problem" of aging (the "triple win"), is morally charged and thus difficult to argue with (Neven and Peine, 2017).

While enhancing autonomy is one of the most cited goals within gerontechnology, assumptions about older adults' values and capacities have served as justification for not involving, or for bypassing, them in technology use decisions. A study of the long-term use of a sensor system in low-income senior independent living apartments, which used algorithms to issue alerts when a resident deviated from their routine or stayed in the bathroom "too long," identified multiple moments of boundary intrusion (Cohen, 2013), from adoption decision making through use over time (Berridge, 2016, 2017a,b). Techniques employed by building social workers and family members to convince residents (who were not living with dementia) included "revisiting" decisions not to adopt the sensor system, using moralizing discourse in which adoption was presented as the right thing to do and as an indication that one wants to take care of oneself, issuing ultimatums ("it's this or a nursing home"), and bypassing the resident by bringing in a family member to decide, in conflict with the organization's own self-determination ethos and policy of independent living (Berridge, 2017a,b). Acceptance of these surveillance technologies was presented as the right option ("I say it's all up to you. How much you value yourself, how much you want to take care of yourself?") and refusal was deemed "irrational thinking" (Berridge, 2017b, p. 82). Despite the heavy "selling" (social workers' term) of the subsidized system, only two percent of those targeted for use accepted it, and 20% of those who had accepted it over the previous 12 months had discontinued it. Among users, creative use of this system abounded (Berridge, 2017b). Rejection and what is termed "misuse" of technology by older adults is often dismissed as non-compliance, tech incompetence, or, as Neven (2015) articulates, "initial" vs. real resistance in the face of the strong moral discourse embedded in technological innovation targeting independent living.
Using the lens of refusal from critical race and feminist scholarship (Benjamin, 2016; Hoffmann, 2021), we see that many residents in the sensor system study refused the limiting solutions on offer. A social worker who had emigrated from a Russian-speaking country described "socialist thinking" on the part of the significant population of residents from the same region who, across the board, declined offers of the system (Berridge et al., 2019a). Residents she worked with responded with questions like "Why should I have to depend on a piece of plastic for my life?" She recounted how a resident who was able to access in-home health aide services through Medicaid burst into her office and confronted her over the unfair treatment of her neighbor who, as the social worker understood but the resident did not, was near-poor but not yet Medicaid eligible and thus denied access to a home aide. These residents refused the technological solutions on offer and refused to accept the U.S.'s stringent means-tested long-term care system as just.

The company that owns that sensor system has since clarified that it has moved from independent living into higher care settings (Corbyn, 2021). Barriers to refusal are far higher in the nursing home setting than in independent living for several reasons, including age-graded algorithmic and data flow awareness (Gran et al., 2021). And while heavily regulated relative to other locations of long-term care, this is a location where ageism collides most directly with ideals of self-determination and person-centered care. Installation of sensors and other data-intensive technologies is more likely to occur in all rooms as an institutional investment or for marketing purposes, or monitoring may be offered by default on an opt-out basis (Wetsman, 2020). As an example, a Canadian province recently invested in a private company that created a real-time location system (RTLS) "to help grow its inventory and scale up to reach new markets," with the intent of implementing this system in long-term care homes to enhance safety and quality of care (Atlantic Canada Opportunities Agency, 2021). This demonstrates a growing public policy interest in Canada in implementing such systems in nursing homes as a long-term strategy for surveilling residents, workers, and visitors. Depending on one's location and access to funds, it can be very hard to find a facility, so moving into a nursing home without surveillance technology in use may not be a freely given choice. Objections, including refusals, that may be partially shaped by experiences of gender and sexuality, race, and socioeconomic status will be important to take seriously when seeking to subject residents to technologies that collect and share private data (Cifor et al., 2019; Poulsen et al., 2020).

While roughly 30–50% of nursing home residents in Canada and the U.S., respectively, are not living with dementia (CDC, 2019; CIHI, 2019), dementia heavily impacts residential life. Research has shown that people living with dementia often prefer to be more involved in decision-making than they are (Miller et al., 2016), and their preferences and concerns can go unrepresented (Whitlatch et al., 2005; Harman and Clare, 2006; de Boer et al., 2007; Menne and Whitlatch, 2007). Nudging, appealing to authority, incentivizing, and even deceiving and coercing people who lack decision making capacity in order to overcome their resistance to activities that are deemed helpful for their care are considered necessary practices (Nordgren, 2018). The ethical aspects and ramifications of deceiving people living with dementia in relation to surveillance technology use, from location or activity tracking to AI companions, have proven of broad interest, perhaps because of their very sticky nature. Nordgren (2018) cautions that influencing the use of surveillance technologies with people living with dementia should be held to a more restrictive standard than care practices deemed necessary to the survival, health, and hygiene of individuals (e.g., requiring eating and bathing), because these technologies are supplemental rather than necessary. Nordgren also raises the issue that decisions about tech use are subject to "technological ambitions, commercial opportunities and the wish among high-level decision-makers for cost-effectiveness in the use of limited health care resources" (Nordgren, 2018, p. 416), and recommends that decisions about what level of influence to exert over someone living with dementia be made on a case-by-case basis; however, this is a time-intensive task that nursing homes are ill-equipped to carry out.
Deployment of adequate resources to provide person-centered care—arguably the missing piece whose absence drives adoption of surveillance technologies in these settings in the first place—cannot logically be the solution. Alzheimer Europe (2010) published relevant ethics guidelines on surveillance and other types of technologies more than 10 years ago. While such guidelines may have a role in advancing the conversation, ethical guidelines that lack enforcement have little chance of making their way into the on-the-ground decisions that shape nursing home residents' daily lives.

We would understand and accommodate the limits of a consent model

Critical scholarship on information and data technologies focused on race, gender, and disability makes clear that when we talk about ethics we need to attend to power. How are surveillance technologies amplifying or addressing power imbalances at the structural and interpersonal levels? Hoffmann (2020a) reminds us that “doing ethics will mean attending to histories, dynamics, and relationships that exceed any given tool or technology” (p. 5) and that “ethical debate is not only about values, but about how we account for certain non-ideal facts about the world” (p. 6). Non-ideal facts include the fact that not everyone is capable of giving informed consent or opting out. Oppression is a non-ideal fact visible in structural ageism and ableism. Paternalism characterizes a lot of elder care but so does abandonment. Having limited options to get needed support is certainly disempowering. The point is, while the problem of surveillance technologies in elder care may get framed as a consent issue due to capacity limitations or one of respecting decisional autonomy, it is in fact a bigger problem of power. Following Viljoen (2021), it requires serious engagement with the social and historical context of nursing homes and of other manifestations of ageism and ableism. Informed consent often is not on the table in nursing home settings for a significant number of residents who either lack capacity to consent or are perceived to lack capacity. This environment and this population underscore the need, as Barocas and Nissenbaum (2014) have argued, to understand that informed consent is not the end game. Consenting processes are intended to enable individuals to protect their privacy, but to reduce the concept of privacy to control is to abandon additional questions about information flows that are disrupted or threatening to values that matter to people in this context (Nissenbaum, 2009). 
If the focus is on achieving consent from a resident or legal guardian, it is unlikely to also be on the potential harms of AI in nursing homes. Consent can serve, in this way, as a loophole to circumvent critical questions about what and whose values a given information flow protects or threatens (Barocas and Nissenbaum, 2014). It is a particularly problematic loophole in nursing homes, where the right to informed consent for roughly half of residents is ceded to a legal representative (Levy et al., 2018).

The nursing home context presents additional problems for relying on consent. Conversations and policies about consent often do not consider other actors, including workers and visitors, as interested parties in surveillance or data collection practices even when they become its subjects, inadvertently or not (Levy et al., 2018). Constrained choice or pressure to adapt to unfavorable circumstances (Zwijsen et al., 2011) are often-overlooked realities that complicate consent, as is the difficulty, distinct from comprehension, of appreciation for how a given technology will impact someone's daily life (Halpern et al., 2019). Consent should be an ongoing process in which people can change their minds, but this is impractical in a facility with staff shortages where an investment in a given device or surveillance system has already been made.

We would call out the opportunity costs

What are some of the opportunity costs of massive public and private investment in surveillance technologies in resource-restricted care environments over attention to the structural causes of vulnerability? Nursing homes became early adopters of digital contact tracing via proximity tracing to mitigate COVID-19 (Laroche, 2020; Wetsman, 2020). Digital contact tracing was presented as a promising response to nursing home infections at a time when nearly 10% of U.S. and nearly 8% of Canadian nursing home residents had died from COVID-19 (CIHI, 2021b; The COVID Tracking Project, 2022). Many did not have the chance to say goodbye to loved ones, and many died alone amid inadequate staffing that predated the pandemic. As recently as September 2021, one Canadian province promoted surveillance technologies to protect essential workers during a period of insufficient vaccine access, even while persistently capping paid sick days for COVID-19 at three (Government of Ontario, 2021; Ontario Health Coalition, 2022). Testing for the virus was slow to reach desperate facilities, and the kits that finally arrived were faulty. Few states in the U.S. required ongoing testing of residents or care workers in the early months of the pandemic. Facilities in the U.S. and Canada needed, and many are still struggling to get, access to PPE and fast, reliable, frequent testing (Braun et al., 2020; Murray and Friedmann, 2020; Paulin, 2021; Osman and Woolf, 2022). They lack adequately paid workers who do not have to work other jobs that put them at greater infection risk, proper infection control measures, single-occupancy rooms or other resources to overcome logistical problems with quarantining, and resources to manage health conditions without transfers to hospitals that put residents at greater infection risk. The public is asked to imagine: what can technological innovation do for people at risk for COVID-19 in nursing homes?
A critical analysis interrogates the disconnect between problem and solution. Our question should be, what are the consequences of being preoccupied with how surveillance technology can save them?

To what extent does the increasing demand for surveillance technologies in nursing homes enable the retreat of governments from care responsibilities? To what extent does it distract attention and resources from known problems, as defined by those most concerned and impacted? If stressors are perceived to be alleviated at a superficial level by these technologies, does that distract from systemic change and further entrench the status quo (Gary, 2021)? The inadequate long-term care workforce—especially the dementia workforce—is an urgent public health problem (Armstrong et al., 2020; APHA, 2021) that has not been met with a proactive policy response. According to the American Public Health Association, workforce challenges “include a limited public health infrastructure to support population-level action, inadequate public and private investments in LTC, disparate access to community-based services and supports, and weak dementia care quality assurance systems.” With COVID-19, the problem has intensified.

The American Health Care Association now reports that about three-fourths of nursing homes have seen a worsening workforce situation during the pandemic, and 94% report a recent staffing shortage (AHCA/NCAL, 2020). A recent study of nearly all U.S. nursing homes found a mean annual turnover rate for certified nursing assistants of 129.1% (Gandhi et al., 2021). Low retention of nurse aides has been linked to poor quality indicators in nursing homes (Castle et al., 2020). The U.S. field is disproportionately made up of women, who account for 92% of the nursing aide workforce, and BIPOC workers (57%), with 37% of the total workforce identifying as Black or African American (PHI, 2019), most often working under the supervision of white managers (Sloane et al., 2021). These workers are undervalued in long-term care systems (Mauldin, 2019; Sloane et al., 2021) where pervasive systemic and interpersonal racism harms workers and residents alike (Sloane et al., 2021). While average wages are higher in Canada than in the U.S., in both countries personal care/nursing aides in nursing homes are poorly remunerated (PHI, 2019; Estabrooks et al., 2020). The inputs needed to stabilize the workforce have been studied over decades and research-based recommendations are laid out clearly (Daly and Szebehely, 2012; Braedley et al., 2017; Scales, 2021, 2022), yet as with other public health problems that have seen surveillance technologies and other AI-based solutions proposed in response to inadequate public investment, technological fixes are holding court. The question becomes: why are the solutions that get traction serving shareholders and not public health?

It is possible to stop at this level of analysis of what is wrong with nursing homes and to call for policy change to enhance their resources to better train and compensate workers. Yet the critique we have laid out thus far centers on the ways in which surveillance technologies would further harm residents and workers: further surveil, normalize, and manage them; in other words, further institutionalize them. These surveillance-mediated practices would entrench the problems of the nursing home, making a bad situation worse. And that situation is already worse for BIPOC, low-income, and LGBTQ+ residents, and for people living with dementia, because they are rendered even more vulnerable by nursing homes themselves. As a result of systemic racism, care quality and outcomes are worse in facilities that house primarily Black or Hispanic residents (Mor et al., 2004; Smith et al., 2008; Fennell et al., 2010; Cassie and Cassie, 2013; Li et al., 2015; Mack et al., 2018). It is also documented that many LGBTQ+ older adults are terrified at the notion of moving into a nursing home, aware that they may be in greater physical or emotional danger or forced back into the closet (Singleton, 2018; Wilson et al., 2018; Knochel and Flunker, 2021). Receiving care in nursing homes amplifies the dangers of ageist transphobia and heteronormativity for them. And given that people living with dementia are often overridden in decision making where risks are concerned (Stevenson et al., 2019), the introduction of AI in this context could reinforce ableism and further reduce opportunities for self-determination.

To ground analyses of possible algorithmic harms in their context (Hoffmann, 2018) requires that we examine the problem of the nursing home. A critique of the digital push for its power to depoliticize problems focuses us on the political nature of the high rates of infection and deaths from COVID-19. Like prisons, jails, and Immigration and Customs Enforcement (ICE) detention facilities, nursing homes are, by their congregate nature, locations that create vulnerability to viruses (Tremain, 2020). Plenty of ageist commentary sprang up that naturalized the deaths of older adults. Yet despite paltry supports for family caregivers, at the beginning of the pandemic family members rushed to discharge nursing home residents in order to protect their health. Ontario even produced decision aids to help residents and family members decide about temporarily moving out of long-term care facilities due to COVID-19 (NIA, 2021). In some ways these efforts mirror advocates' movements to get people out of other institutions, including ICE detention centers and prisons.

The danger these institutions place residents and workers in during a pandemic is not the only thing they have in common. Disability studies scholar Ben-Moshe (2013) proposes that "incarceration is understood as a continuum of carceral edifices, or as an institutional matrix in which disability is a core component" (p. 399). The carceral logic of nursing homes can be spotted in power relations, often embedded in nursing home policies that restrict residents' freedoms and opportunities in the name of management (Tremain, 2021). Whether one agrees with abolition arguments aimed at nursing homes or not, one cannot deny the cogent critiques that motivate them. Yet abolition remains a marginalized perspective within the aging space.

Disability studies writers have called gerontologists and aging advocates out for this failure to embrace abolition, attributing it to their (our) lack of a political analysis of incarceration/institutions (Boodman, 2019; Tremain, 2021). We agree with Herron et al. (2021) that this has roots in ageism, and add the general failure in gerontology to meaningfully engage the ways in which racism—both historically and in the present day—is embedded in and enables exploitative long-term care systems (for exceptions see, for example, Sloane et al., 2021; Jenkins Morales et al., 2022; Robinson-Lane et al., 2022). Insights on abolition from critical race studies and critical disability studies could lead to questions such as: How are surveillance technologies inserted into established power dynamics in nursing homes? Where is the locus of power when these technologies are used to inform care practices? It does not reside with residents or workers, and so surveillance technologies may further entrench the institutionalization of older adults and the exploitation of workers. It will be critical to understand and further articulate how surveillance technologies, and their role in automated algorithmic decision-making, conflict with both abolition and reform approaches such as the culture change movement that seeks to deinstitutionalize the nursing home.

The COVID-19 crisis and beyond

Interest in the development of surveillance technologies for nursing homes pre-dates COVID-19. While the implementation of these technologies had largely been on an individual basis (e.g., used with residents identified as being most at risk), there was also documented interest in the more ubiquitous use of surveillance technology with all residents to enhance organizational protection from risk and liability (e.g., prevention of injury to residents, defense against allegations of negligence) or to generate cost savings (e.g., reduction in providers, monitoring provider performance). There remains, however, a striking lack of evidence that these technologies can predict adverse events and decline in older adults or that they enhance prevention and health promotion. Even in the absence of such evidence, these technologies are seen as valuable for reputation management; that is, for mitigating family members' potential concerns about residents or as protection against complaints and litigation regarding neglect or abuse (Hall et al., 2019). While older adults and workers were largely reluctant to adopt these technologies pre-COVID-19, this may shift as data collection about one's physical health, including questionnaires, temperature screenings, and COVID-19 tests, becomes more commonplace. Continued use of pandemic tracking apps for general health monitoring and for other types of infectious disease is likely (Seberger and Patil, 2021). Massive public investment in these technologies for workplaces (Government of Ontario, 2021), including nursing homes (Atlantic Canada Opportunities Agency, 2021), and the expansion of public health surveillance systems based on this type of big data in anticipation of future outbreaks further suggest they will remain in use (Miller, 2021).

Discussion: concluding suggestions for focusing our work

Ageism can be a causal factor in, as well as a consequence of, algorithmic harm. Yet ageism is not centered in critical scholarship in fields that have produced pioneering work on the impacts and harms of AI and algorithmically mediated decision making, and gerontology has largely failed to engage that critical work. In this article, we show key ways that critical race, feminist, and disability scholarship and activism can be mobilized to illuminate algorithmic harms as they relate to the problem of ageism. Drawing from this literature, we have explored possible algorithmic harms associated with surveillance, power, and control that may fall to older adults and care workers within the U.S. and Canadian nursing home contexts. These harms extend beyond the consequences of bias and the problems of negative or inadequate representation. We begin with nursing homes, though this work has implications for home care, which is also in need of study.

Ageism was deeply ingrained in nursing homes prior to the introduction of surveillance technologies and, together with ableist and racist logics, has shaped the nursing home context and the political environment in which nursing homes operate (Boodman, 2019). In centering ageism in the conversation about algorithmic harms in the nursing home, we aim to understand the structural power of surveillance technologies and algorithmically mediated decision making by exposing how they serve to further reinforce the interests of neoliberal care regimes and the tech industry. We describe just some of the many potential generative paths for further study of how surveillance technologies can exacerbate, amplify, and reinforce ageist practices of the nursing home that create vulnerability and reflect disposability. Future work should explore additional areas of critical data studies, including Indigenous and queer data scholarship.

We close with summative thoughts on specific ways in which such engagement could help us to draw out the power of ageism as a rhetorical and analytical tool. First, the COVID-19 crisis makes evident that calling out ageism without calling out racism is a problem, as is considering the harms of surveillance technologies that fall to residents or workers in isolation from each other. Following Herron et al.'s (2021) recent critique of dominant discourses of ageism in general, there is a need to decenter whiteness in scholarship and public discourse on digital ageism by considering the "rights of older people and the value of care work together" (p. 196). This includes attending to the impact of surveillance technologies not only on racialized nursing home residents of all ages but also on the racialized workers who care for them. Engagement with critical race scholarship on surveillance technology should lead us to ask questions about the impact of these technologies in nursing homes on workers, the majority of whom are BIPOC and immigrant women, working in an environment already characterized by "ambient criminalization" (Mateescu, 2021). This further includes confronting the role of racism as desirable and profitable within the technocratic solutions sold by the "aging enterprise" (Estes, 1979). For example, public and private investment in surveillance technologies over structural reform could disproportionately expose racialized workers, who are already inadequately valued and supported (Estabrooks et al., 2020; Scales, 2022), to stigma and may further undermine their labor rights. This would be of particular concern within under-resourced nursing homes, where racialized older adults are more likely to live, and could in turn reinforce existing racial disparities in the access, process, and outcomes of care for residents.

Taking our lead from Benjamin (Benjamin et al., 2019), we would critically interrogate surveillance technologies with the awareness that innovation in the care of older adults does not necessarily represent progress toward more just systems of care or practices. After all, medicating residents into submissive subjects was once considered a progressive innovation thought to have solved the problems associated with using physical restraints. Examining these innovations with a critical lens on impact rather than intention (Eubanks, 2018) would shed much-needed light on the harmful side of surveillance technologies in the nursing home, including the ways in which ageism is perpetuated by them.

Engagement with the concept of refusal developed within critical race and feminist studies would mean asking where people living in nursing homes can find opportunities for refusal. It would prepare us to confront unviable contradictions, such as claims that surveillance technologies are aimed at enhancing older adults' independence when their purpose may actually be safety and risk management that could serve to diminish opportunities for self-determination (Berridge, 2017a). With an understanding of the importance of non-conformity and self-determination for all people, we would take actions to protect the privacy of residents. And with attention to the economically vulnerable status of the majority of nursing home residents, we might ask: how is the best interest of the resident living with dementia assessed, given the unbalanced power of countervailing interests?

For non-age studies fields, a focus on ageism and dementia in the nursing home context further illustrates the limits of consent and reliance on autonomy-based control criteria to achieve ethical technology use. In the instance of dementia, capacity to give, refuse, or withdraw consent decreases as one's condition progresses, which could be the very trigger for surveillance technologies (Ho, 2020).

It also suggests that we take one of the pillars of AI ethics—the principle of fairness—further by asking: is it fair to ask those who are among the least likely to have knowledge of and experience with data-intensive technologies (Frik et al., 2019) or algorithmic awareness (Gran et al., 2021) to weigh the attendant risks of agreeing to be among the most data-surveilled? This question makes evident the need for greater attention among gerontologists to regulation, given the absence of protective regulation for both residents and workers, who also tend to be digitally marginalized. As Stypińska (2021) points out, while there is active work on ethical AI regulations and guidelines, the interests of older adults as a collective are thus far unrepresented. More concern with regulation would move the conversation quickly from one of tech ethics and issues of consent to one of power, and specifically to evidence of ageism in protective regulations or the lack thereof. What currently gets attention are the harms to older adults of being screened out through or within AI (Whittaker et al., 2019); but how are people differentially screened into surveillance in ways that prioritize profit?

Finally, writing to date on digital ageism has focused on addressing individual harms of AI bias and exclusion. To fully flesh out the ageism embedded in AI practices, an analysis of the social harms of surveillance technologies and automated algorithmic decision-making in nursing homes is needed. Individual harms are experienced directly by persons, and the accumulation of those harms entails collective harm (Smuha, 2021). Social harms extend beyond those done to the individuals (i.e., the individual nursing home resident) or collectives (residents and workers) who are directly subjected to a given AI application (Smuha, 2021). Social harms associated with surveillance technologies used in nursing homes might accrue to the vast majority of the population: to younger disabled and non-disabled adults who do not reside or work in nursing homes, through the foreclosure of future alternatives to institutional surveillance in elder care. The negative consequences of both structural and individual-level ageism for older adults' health have been extensively documented (Chang et al., 2020). Because we are all continuously aging and most of us expect to personally experience old age, the negative health effects of ageism accrue broadly across time. The ageism embedded in nursing homes, and the ways it stands to be exacerbated by surveillance technologies, deeply impacts people's ideas about old age, including their own, whether or not they expect to end up in such a facility themselves.
As Rosales and Fernández-Ardèvol (2020) explain, "Ageism shapes both the image(s) that individuals have of themselves and the image(s) that society has of the different life stages." The way nursing home residents and workers are treated both reifies and normalizes ideas about old age care, including the idea that one's own older self is not worth the financial cost of person-centered, attentive care but would instead need to be efficiently managed through automated algorithmic decision making. That is, the use of surveillance technologies has the potential to shape the ideas people of all ages hold about their futures, and in turn shape feelings like fear, dread, or hope. Social harm is that which entrenches an impoverished imaginary, foreclosing the sense that ways of relating to each other in spaces of elder care could be otherwise. Refusal as a value is not just a rejection but a claim to the capability to imagine and create new futures (Benjamin et al., 2019).

To the extent that the use of surveillance technologies in nursing homes extinguishes the will toward new futures of care, it creates social harm. We echo disability scholar Eva Boodman, who explains the need for no less than a paradigm shift:

nursing homes and institutional environments for older adults have a role in normalizing the idea that our elders are a burden, are objects of medical technology... What this means is that to shift their normalizing power, we will have to target not just the institutions of prisons and nursing homes, but the sets of norms, values, and auxiliary structures that support them and that are continuous with them (Boodman, 2019, p. 16).

We can look to scholarship and activism that embodies a politics of refusal of technocapitalist practices that threaten to entrench those aspects of institutionalization within nursing homes that have sparked broad demand for reform, with a minority looking to larger abolition movements. We hope that identifying areas of inquiry into algorithmic harms and their animating force within the under-recognized setting of the nursing home will contribute to coalition building and to broader efforts to trace, measure, and take action to prevent algorithmic harms across sectors and marginalized collectives.

Data availability statement

The original contributions presented in the study are included in the article/supplementary material; further inquiries can be directed to the corresponding author.

Author contributions

All authors listed have made a substantial, direct, and intellectual contribution to the work and approved it for publication.

Funding

CB was supported by the National Institute on Aging (PI: Berridge, NIA K01AG062681).

Acknowledgments

We thank the two reviewers and Anna Lauren Hoffmann who provided generous, valuable feedback. We are also grateful to the participants of The Social Life of Algorithmic Harms Academic Workshop at Data & Society who reviewed a draft.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Adekoya, A. A., and Guse, L. (2019). Wandering behavior from the perspectives of older adults with mild to moderate dementia in long-term care. Res. Gerontol. Nurs. 12, 239–247. doi: 10.3928/19404921-20190522-01

AHCA/NCAL (2020). State of Nursing Home and Assisted Living Industry: Facing Workforce Challenges. Available online at: https://www.ahcancal.org/News-and-Communications/Fact-Sheets/FactSheets/Workforce-Survey-June2020.pdf (accessed May 1, 2022).

Alzheimer Europe (2010). The Ethical Issues Linked to the Use of Assistive Technology in Dementia Care. Available online at: https://www.alzheimer-europe.org/sites/default/files/alzheimer_europe_ethics_report_2010.pdf (accessed May 3, 2018).

APHA (2021). Strengthening the Dementia Care Workforce: A Public Health Priority. Available online at: https://www-apha-org.offcampus.lib.washington.edu/policies-and-advocacy/public-health-policy-statements/policy-database/2021/01/13/strengthening-the-dementia-care-workforce (accessed May 1, 2022).

Armstrong, P., Armstrong, H., Choiniere, J., Lowndes, R., and Struthers, J. (2020). Re-imagining Long-Term Residential Care in the COVID-19 Crisis. Available online at: https://www.policyalternatives.ca/sites/default/files/uploads/publications/National%20Office/2020/04/Reimagining%20residential%20care%20COVID%20crisis.pdf (accessed April 1, 2022).

Atlantic Canada Opportunities Agency (2021). News Release: Employing Technology for Long-term Care. Government of Nova Scotia. Available online at: https://www.canada.ca/en/atlantic-canada-opportunities/news/2021/05/employing-technology-for-long-term-care.html (accessed March 5, 2022).

Baig, E. (2021). Older Adults Wary About Their Privacy Online: Companies Increase Transparency About Data Collection to Ease Those Concerns. AARP. Available online at: https://www.aarp.org/home-family/personal-technology/info-2021/companies-address-online-privacy-concerns.html (accessed April 3, 2022).

Barocas, S., and Nissenbaum, H. (2014). “Big Data's end run around anonymity and consent,” in Privacy, Big Data, and the Public Good: Frameworks for Engagement, eds H. Nissenbaum, J. Lane, S. Bender, and V. Stodden (Cambridge: Cambridge University Press), 44–75. doi: 10.1017/CBO9781107590205.004

Barros Pena, B., Clarke, R. E., Holmquist, L. E., and Vines, J. (2021). “Circumspect users: older adults as critical adopters and resistors of technology,” in Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (Yokohama), 1–14. doi: 10.1145/3411764.3445128

Bartel Sheehan, K. (1999). An investigation of gender differences in on-line privacy concerns and resultant behaviors. J. Interact. Mark. 13, 24–38

Benjamin, R. (2016). Informed refusal: toward a justice-based bioethics. Sci. Technol. Hum. Values 41, 967–990. doi: 10.1177/0162243916656059

Benjamin, R. (2019a). Assessing risk, automating racism. Science 366, 421–422. doi: 10.1126/science.aaz3873

Benjamin, R. (2019b). Race After Technology: Abolitionist Tools for the New Jim Code. Cambridge, UK: Polity Press. doi: 10.1093/sf/soz162

Benjamin, R., Duster, T., Eglash, R., Gaskins, N., Hatch, A. R., Miller, A., et al. (2019). Captivating Technology. Durham: Duke University Press.

Ben-Moshe, L. (2013). Disabling incarceration: connecting disability to divergent confinements in the USA. Crit. Sociol. 39, 385–403. doi: 10.1177/0896920511430864

Berridge, C. (2016). Breathing room in monitored space: the impact of passive monitoring technology on privacy in independent living. Gerontologist 56, 807–816. doi: 10.1093/geront/gnv034

Berridge, C. (2017a). Active subjects of passive monitoring: responses to a passive monitoring system in low-income independent living. Ageing Soc. 37, 537–560. doi: 10.1017/S0144686X15001269

Berridge, C. (2017b). “Selling passive monitoring to manage risk in independent living: Frontline workers in a bind,” in Under Observation: The Interplay Between eHealth and Surveillance, eds S. Adams, N. Purtova, and R. Leenes (Cham: Springer Publishing), 73–90. doi: 10.1007/978-3-319-48342-9_5

Berridge, C., Chan, K. T., and Choi, Y. (2019a). Sensor-based passive remote monitoring and discordant values: Qualitative study of the experiences of low-income immigrant elders in the United States. JMIR Mhealth Uhealth 7, e11516. doi: 10.2196/11516

Berridge, C., Halpern, J., and Levy, K. (2019b). Cameras on beds: the ethics of surveillance in nursing home rooms. Am. J. Bioethics 10, 55–62. doi: 10.1080/23294515.2019.1568320

Berridge, C., Zhou, Y., Lazar, A., Porwal, A., Mattek, N., Gothard, S., et al. (2022). “Control matters in elder care technology: evidence and direction for designing it in,” in Designing Interactive Systems Conference. (Virtual Event (Australia) and New York, USA). doi: 10.1145/3532106.3533471

Boodman, E. (2019). Nursing home abolition: prisons and the institutionalization of older adult care. J. Ethical Urban Living 2, 1–21. Available online at: http://jeul.cognethic.org/jeulv2i1.pdf#page=4

Borges-Machado, F., Barros, D., Ribeiro, Ó., and Carvalho, J. (2020). The effects of COVID-19 home confinement in dementia care: physical and cognitive decline, severe neuropsychiatric symptoms and increased caregiving burden. Am. J. Alzheimer. Dis. Other Demen. 35, 1533317520976720. doi: 10.1177/1533317520976720

Bourbonnais, A., Rousseau, J., Lalonde, M. H., Meunier, J., Lapierre, N., and Gagnon, M. P. (2019). Conditions and ethical challenges that could influence the implementation of technologies in nursing homes: a qualitative study. Int. J. Older People Nurs. 14, e12266. doi: 10.1111/opn.12266

Bowen, M. E., Wingrave, C. A., Klanchar, A., and Craighead, J. (2013). Tracking technology: lessons learned in two health care sites. Technol. Health Care 21, 191–197. doi: 10.3233/THC-130738

Braedley, S., Owusu, P., Przednowek, A., and Armstrong, P. (2017). We're told, 'Suck it up': long-term care workers' psychological health and safety. Ageing Int. 43, 91–109. doi: 10.1007/s12126-017-9288-4

Brannon, D., Barry, T., Kemper, P., Schreiner, A., and Vasey, J. (2007). Job perceptions and intent to leave among direct care workers: evidence from the better jobs better care demonstrations. Gerontologist 47, 820–829. doi: 10.1093/geront/47.6.820

Braun, R. T., Yun, H., Casalino, L. P., Myslinski, Z., Kuwonza, F. M., Jung, H.-Y., et al. (2020). Comparative performance of private equity–owned US nursing homes during the COVID-19 pandemic. JAMA Netw. Open 3, 1–11. doi: 10.1001/jamanetworkopen.2020.26702

Burack, O. R., Weiner, A. S., Reinhardt, J. P., and Annunziato, R. A. (2012). What matters most to nursing home elders: quality of life in the nursing home. J. Am. Med. Dir. Assoc. 13, 48–53. doi: 10.1016/j.jamda.2010.08.002

Butler, R. N. (1969). Age-ism: another form of bigotry. Gerontologist 9, 243–246. doi: 10.1093/geront/9.4_Part_1.243

Campitelli, M. A., Bronskill, S. E., Maclagan, L. C., Harris, D. A., Cotton, C. A., Tadrous, M., et al. (2021). Comparison of medication prescribing before and after the COVID-19 pandemic among nursing home residents in Ontario, Canada. JAMA Netw. Open 4, e2118441. doi: 10.1001/jamanetworkopen.2021.18441

Cassie, K. M., and Cassie, W. (2013). Racial disparities in the use of physical restraints in US nursing homes. Health Soc. Work 38, 207–213. doi: 10.1093/hsw/hlt020

Castle, N. G., Hyer, K., Harris, J. A., and Engberg, J. (2020). Nurse aide retention in nursing homes. Gerontologist 60, 885–895. doi: 10.1093/geront/gnz168

CDC (2019). Long-term care providers and services users in the United States: Data from the National Study of Long-Term Care Providers, 2015–2016. Hyattsville, MD: Centers for Disease Control. Available online at: https://www.cdc.gov/nchs/data/series/sr_03/sr03_43-508.pdf (accessed March 5, 2022).

Chandonnet, M. (2021). 3 Technologies that are Revolutionizing Long-Term Care. Secure Care Products, LLC. Available online at: https://www.securecare.com/blog/3-technologies-that-are-revolutionizing-long-term-care (accessed March 3, 2022).

Chang, E.-S., Kannoth, S., Levy, S., Wang, S.-Y., Lee, J. E., and Levy, B. R. (2020). Global reach of ageism on older persons' health: a systematic review. PloS One 15, e0220857. doi: 10.1371/journal.pone.0220857

Charness, N., and Boot, W. R. (2009). Aging and information technology use: potential and barriers. Curr. Dir. Psychol. Sci. 18, 253–258. doi: 10.1111/j.1467-8721.2009.01647.x

Chisholm, L., Zhang, N. J., Hyer, K., Pradhan, R., Unruh, L., and Lin, F.-C. (2018). Culture change in nursing homes: what is the role of nursing home resources? Inquiry 55, 1–6. doi: 10.1177/0046958018787043

Chu, C. H., Nyrup, R., Leslie, K., Shi, J., Bianchi, A., Lyn, A., et al. (2022). Digital ageism: challenges and opportunities in artificial intelligence for older adults. Gerontologist 62, 947–955. doi: 10.1093/geront/gnab167

Cifor, M., Garcia, P., Cowan, T. L., Rault, J., Say Chan, A., Rode, J., et al. (2019). Feminist Data Manifest-No. Available online at: https://www.manifestno.com/ (accessed December 1, 2021).

CIHI (2021a). COVID-19's Impact on Long-Term Care. Ottawa: Canadian Institute for Health Information. Available online at: https://www.cihi.ca/en/covid-19-resources/covid-19-data-collection-and-coding-direction (accessed March 1, 2022).

CIHI (2021b). The Impact of Covid-19 on Long-Term Care in Canada: Focus on the First 6 Months. Ottawa, ON. Available online at: https://www.cihi.ca/sites/default/files/document/impact-covid-19-long-term-care-canada-first-6-months-report-en.pdf (accessed June 1, 2021).

Coghlan, S., Miller, T., and Paterson, J. (2021). Good proctor or “big brother”? Ethics of online exam supervision technologies. Phil. Technol. 34, 1581–1606. doi: 10.1007/s13347-021-00476-1

Cohen, L. (2013). What privacy is for. Harv. Law Rev. 126, 1904–1933. Available online at: http://www.jstor.org/stable/23415061

Corbyn, Z. (2021). The Future of Elder Care is here – and It's Artificial Intelligence. The Guardian. Available online at: https://www.theguardian.com/us-news/2021/jun/03/elder-care-artificial-intelligence-software (accessed June 15, 2021).

Czaja, S. J., Sharit, J., Charness, N., Fisk, A. D., and Rogers, W. (2001). The Center for Research and Education on Aging and Technology Enhancement (CREATE): a program to enhance technology for older adults. Gerontechnology 1, 50–59. doi: 10.4017/gt.2001.01.01.005.00

Dalmer, N., Ellison, K., Katz, S., and Marshall, B. (2022). Ageing, embodiment and datafication: dynamics of power in digital health and care technologies. Int. J. Ageing Later Life 15, 77–101. doi: 10.3384/ijal.1652-8670.3499

Daly, T., and Szebehely, M. (2012). Unheard voices, unmapped terrain: care work in long-term residential care for older people in Canada and Sweden. Int. J. Soc. Welf. 21, 139–148. doi: 10.1111/j.1468-2397.2011.00806.x

Davis, L. (2003). Bending over Backwards: Disability, Dismodernism, and Other Difficult Positions. New York: New York University Press.

de Boer, M. E., Hertogh, C. M., Dröes, R. M., Riphagen, I. I., Jonker, C., and Eefsting, J. A. (2007). Suffering from dementia - the patient's perspective: a review of the literature. Int. Psychogeriatrics 19, 1021–1039. doi: 10.1017/S1041610207005765

Diaz, M., Johnson, I., Lazar, A., Piper, A. M., and Gergle, D. (2018). “Addressing age-related bias in sentiment analysis,” in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada: Association for Computing Machinery). doi: 10.1145/3173574.3173986

Dupuis, S., McAiney, C. A., Fortune, D., Ploeg, J., and Witt, L. D. (2016). Theoretical foundations guiding culture change: the work of the partnerships in dementia care alliance. Dementia (London) 15, 85–105. doi: 10.1177/1471301213518935

Estabrooks, C. A., Straus, S. E., Flood, C. M., Keefe, J., Armstrong, P., Donner, G. J., et al. (2020). Restoring trust: COVID-19 and the future of long-term care in Canada. Facets 5, 1–41. doi: 10.1139/facets-2020-0056

Estes, C. L. (1986). The aging enterprise: in whose interests? Int. J. Health Serv. 16, 243–251. doi: 10.2190/XHXV-RTAV-RCFL-5JEN

Eubanks, V. (2018). Automating Inequality: How High-tech Tools Profile, Police, and Punish the Poor. New York, NY: St. Martin's Press.

Eubanks, V., and Mateescu, A. (2021). 'We Don't Deserve This': New App Places US Caregivers Under Digital Surveillance. The Guardian U.S. Edition. Available online at: https://www.theguardian.com/us-news/2021/jul/28/digital-surveillance-caregivers-artificial-intelligence (accessed May 1, 2022).

Fan, J., Ullal, A., Beuscher, L., Mion, L. C., Newhouse, P., and Sarkar, N. (2021). Field testing of Ro-Tri, a Robot-mediated triadic interaction for older adults. Int. J. Soc. Robot. 13, 1711–1727. doi: 10.1007/s12369-021-00760-2

Feng, Z., Fennell, M. L., Tyler, D. A., Clark, M., and Mor, V. (2011). The care span: growth of racial and ethnic minorities in US nursing homes driven by demographics and possible disparities in options. Health Aff. 30, 1358–1365. doi: 10.1377/hlthaff.2011.0126

Fennell, M. L., Feng, Z., Clark, M. A., and Mor, V. (2010). Elderly hispanics more likely to reside in poor-quality nursing homes. Health Aff. 29, 65–73. doi: 10.1377/hlthaff.2009.0003

Flanagan, A., Um, S., Sinha, S., and Roche, B. (2021). Leaving No One Behind In Long-Term Care: Enhancing Socio-Demographic Data Collection In Long-Term Care Settings. Toronto, ON: NIA. Available online at: https://static1.squarespace.com/static/5c2fa7b03917eed9b5a436d8/t/62012044f0158e27b04f2880/1644240964901/LeavingNoOneBehind.pdf (accessed April 20, 2021).

FrameWorks Institute (2017). Framing Strategies to Advance Aging and Ageism as Policy Issues. Available online at: https://www.frameworksinstitute.org/wp-content/uploads/2020/05/aging_frame_brief.pdf (accessed December 2, 2021).

Frik, A., Nurgalieva, L., Bernd, J., Lee, J., Schaub, F., and Egelman, S. (2019). “Privacy and security threat models and mitigation strategies of older adults,” in Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019) (Santa Clara), 21–40.

Gallistl, V., Seifert, A., and Kolland, F. (2021). COVID-19 as a “digital push?” Research experiences from long-term care and recommendations for the post-pandemic era. Front. Public Health 9, 531. doi: 10.3389/fpubh.2021.660064

Gallopyn, N., and Iezzoni, L. I. (2020). Views of electronic visit verification (EVV) among home-based personal assistance services consumers and workers. Disabil. Health J 13, 100938. doi: 10.1016/j.dhjo.2020.100938

Gandhi, A., Yu, H., and Grabowski, D. C. (2021). High nursing staff turnover in nursing homes offers important quality information: study examines high turnover of nursing staff at US nursing homes. Health Aff. 40, 384–391. doi: 10.1377/hlthaff.2020.00957

Gandy, O. H. (2010). Engaging rational discrimination: exploring reasons for placing regulatory constraints on decision support systems. Ethics Inform. Technol. 12, 29–42. doi: 10.1007/s10676-009-9198-6

Garcia, P., Sutherland, T., Cifor, M., Chan, A. S., Klein, L., D'Ignazio, C., et al. (2020). “No: critical refusal as feminist data practice,” in Companion Publication of the 2020 Conference on Computer Supported Cooperative Work and Social Computing (Virtual Event: Association for Computing Machinery). doi: 10.1145/3406865.3419014

Garg, V., Camp, L. J., Lorenzen-Huber, L., Shankar, K., and Connelly, K. (2014). Privacy concerns in assisted living technologies. Ann. Telecommun. 69, 75–88. doi: 10.1007/s12243-013-0397-0

Garland-Thomson, R. (2006). “Integrating disability, transforming feminist theory,” in The Disability Studies Reader, ed. L. J. Davis (New York: Routledge), 257–274.

Gary, M. E. (2021). Care robots, crises of capitalism, and the limits of human caring. Int. J. Fem. Approaches Bioeth. 14, 19–48. doi: 10.3138/ijfab-2020-07-28

Glaser, A. L. (2021). Uberized care: employment status, surveillance, and technological erasure in the home health care sector. Anthropol. Work Rev. 42, 24–34. doi: 10.1111/awr.12215

Government of Ontario (2021). News Release: Ontario Investing in Wearable Contact Tracing Technology to Help Protect Workers from COVID-19. Available online at: https://news.ontario.ca/en/release/60375/ontario-investing-in-wearable-contact-tracing-technology-to-help-protect-workers-from-covid-19 (accessed March 3, 2022).

Grabowski, D. C. (2021). The future of long-term care requires investment in both facility- and home-based services. Nat. Aging 1, 10–11. doi: 10.1038/s43587-020-00018-y

Graham, M. E. (2021). The securitisation of dementia: socialities of securitisation on secure dementia care units. Ageing Soc. 41, 439–455. doi: 10.1017/S0144686X19001247

Gran, A.-B., Booth, P., and Bucher, T. (2021). To be or not to be algorithm aware: a question of a new digital divide? Infor. Commun. Soc. 24, 1779–1796. doi: 10.1080/1369118X.2020.1736124

Green, B. (2021). The contestation of tech ethics: a sociotechnical approach to ethics and technology in action. J. Soc. Comput. 2, 209–225. doi: 10.23919/JSC.2021.0018

Greene, D., Hoffmann, A. L., and Stark, L. (2019). “Better, nicer, clearer, fairer: a critical assessment of the movement for ethical artificial intelligence and machine learning,” in Proceedings of the 52nd Hawaii international Conference on System Sciences. (Maui). doi: 10.24251/HICSS.2019.258

Grigorovich, A., and Kontos, P. (2020). Towards responsible implementation of monitoring technologies in institutional care. Gerontologist 60, 1194–1201. doi: 10.1093/geront/gnz190

Grigorovich, A., Kontos, P., and Kontos, A. P. (2019). The “violent resident”: a critical exploration of the ethics of resident-to-resident aggression. J. Bioeth. Inq. 16, 173–183. doi: 10.1007/s11673-019-09898-1

Grigorovich, A., Kulandaivelu, Y., Newman, K., Bianchi, A., Khan, S. S., Iaboni, A., et al. (2021). Factors affecting the implementation, use, and adoption of real-time location system technology for persons living with cognitive disabilities in long-term care homes: systematic review. JMIR 23, e22831. doi: 10.2196/22831

Hall, A., Brown Wilson, C., Stanmore, E., and Todd, C. (2019). Moving beyond 'safety' versus 'autonomy': a qualitative exploration of the ethics of using monitoring technologies in long-term dementia care. BMC Geriat. 19, 1–13. doi: 10.1186/s12877-019-1155-6

Halpern, J., Paolo, D., and Huang, A. (2019). Informed consent for early-phase clinical trials: therapeutic misestimation, unrealistic optimism and appreciation. J. Med. Ethics 45, 384–387. doi: 10.1136/medethics-2018-105226

Hamraie, A., and Fritsch, K. (2019). Crip technoscience manifesto. Catalyst 5, 1–33. doi: 10.28968/cftt.v5i1.29607

Harman, G., and Clare, L. (2006). Illness representations and lived experience in early-stage dementia. Qual. Health Res. 16, 484–502. doi: 10.1177/1049732306286851

Harrington, C., Mollot, R., Edelman, T. S., Wells, J., and Valanejad, D. (2019). U.S. nursing home violations of international and domestic human rights standards. Int. J. Health Serv. 50, 62–72. doi: 10.1177/0020731419886196

Heckman, G. A., Kay, K., Morrison, A., Grabowski, D. C., Hirdes, J. P., Mor, V., et al. (2021). Proceedings from an international virtual townhall: reflecting on the COVID-19 pandemic: themes from long-term care. J. Am. Med. Dir. Assoc. 22, 1128–1132. doi: 10.1016/j.jamda.2021.03.029

Herron, R., Kelly, C., and Aubrecht, K. (2021). A Conversation about ageism: time to deinstitutionalize long-term care? Univ. Tor. Q. 90, 183–206. doi: 10.3138/utq.90.2.09

Higgs, P., and Gilleard, C. (2014). Frailty, abjection and the ‘othering' of the fourth age. Health Sociol. Rev. 23, 10–19. doi: 10.5172/hesr.2014.23.1.10

Higgs, P., and Gilleard, C. (2021). Techno-fixes for an ageing society. Aging Ment. Health. 26, 1303–1305. doi: 10.1080/13607863.2021.2008308

Ho, A. (2020). Are we ready for artificial intelligence health monitoring in elder care? BMC Geriatr. 20, 358. doi: 10.1186/s12877-020-01764-9

Hoffmann, A. L. (2018). Data Violence and How Bad Engineering Choices can Damage Society. Available online at: https://medium.com/s/story/data-violence-and-how-bad-engineering-choices-can-damage-society-39e44150e1d4 (accessed November 1, 2021).

Hoffmann, A. L. (2019). Where fairness fails: data, algorithms, and the limits of antidiscrimination discourse. Inf. Commun. Soc. 22, 900–915. doi: 10.1080/1369118X.2019.1573912

Hoffmann, A. L. (2020a). Data Ethics for Non-Ideal Times: Some Notes on the Course. Available online at: https://static1.squarespace.com/static/5b8ab61f697a983fd6b04c38/t/5f74b0d450b24a77b4db1996/1601482964607/Hoffmann+-+Data+Ethics+for+Non-Ideal+Times+Lecture+Notes.pdf (accessed March 2, 2022).

Hoffmann, A. L. (2020b). Terms of inclusion: data, discourse, violence. New Media Soc. 23, 3539–3556. doi: 10.1177/1461444820958725

Hoffmann, A. L. (2021). Even when you are a solution you are a problem: an uncomfortable reflection on feminist data ethics. Glob. Perspect. 2, 1–5. doi: 10.1525/gp.2021.21335

Hopkins, A. (2018). How this 'Anti-Fraud' Device Violates the Rights of People with Disabilities. The Mighty. Available online at: https://themighty.com/2018/03/electronic-visit-verification-violates-rights-people-disabilities/ (accessed April 1, 2022).

Husebo, B. S., Heintz, H. L., Berge, L. I., Owoyemi, P., Rahman, A. T., and Vahia, I. V. (2020). Sensing technology to monitor behavioral and psychological symptoms and to assess treatment response in people with dementia. A systematic review. Front. Pharmacol. 10, 1699. doi: 10.3389/fphar.2019.01699

Ienca, M., Wangmo, T., Jotterand, F., Kressig, R. W., and Elger, B. (2018). Ethical design of intelligent assistive technologies for dementia: a descriptive review. Sci. Eng. Ethics 24, 1035–1055. doi: 10.1007/s11948-017-9976-1

Jenkins Morales, M., Miller, V. J., and Hamler, T. (2022). Dismantling systemic racism in long-term services and supports: a call to action for social workers. J. Gerontol. Soc. Work 65, 121–128. doi: 10.1080/01634372.2021.1942375

Jenkins Morales, M., and Robert, S. A. (2020). Black–white disparities in moves to assisted living and nursing homes among older medicare beneficiaries. J. Gerontol. B 75, 1972–1982. doi: 10.1093/geronb/gbz141

Joyce, K., and Loe, M. (2010). A sociological approach to ageing, technology and health. Sociol. Health Illn. 32, 171–180. doi: 10.1111/j.1467-9566.2009.01219.x

Katz, S., and Calasanti, T. (2015). Critical perspectives on successful aging: Does it “appeal more than it illuminates”? Gerontologist 55, 26–33. doi: 10.1093/geront/gnu027

Katz, S., and Marshall, B. L. (2004). Is the functional ‘normal'? Aging, sexuality and the bio-marking of successful living. Hist. Human Sci. 17, 53–75. doi: 10.1177/0952695104043584

Katz, S., and Marshall, B. L. (2018). Tracked and fit: FitBits, brain games, and the quantified aging body. J. Aging Stud. 45, 63–68. doi: 10.1016/j.jaging.2018.01.009

Keng, A., Brown, E. E., Rostas, A., Rajji, T. K., Pollock, B. G., Mulsant, B. H., et al. (2020). Effectively caring for individuals with behavioral and psychological symptoms of dementia during the COVID-19 pandemic. Front. Psychiatry 11, 573367. doi: 10.3389/fpsyt.2020.573367

Kenner, A. M. (2008). Securing the elderly body: dementia, surveillance, and the politics of “Aging in Place”. Surveill. Soc. 5, 1–18. doi: 10.24908/ss.v5i3.3423

Khan, S. S., Mishra, P. K., Javed, N., Ye, B., Newman, K., Mihailidis, A., et al. (2022). Unsupervised deep learning to detect agitation from videos in people with dementia. IEEE Access 10, 10349–10358. doi: 10.1109/ACCESS.2022.3143990

Knochel, K. A., and Flunker, D. (2021). Long-term care expectations and plans of transgender and nonbinary older adults. J App. Gerontol. 40, 1542–1550. doi: 10.1177/0733464821992919

Konetzka, R. T., White, E. M., Pralea, A., Grabowski, D. C., and Mor, V. (2021). A systematic review of long-term care facility characteristics associated with COVID-19 outcomes. J. Am. Geriatr. Soc. 69, 2766–2777. doi: 10.1111/jgs.17434

Koops, B.-J., Newell, B. C., Timan, T., Skorvanek, I., Chokrevski, T., and Galic, M. (2016). A typology of privacy. Uni. Penn. J. Int. Law 38, 483. Available online at: https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?article=1938&context=jil

Koren, M. J. (2010). Person-centered care for nursing home residents: the culture-change movement. Health Aff. 29, 312–317. doi: 10.1377/hlthaff.2009.0966

Lageson, S. E., McElrath, S., and Palmer, K. E. (2019). Gendered public support for criminalizing “revenge porn”. Fem. Criminol. 14, 560–583. doi: 10.1177/1557085118773398

Laroche, J. (2020). Tracking Device Reassures Son His Father Will be Safe in Assisted-Living Facility. CBC News. Available online at: https://www.cbc.ca/news/canada/nova-scotia/nursing-home-covid-tracking-technology-tenera-care-1.5806795 (accessed March 1, 2022).

Levy, K., Kilgour, L., and Berridge, C. (2018). Regulating privacy in public/private space: the case of nursing home monitoring Laws. Elder Law J. 26, 323–365. Available online at: https://theelderlawjournal.com/wp-content/uploads/2019/02/Levy.pdf

Li, Y., Harrington, C., Mukamel, D. B., Cen, X., Cai, X., and Temkin-Greener, H. (2015). Nurse staffing hours at nursing homes with high concentrations of minority residents, 2001-11. Health Aff. 34, 2129–2137. doi: 10.1377/hlthaff.2015.0422

Lukkien, D. R. M., Nap, H. H., Buimer, H. P., Peine, A., Boon, W. P. C., Ket, J. C. F., et al. (2021). Toward responsible artificial intelligence in long-term care: a scoping review on practical approaches. Gerontologist. doi: 10.1093/geront/gnab180. [Epub ahead of print].

Mack, D. S., Hunnicutt, J. N., Jesdale, B. M., and Lapane, K. L. (2018). Non-Hispanic Black-white disparities in pain and pain management among newly admitted nursing home residents with cancer. J. Pain Res. 11, 753–761. doi: 10.2147/JPR.S158128

Malik, H. M., Viljanen, M., Lepinkäinen, N., and Alvesalo-Kuusi, A. (2022). Dynamics of social harms in an algorithmic context. Int. J. Crime. Justice. Soc. Democracy. 11, 182–195. doi: 10.5204/ijcjsd.2141

Manor, S., and Herscovici, A. (2021). Digital ageism: a new kind of discrimination. Hum. Beh. Emerg. Technol. 3, 1084–1093. doi: 10.1002/hbe2.299

Mateescu, A. (2021). Electronic Visit Verification: The Weight of Surveillance and the Fracturing of Care. Data and Society. Available online at: https://datasociety.net/library/electronic-visit-verification-the-weight-of-surveillance-and-the-fracturing-of-care/ (accessed March 2, 2022).

Mauldin, L. (2019). The Care Crisis Isn't What You Think: Our Problems are Deeper than a Lack of Care Infrastructure. The American Prospect. Available online at: https://prospect.org/health/disability-care-crisis-isnt-what-you-think/ (accessed May 2, 2022).

McDonald, P., Williams, P., and Mayes, R. (2021). Means of control in the organization of digitally intermediated care work. Work. Employ. Soc. 35, 872–890. doi: 10.1177/0950017020969107

Menne, H. L., and Whitlatch, C. J. (2007). Decision-making involvement of individuals with dementia. Gerontologist 47, 810–819. doi: 10.1093/geront/47.6.810

Messing, J., Bagwell-Gray, M., Brown, M. L., Kappas, A., and Durfee, A. (2020). Intersections of stalking and technology-based abuse: emerging definitions, conceptualization, and measurement. J. Fam. Viol. 35, 693–704. doi: 10.1007/s10896-019-00114-7

Metcalf, J. (2018). When Verification is also Surveillance. Data and Society: Points. Available online at: https://points.datasociety.net/when-verification-is-also-surveillance-21edb6c12cc9 (accessed June 1, 2022).

Miceli, M., Posada, J., and Yang, T. (2022). Studying up machine learning data: why talk about bias when we mean power? Proc. ACM on Hum. Comp. Interact. 6, 1–14. doi: 10.1145/3492853

Miller, D. (2021). Smartphones and Contact-Tracing: Balancing Care and Surveillance. The Conversation. Available online at: https://theconversation.com/smartphones-and-contact-tracing-balancing-care-and-surveillance-161212

Miller, L. M., Whitlatch, C. J., and Lyons, K. S. (2016). Shared decision-making in dementia: a review of patient and family carer involvement. Dementia (London) 15, 1141–1157. doi: 10.1177/1471301214555542

Mitchell, G., Dupuis, S. L., Kontos, P., Jonas-Simpson, C., and Gray, J. (2020). Disrupting dehumanising and intersecting patterns of modernity with a relational ethic of caring. Int. Pract. Dev. J. 10, 1–15. doi: 10.19043/ipdj.101.002

Mor, V., Zinn, J., Angelelli, J., Teno, J. M., and Miller, S. C. (2004). Driven to tiers: socioeconomic and racial disparities in the quality of nursing home care. Milbank Q. 82, 227–256. doi: 10.1111/j.0887-378X.2004.00309.x

Mor, V., Zinn, J., Gozalo, P., Feng, Z., Intrator, O., and Grabowski, D. C. (2007). Prospects for transferring nursing home residents to the community. Health Aff. 26, 1762–1771. doi: 10.1377/hlthaff.26.6.1762

Mort, M., Roberts, C., and Callén, B. (2013). Ageing with telecare: care or coercion in austerity? Sociol. Health Illn. 35, 799–812. doi: 10.1111/j.1467-9566.2012.01530.x

Moss, E., Watkins, E., Singh, R., Elish, M. C., and Metcalf, J. (2021). Assembling Accountability: Algorithmic Impact Assessment for the Public Interest. Data and Society. Available online at: https://datasociety.net/wp-content/uploads/2021/06/Assembling-Accountability.pdf. (accessed July 27, 2022). doi: 10.2139/ssrn.3877437

Moyle, W., Jones, C., Murfield, J., and Liu, F. (2020). ‘For me at 90, it's going to be difficult': feasibility of using iPad video-conferencing with older adults in long-term aged care. Aging Ment. Health 24, 349–352. doi: 10.1080/13607863.2018.1525605

Murray, T., and Friedmann, J. (2020). Report. Nursing Home Safety During COVID: PPE Shortages. U.S. PIRG Education Fund and Frontier Group. Available online at: https://uspirg.org/feature/usp/nursing-home-safety-during-covid-ppe-shortages (accessed May 1, 2022).

NAS (2022). The National Imperative to Improve Nursing Home Quality: Honoring Our Commitment to Residents, Families, and Staff. Washington, DC: The National Academies Press.

Neven, L. (2015). By any means? Questioning the link between gerontechnological innovation and older people's wish to live at home. Technol. Forecast. Soc. Change 93, 32–43. doi: 10.1016/j.techfore.2014.04.016

Neven, L., and Peine, A. (2017). From triple win to triple sin: how a problematic future discourse is shaping the way people age with technology. Societies 7, 1–11. doi: 10.3390/soc7030026

NIA (2021). During the COVID-19 Pandemic Should I or My Family Member Go to Live with Family or Stay in the Long-Term Care or Nursing Home?. Available online at: https://static1.squarespace.com/static/5c2fa7b03917eed9b5a436d8/t/5e9e479d00ac2a433dc22a14/1587431327742/During+the+COVID-19+pandemic%2C+should+I+or+my+family+member+go.pdf (accessed May 1, 2022).

Niemeijer, A. R., Depla, M. F., Frederiks, B. J. M., and Hertogh, C. M. P. M. (2015). The experiences of people with dementia and intellectual disabilities with surveillance technologies in residential care. Nurs. Ethics 22, 307–320. doi: 10.1177/0969733014533237

Nissenbaum, H. (2009). Privacy in Context. Stanford: Stanford University Press. doi: 10.1515/9780804772891

Noble, S. U. (2018). Algorithms of Oppression. New York: New York University Press. doi: 10.2307/j.ctt1pwt9w5

Nordgren, A. (2018). How to respond to resistiveness towards assistive technologies among persons with dementia. Med. Health Care Philos. 21, 411–421. doi: 10.1007/s11019-017-9816-8

Obermeyer, Z., Powers, B., Vogeli, C., and Mullainathan, S. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science 366, 447–453. doi: 10.1126/science.aax2342

OLTCA (2019). This is Long-Term Care. Available online at: https://www.oltca.com/OLTCA/Documents/Reports/TILTC2019web.pdf (accessed June 1, 2022).

Ontario Health Coalition (2022). Media Release: Emergency Measures Needed to Address the Health Care Staffing Crisis. Available online at: https://www.ontariohealthcoalition.ca/index.php/release-emergency-measures-needed-to-address-the-health-care-staffing-crisis/ (accessed May 1, 2022).

Orlov, L. (2021). AI Technology Matters in the Care of Older Adults. Aging and Health Technology Watch. Available online at: https://www.ageinplacetech.com/blog/ai-technology-matters-care-older-adults (accessed March 3, 2022).

Osman, L., and Woolf, M. (2022). Some Nurses Lack Proper PPE Amid Omicron COVID-19 Surge, Union Says. Global News. Available online at: https://globalnews.ca/news/8510939/nurses-ppe-omicron-covid-19/ (accessed May 1 2022).

Oude Weernink, C. E., Felix, E., Verkuijlen, P. J. E. M., Dierick-van Daele, A. T. M., Kazak, J. K., and Van Hoof, J. (2018). Real-time location systems in nursing homes: state of the art and future applications. J. Enabling Technol. 12, 45–56. doi: 10.1108/JET-11-2017-0046

Paulin, E. (2021). COVID-19 Deaths in Nursing Homes Plummet, Staff and PPE Shortages Persist. AARP. Available online at: https://www.aarp.org/caregiving/health/info-2021/nursing-home-covid-deaths-down-shortages-continue.html (accessed June 25, 2022).

Peine, A., and Neven, L. (2018). From intervention to co-constitution: new directions in theorizing about aging and technology. Gerontologist 59, 15–21. doi: 10.1093/geront/gny050

Peine, A., and Neven, L. (2021). The co-constitution of ageing and technology: a model and agenda. Ageing Soc. 41, 2845–2866. doi: 10.1017/S0144686X20000641

PHI (2019). Brief. U.S. Nursing Assistants Employed in Nursing Homes: Key Facts. Available online at: https://www.phinational.org/resource/u-s-nursing-assistants-employed-in-nursing-homes-key-facts-2019/ (accessed May 2, 2022).

Poulsen, A., Fosch-Villaronga, E., and Burmeister, O. K. (2020). Cybersecurity, value sensing robots for LGBTIQ+ elderly, and the need for revised codes of conduct. Australas. J. Inform. Syst. 24, 1–16. doi: 10.3127/ajis.v24i0.2789

Redden, J., and Brand, J. (2020). Data Harm Record (Updated). Data Justice Lab. Available online at: https://datajusticelab.org/data-harm-record/ (accessed June 1, 2022).

Robinson-Lane, S. G., Block, L., Bowers, B. J., Cacchione, P. Z., and Gilmore-Bykovskyi, A. (2022). The intersections of structural racism and ageism in the time of COVID-19: a call to action for gerontological nursing science. Res. Gerontol. Nurs. 15, 6–13. doi: 10.3928/19404921-20211209-03

Rogers, W. A., and Fisk, A. D. (2010). Toward a psychological science of advanced technology design for older adults. J. Gerontology B 65B, 645–653. doi: 10.1093/geronb/gbq065

Rosales, A., and Fernández-Ardèvol, M. (2020). Ageism in the era of digital platforms. Convergence 26, 1074–1087. doi: 10.1177/1354856520930905

Scales, K. (2021). It is time to resolve the direct care workforce crisis in long-term care. Gerontologist 61, 497–504. doi: 10.1093/geront/gnaa116

Scales, K. (2022). Transforming direct care jobs, reimagining long-term services and supports. J. Am. Med. Dir. Assoc. 23, 207–213. doi: 10.1016/j.jamda.2021.12.005

Scalia, K. (2019). Electronic Visit Verification (EVV) Is Here: What You Need to Know and How To Get Involved. Available online at: https://disabilityvisibilityproject.com/2019/03/24/electronic-visit-verification-evv-is-here (accessed May 1, 2022).

Seberger, J. S., and Patil, S. (2021). Post-COVID public health surveillance and privacy expectations in the United States: scenario-based interview study. JMIR Mhealth Uhealth 9, e30871. doi: 10.2196/30871

Shew, A. (2020). Ableism, technoableism, and future AI. IEEE Technol. Soc. 39, 40–85. doi: 10.1109/MTS.2020.2967492

Shippee, T. P., Ng, W., and Bowblis, J. R. (2020). Does living in a higher proportion minority facility improve quality of life for racial/ethnic minority residents in nursing homes? Inn. Aging 4, igaa014. doi: 10.1093/geroni/igaa014

Shore, K. (2021). Targeting vulnerability with electronic location monitoring: paternalistic surveillance and the distortion of risk as a mode of carceral expansion. Crit. Criminol. 29, 75–92. doi: 10.1007/s10612-021-09558-0

Singleton, D. (2018). Study Shows Most LGBT Adults Worry About Discrimination in Senior Care Housing. Diverse Elders. Available online at: https://www.diverseelders.org/2018/06/28/study-shows-most-lgbt-adults-worry-about-discrimination-in-senior-care-housing/ (accessed April 1, 2022).

Sloane, P. D., Yearby, R., Konetzka, R. T., Li, Y., Espinoza, R., and Zimmerman, S. (2021). Addressing systemic racism in nursing homes: a time for action. J. Am. Med. Dir. Assoc. 22, 886–892. doi: 10.1016/j.jamda.2021.02.023

Smith, D. B., Feng, Z., Fennell, M. L., Zinn, J., and Mor, V. (2008). Racial disparities in access to long-term care: the illusive pursuit of equity. J. Health Pol. Policy Law 33, 861–881. doi: 10.1215/03616878-2008-022

Smuha, N. A. (2021). Beyond the individual: governing AI's societal harm. Internet Policy Rev. 10, 1–32. doi: 10.14763/2021.3.1574

Sojourner, A. J., Grabowski, D. C., Chen, M., and Town, R. J. (2010). Trends in unionization of nursing homes. Inquiry 47, 331–342. doi: 10.5034/inquiryjrnl_47.04.331

Sousa, I. (2013). “New technologies and concepts of care,” in Troubling Care: Critical Perspectives on Research and Practices, eds P. Armstrong and S. Braedley (Toronto, ON: Canadian Scholars' Press), 129–142.

Steele, L., Carr, R., Swaffer, K., Phillipson, L., and Fleming, R. (2020). Human rights and the confinement of people living with dementia in care homes. Health Hum. Rights 22, 7. Available online at: https://www.hhrjournal.org/2020/06/human-rights-and-the-confinement-of-people-living-with-dementia-in-care-homes/

Stevenson, M., Savage, B., and Taylor, B. J. (2019). Perception and communication of risk in decision making by persons with dementia. Dementia (London) 18, 1108–1127. doi: 10.1177/1471301217704119

Stypińska, J. (2021). “Ageism in AI: new forms of age discrimination in the era of algorithms and artificial intelligence,” in International Conference on AI for People: Towards Sustainable AI (Bologna, Italy). doi: 10.4108/eai.20-11-2021.2314200

Taati, B., Zhao, S., Ashraf, A. B., Asgarian, A., Browne, M. E., Prkachin, K. M., et al. (2019). Algorithmic bias in clinical populations: evaluating and improving facial analysis technology in older adults with dementia. IEEE Access 7, 25527–25534. doi: 10.1109/ACCESS.2019.2900022

The COVID Tracking Project (2022). Long-Term-Care COVID Tracker. Available online at: https://covidtracking.com/nursing-homes-long-term-care-facilities (accessed May 1, 2022).

The New York Times (2021). Nearly One-Third of U.S. Coronavirus Deaths are Linked to Nursing Homes. Available online at: https://www.nytimes.com/interactive/2020/us/coronavirus-nursing-homes.html (accessed April 1, 2022).

Thomas, K. S., and Mor, V. (2013). Providing more home-delivered meals is one way to keep older adults with low care needs out of nursing homes. Health Aff. 32, 1796–1802. doi: 10.1377/hlthaff.2013.0390

Timmons, S., Vezyridis, P., and Sahota, O. (2019). Trialling technologies to reduce hospital in-patient falls: an agential realist analysis. Sociol. Health Illn. 41, 1104–1119. doi: 10.1111/1467-9566.12889

Tremain, S. L. (2020). COVID-19 and the Naturalization of Vulnerability. Biopolitical Philosophy. Available online at: https://biopoliticalphilosophy.com/2020/04/01/covid-19-and-the-naturalization-of-vulnerability (accessed November 1, 2021).

Tremain, S. L. (2021). Why Nursing-Home Incarceration Must End. Biopolitical Philosophy. Available online at: https://biopoliticalphilosophy.com/2021/04/30/why-nursing-home-incarceration-must-end/comment-page-1/ (accessed May 14, 2022).

Trewin, S. (2018). AI Fairness for People With Disabilities. Available online at: https://arxiv.org/abs/1811.10670 (accessed November 2021).

Tufford, F., Lowndes, R., Struthers, J., and Chivers, S. (2017). ‘Call security': locks, risk, privacy and autonomy in long-term residential care. Ageing Int. 43, 34–52. doi: 10.1007/s12126-017-9289-3

U. S. Census Bureau (2018). ACS 2017 1-year Estimates, Characteristics of the Group Quarters Population by Group Quarters Type (3 Types). Available online at: https://factfinder.census.gov/bkmk/table/1.0/en/ACS/17_1YR/S2602 (accessed July 18, 2019).

van Doorn, N. (2017). Platform labor: on the gendered and racialized exploitation of low-income service work in the ‘on-demand' economy. Inform. Commun. Soc. 20, 898–914. doi: 10.1080/1369118X.2017.1294194

Vermeer, Y., Higgs, P., and Charlesworth, G. (2019). What do we require from surveillance technology? A review of the needs of people with dementia and informal caregivers. J. Rehabil. Assist. Technol. Eng. 6, 2055668319869517. doi: 10.1177/2055668319869517

Viljoen, S. (2021). Data Relations. Logic Magazine. Available online at: https://logicmag.io/distribution/data-relations/ (accessed May 22, 2022).

von Gerich, H., Moen, H., Block, L. J., Chu, C. H., DeForest, H., Hobensack, M., et al. (2022). Artificial intelligence-based technologies in nursing: a scoping literature review of the evidence. Int. J. Nurs. Stud. 127, 1–20. doi: 10.1016/j.ijnurstu.2021.104153

Wagner, L. M., Bates, T., and Spetz, J. (2021). The association of race, ethnicity, and wages among registered nurses in long-term care. Med. Care 59, S479–S485. doi: 10.1097/MLR.0000000000001618

Werner, R. M., Hoffman, A. K., and Coe, N. B. (2020). Long-term care policy after Covid-19: Solving the nursing home crisis. N. Engl. J. Med. 383, 903–905. doi: 10.1056/NEJMp2014811

Wetsman, N. (2020). Senior Sensors: Digital Contact Tracing Wasn't Up for Debate in Senior Living Facilities. The Verge. Available online at: https://www.theverge.com/21509117/contact-tracing-apps-digital-senior-nursing-homes (accessed January 1, 2022).

Whitlatch, C. J., Feinberg, L. F., and Tucke, S. S. (2005). Measuring the values and preferences for everyday care of persons with cognitive impairment and their family caregivers. Gerontologist 45, 370–380. doi: 10.1093/geront/45.3.370

Whittaker, M., Alper, M., Bennett, C. L., Hendren, S., Kaziunas, L., and Mills, M. (2019). Disability, Bias, and AI. AI Now Institute. Available online at: https://ainowinstitute.org/disabilitybiasai-2019.pdf (accessed December 1, 2021).

Wigg, J. M. (2010). Liberating the wanderers: using technology to unlock doors for those living with dementia. Sociol. Health Illn. 32, 288–303. doi: 10.1111/j.1467-9566.2009.01221.x

Williams, P., McDonald, P., and Mayes, R. (2021). Recruitment in the gig economy: attraction and selection on digital platforms. Int. J. Hum. Resour. Manage. 32, 4136–4162. doi: 10.1080/09585192.2020.1867613

Williams, R. M. (2019). Metaeugenics and metaresistance: from manufacturing the ‘includeable body' to walking away from the broom closet. Can. J. Child. Rights 6, 60–77. doi: 10.22215/cjcr.v6i1.1976

Williams, R. M., and Gilbert, J. E. (2020). Perseverations of the academy: a survey of wearable technologies applied to autism intervention. Int. J. Hum. Comput. Stud. 143, 102485. doi: 10.1016/j.ijhcs.2020.102485

Wilson, K., Kortes-Miller, K., and Stinchcombe, A. (2018). Staying out of the closet: LGBT older adults' hopes and fears in considering end-of-life. Can. J. Aging 37, 22–31. doi: 10.1017/S0714980817000514

Wojtusiak, J., Asadzadehzanjani, N., Levy, C., Alemi, F., and Williams, A. E. (2021). Computational Barthel Index: an automated tool for assessing and predicting activities of daily living among nursing home patients. BMC Med. Inform. Decis. Mak. 21, 17. doi: 10.1186/s12911-020-01368-8

Wood, A. J. (2021). Algorithmic Management Consequences for Work Organisation and Working Conditions. JRC Working Papers Series on Labour, Education and Technology. Available online at: https://joint-research-centre.ec.europa.eu/publications/algorithmic-management-consequences-work-organisation-and-working-conditions_en

World Health Organization (2022). Ageism in Artificial Intelligence for Health. Geneva. Available online at: https://www.who.int/publications/i/item/9789240040793 (accessed April 20, 2022).

Zagrodney, K., and Saks, M. (2017). Personal support workers in Canada: the new precariat? Healthc. Policy 13, 31–39. doi: 10.12927/hcpol.2017.25324

Zhu, Y., Song, T., Zhang, Z., Deng, C., Alkhalaf, M., Li, W., et al. (2022). Agitation prevalence in people with dementia in Australian residential aged care facilities: findings from machine learning of electronic health records. J. Gerontol. Nurs. 48, 57–64. doi: 10.3928/00989134-20220309-01

Zwijsen, S. A., Depla, M., Niemeijer, A. R., Francke, A. L., and Hertogh, C. (2012). Surveillance technology: an alternative to physical restraints? A qualitative study among professionals working in nursing homes for people with dementia. Int. J. Nurs. Stud. 49, 212–219. doi: 10.1016/j.ijnurstu.2011.09.002

Zwijsen, S. A., Niemeijer, A. R., and Hertogh, C. M. P. M. (2011). Ethics of using assistive technology in the care for community-dwelling elderly people: an overview of the literature. Aging Ment. Health 15, 419–427. doi: 10.1080/13607863.2010.543662

Keywords: artificial intelligence, technology, dementia, big data, older adults, long-term care, machine learning, privacy

Citation: Berridge C and Grigorovich A (2022) Algorithmic harms and digital ageism in the use of surveillance technologies in nursing homes. Front. Sociol. 7:957246. doi: 10.3389/fsoc.2022.957246

Received: 30 May 2022; Accepted: 26 August 2022;
Published: 16 September 2022.

Edited by:

Stephen Katz, Trent University, Canada

Reviewed by:

Vera Gallistl, University of Vienna, Austria
Nicole Dalmer, McMaster University, Canada

Copyright © 2022 Berridge and Grigorovich. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Clara Berridge, clarawb@uw.edu

These authors have contributed equally to this work and share first authorship