
POLICY AND PRACTICE REVIEWS article

Front. Virtual Real., 06 October 2025

Sec. Virtual Reality and Human Behaviour

Volume 6 - 2025 | https://doi.org/10.3389/frvir.2025.1645330

This article is part of the Research Topic “Unlocking the potential of XR: Shaping a pro-social metaverse”.

Shaping the future: principles for policy recommendations for responsible innovation in virtual worlds

Alicia G. Cork1, Mike Richardson1*, Heide K. Lukosch2, Mohamed Khamis3, Christopher Katins4, Veronika Krauß5, Lauren Ruffin6, Sarah Papazoglakis7, Victoria Sanchez8, Xueni Pan9, Michael J. Proulx1, Danaë Stanton Fraser1
  • 1Department of Psychology, University of Bath, Bath, United Kingdom
  • 2Faculty of Engineering, University of Canterbury, Christchurch, New Zealand
  • 3School of Computing Science, University of Glasgow, Glasgow, United Kingdom
  • 4Department of Computer Science, Humboldt-Universität zu Berlin, Berlin, Germany
  • 5University of Applied Sciences Ansbach, Ansbach, Germany
  • 6Arizona State University, Tempe, AZ, United States
  • 7Independent Contributor, San Francisco, CA, United States
  • 8Independent Researcher, Ansbach, Germany
  • 9Department of Computing, Goldsmiths, University of London, London, United Kingdom

Extended Reality (XR) technologies are beginning to enter the mainstream and have the potential to change the way humans interact with computers on a global scale. As with all powerful tools, XR not only has the potential for enormous good but also brings a new set of challenges for policy to guide its innovation and development. This paper draws on a wide range of expertise from academia, research and development, and industry to collectively provide guiding principles for XR policy. The authors began discussions and developed a framework at a workshop at ACM SIGCHI 2024; the ideas presented here are the result of debate, discussion, and refinement, and offer next steps for the development of pervasive XR. We present three main principles for XR policy: Trust, Agency, and Inclusivity, along with a cross-cutting theme of Future-Proofing. Each principle is broken down, and we offer example implementations. This paper aims to build upon previous work and efforts towards fair and equitable XR for all, and to further dialogue towards tangible changes in policy that help guide responsible innovation in virtual worlds.

Introduction

The rapid advancement of Extended Reality (XR) technologies introduces new challenges that necessitate thoughtful policy development guiding its design and implementation. XR encompasses a spectrum of immersive three-dimensional (3D) technologies, including: Virtual Reality (VR), where users are fully immersed in a virtual environment typically through a headset; Augmented Reality (AR), where virtual information is overlaid onto the physical world; and Mixed Reality (MR), which blends elements of both VR and AR, using the physical world to augment the virtual experience (Milgram and Kishino, 1994). The transformative potential of XR across industries like education (Lee et al., 2022), healthcare (Wang et al., 2022), warfare (Baughman, 2022), entertainment (Han et al., 2022), and gaming has been discussed extensively in recent years. However, today’s rapid technological advancements and high-profile investments have pushed XR towards the mainstream (Kang et al., 2024), meaning that we are at a critical juncture where these technologies are poised to enter daily life on an unprecedented scale (e.g., through the form factor of smart glasses).

Unlike traditional media interfaces, XR technologies directly mediate human perception through head-mounted displays which track eye movements, map physical environments, and overlay digital content onto reality (Milgram and Kishino, 1994). These hallmark features of XR - always-on cameras, biometric data collection, and hyper-realism - exacerbate traditional 2D online harms such as harassment and abuse, racial inequality, and dark patterns (design that draws on behavior models to add functionality not in the user’s best interest; see Gray et al., 2018), whilst presenting new challenges in the form of privacy, security, and perceptual manipulation. Face, body, and environmental sensors used to track movements and reactions collect a vast amount of data that can reveal deeply personal information about users’ emotional states, cognitive processes, and even medical conditions, as well as about their private spaces (e.g., Miller et al., 2020). The hyper-realism achievable in virtual worlds can blur the line between reality and simulation, raising essential questions about the authenticity of user experiences and the power of these technologies for exploitation (Chalmers, 2022; Slater et al., 2020). Previous psychological research has demonstrated the lasting impact of realistic virtual environments on individuals’ emotional states, social beliefs, and behaviors. These capabilities create dual potential: XR has been used effectively for therapeutic goals (e.g., phobia treatment; Freeman et al., 2017), cognitive training (Papaioannou et al., 2022), and reducing social prejudice (Farmer and Maister, 2017), yet the same mechanisms could be harnessed for less altruistic purposes, such as profiling in social VR (Tricomi et al., 2023) or manipulating virtual-physical perception for harm (Tseng et al., 2022).

Regulatory principles promote innovation while also placing conceptual guardrails for teams, helping industry professionals understand how to build future technologies for social good rather than only focusing on what to avoid (Stilgoe et al., 2013). Agreeing on regulatory principles for XR technologies is necessary to provide a blueprint for the future of XR development. Given the “unknown unknowns” inherent in this rapidly evolving space, principles are crucial for embedding “by-design” thinking into XR technologies and virtual worlds from the outset of product development. The regulatory principles explored in this paper can be used to guide more specific policy regulations or more localized company principles. Currently, there is a lack of standardization across governments, companies, and independent research bodies as to what XR technologies should look like (Makamara and Adolph, 2022). Initiatives like the “XR Guild”1 motivate designers of XR products to adhere to self-developed ethical principles but address the individual, not the organizational or societal, level. This lack of standardization applies to both technical and regulatory standards (Yang, 2023), with the exception of a very few initiatives like the XR Safety Initiative’s privacy and safety framework (XRSI, 2020), which provides guidance on research, design, and leadership for privacy in XR.

This paper develops principles upon which regulations could be built, focusing on minimizing the risks of these powerful new technologies, whilst simultaneously aiming to maximize their social benefits (Figure 1). These principles and associated ideas were generated through author collaboration at an interdisciplinary, multistakeholder workshop held at the annual ACM SIGCHI conference in 2024 (for the complete workshop proposal and schedule, see Richardson et al., 2024). We negotiated three primary principles upon which we agreed that future technologies should be built, to encourage responsible innovation within this area: i) trustworthy XR, ii) agency and autonomy of users and non-users, and iii) inclusion as a design process and framework. We also embedded future-proofing across the principles, with each principle focused on being sustainable, adaptable, and flexible enough to adapt to the unknown future direction of XR. The principles outlined in this paper focus on an idealized version of what the future could look like–a future defined by trust, agency, and inclusivity. While these principles do not offer an exhaustive roadmap for XR development, each step contributes to responsibly building the shared future we outline in this work.

Figure 1. Principles for extended reality (XR) policy.

Assessment of policy options and implications for XR governance

Principle 1: trustworthy XR

Trust manifests in various forms. At its core, trust refers to the assured reliance on the character, ability, strength and truth of someone or something (Merriam-Webster Dictionary, 2025). As Kroeger (2020) notes, all technology depends on trust by virtue of the fact that “technology” is defined by its complexity, with users not needing to know the precise workings, instead only expecting that opaque mechanisms will bring about desired outcomes. Whilst all technologies can be said to rely on trust mechanisms, XR technology faces particular vulnerabilities in terms of the uncertainties around its unique affordances. These include: privacy and security challenges associated with the vast capture and processing of sensitive biometric and spatial data; authenticity challenges associated with avatar representations and the origination of content; platform governance questions surrounding the use of data for emotion recognition or behavioral prediction; risks of psychological addiction; and significant gaps in understanding about the short- and long-term effects of a technology which fundamentally aims to manipulate human perception. Creating trustworthy sociotechnical systems requires processes which reduce the need for “blind trust”, promoting transparency, accountability, and reliability in the technology.

A fundamental factor required to instill and embed trust in sociotechnical systems is transparency - ensuring that operations of an XR system are visible and understandable to users. Transparency spans many important areas in XR technology, from advertising content disclosure, to research on the positive and negative longer-term impacts of technological use, and data handling practices. Fundamentally, transparency allows users and regulators to “look under the hood” and understand the power dynamics controlling XR technology.

One of the key challenges associated with embedding transparency into XR design is the difficulty of balancing user experience with privacy, security, and safety concerns. With XR experiences aiming to deliver fully immersive perceptual manipulations, requiring regular consent from users for the tracking of data (similar to cookie notices online) fundamentally disrupts the intended purpose of the technology. Additionally, with XR at the cutting edge of technological development, data that is collected and processed today may in future become much more sensitive due to advances in artificial intelligence (AI) tools for data processing. Whilst this challenge surrounding the increased capabilities of AI processing is present for other two-dimensional (2D) technologies, the volume of data XR systems must collect to operate successfully multiplies this risk significantly. Thus, design principles for responsible innovation in XR must tread a challenging line between preparing for future advancements in technology whilst simultaneously grappling with the business models and priorities of the current day.

Below we outline three key dimensions of trustworthy XR: i) privacy and security challenges, ii) the authenticity of users and content, and iii) the longitudinal impacts of the technology.

First, a trustworthy system prioritizes privacy, protecting sensitive data and respecting both user and bystander confidentiality (Harborth and Pape, 2021). The convergence of immersive technology capabilities with inadequate transparency mechanisms creates visceral privacy violations where users' most intimate biometric, spatial, and behavioral data is collected continuously without their awareness or meaningful consent. Unlike conventional interfaces where data collection occurs through visible forms or clicks, XR devices continuously capture involuntary biometric data (e.g., eye movements, pupil dilation, micro-expressions, and body language) that users cannot consciously control or even perceive being collected (McGill, 2021; Plopski et al., 2022). The lack of conscious control or perceptibility of this tracking raises significant questions for transparency, in terms of how to educate, inform and ensure that users are aware and understand how the technology operates.

Additionally, the spatial mapping capabilities of AR devices create persistent 3D models of private spaces, capturing not just user data but environmental information about homes, workplaces, and bystanders without explicit consent mechanisms (O’Hagan et al., 2023; Syed et al., 2022). The collection of this significant volume of data presents challenges that extend beyond questions of individual-level privacy into collective challenges that must be addressed at a societal level (e.g., Abraham et al., 2024; McGill, 2021; Pahi and Schroeder, 2023). In addition, the increased vulnerability of both users and bystanders, especially in terms of the potential misuse of data (Corbett et al., 2024), has profound impacts on marginalized communities, who have historically been disproportionately harmed by data collection and privacy practices in the public and private sectors (Electronic Privacy Information Center, 2024).

Whilst the importance of privacy and security within XR technologies is not contested, the appropriate implementation of these values comes with challenges and trade-offs. For example, sensitive data is frequently used for identity authentication, e.g., using fingerprint technology to authorize digital payment systems. The key to privacy-preserving XR is to find the balance between privacy-enhancing mechanisms and allowing the functionality of the device (Wilson et al., 2024). Prioritizing privacy early in the development and design of the technology can embed trust in XR systems, whereby companies are transparent about the collection and handling of data and use clear data governance principles to guide the responsible collection, usage, and sharing of information.

Second, a trustworthy system promotes authenticity, verifying the validity of information and interactions. Authenticity in XR can be a difficult concept, as advances in the technology lead to greater “perceptual realism” (Slater et al., 2020), resulting in key philosophical questions regarding the “realness” of immersive experiences (Chalmers, 2022). Specifically, it is believed that in time AR will “empower users, communities, businesses, governments and others to alter, augment, diminish or otherwise mediate our perception of reality” (O’Hagan et al., 2023). The notion of authenticity therefore becomes confounded by an understanding of what “real” and “authentic” are in digitally enhanced experiences, as presence and immersion might cause long-term effects on the user’s physical body and mental state (Krauss et al., 2024; Slater et al., 2020). Additionally, the real-time rendering of virtual objects in physical spaces through AR raises fundamental questions about content authenticity; transparency about whether visual information originates from advertisers, platforms, other users, or malicious actors attempting perceptual manipulation is central to protecting the perceptual realities of users (Krauss et al., 2024).

At a fundamental level, there is a moral obligation that information and interactions within the metaverse can be relied upon to make judgments about the world, or alternatively are delineated as being for entertainment purposes only. Whilst some researchers have called for lower realism in virtual worlds, encouraging a stronger delineation between the real and the virtual (e.g., Colburn et al., 2024), here we propose the importance of transparency, such that it is clear where information has originated (advertiser, platform, another user, etc.). In some cases, it is vitally important for the effective operation of the technology to successfully merge the real and virtual. For example, training simulations are likely more representative with realistic environments, whereas lower realism can reduce task complexity (Ragan et al., 2015); a key issue when using XR for simulating complex tasks. By prioritizing transparency, rather than reducing functionality, we can hope to maintain the benefits of realism in virtual worlds, whilst simultaneously acknowledging the potential of the technology to be co-opted for nefarious or anti-social purposes (for a more detailed discussion on the ethics of realism, see Slater et al., 2020).

Third, a trustworthy system is built upon evidence-based decision-making, where policies and practices are grounded in reliable, scientifically validated information rather than conjecture or bias. This is especially relevant when dealing with rapidly evolving XR technologies that offer powerful affordances like immersive sensory input, embodied interaction, real-time behavioral tracking, and spatially anchored experiences. Longitudinal and open research is critical to ensuring XR technologies are designed with evidence about their potential effects at the forefront. However, the rapid development of XR technologies makes longitudinal studies notoriously difficult, with hardware specifications and software updates changing between study waves. To address this, the field must embrace open science practices adapted to long-term designs, such as modular pre-registration, phased data sharing, and transparent reporting, to enhance replicability and guard against selective reporting (e.g., Petersen et al., 2024). Moreover, XR’s built-in data capture capabilities offer promising avenues for longitudinal tracking, provided that standardized, scalable frameworks are developed and data usually held in industry silos is shared with independent researchers. Only through sustained, methodologically rigorous longitudinal evidence can policymakers and practitioners ensure that XR’s immersive affordances translate into safe, effective, and equitable technology outcomes.

Principle 2: autonomy and agency in XR

Autonomy and agency ensure that users retain control over their immersive experiences and personal data. As XR technologies become more advanced and pervasive, the potential for manipulation and exploitation increases. This manipulation could take many different forms, including manipulating users towards decisions and choices that may not be in their best interests, manipulating users to purchase specific products, and perhaps more significantly a wider manipulation of sociopolitical realities (Krauss et al., 2024). As noted in the trust section, it is crucial to support transparent practices to protect both user interests and the shared integrity of cross-reality environments, with an increased focus on authentication and authenticity.

Building on the transparency requirements needed to instill trust, as discussed above, autonomy and agency refer to the ability of users and non-users to make informed decisions and to enact those decisions. One common example in this domain is the availability of clear opt-out choices for data tracking and storage, although other innovative personalization choices, such as control over how close other avatars can approach in VR settings or whether one’s physical appearance can be digitally altered in AR settings, must also be promoted and designed to be easy to use (Fiani et al., 2024). For XR technology to succeed, users must be able to navigate and utilize virtual environments on their own terms. Individuals, whether they are users or non-users, should be given meaningful choices and setting options to personalize their own virtual experiences in relation to themselves, their environment, and others. Examples of meaningful choices include how users present themselves online, what data is collected about them and how it is used, as well as more specific options such as how close other individuals can approach (e.g., Meta’s personal boundary system) and how they can interact with each other.
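As an illustration of the kind of user-facing control described above, a per-user minimum-distance setting could be enforced with a simple proximity check each frame. The following Python sketch is a hypothetical illustration loosely inspired by systems such as Meta’s personal boundary; the class, default radius, and function names are our own assumptions, not any platform’s actual API.

```python
from dataclasses import dataclass
import math

@dataclass
class BoundaryPreference:
    """A user-chosen personal boundary (names and defaults are illustrative)."""
    radius_m: float = 1.2   # minimum distance other avatars may approach
    enabled: bool = True    # users retain the agency to switch it off

def violates_boundary(user_pos, other_pos, pref: BoundaryPreference) -> bool:
    """Return True if another avatar is inside this user's boundary.

    Positions are (x, z) coordinates on the horizontal plane, in meters.
    """
    if not pref.enabled:
        return False
    dx = user_pos[0] - other_pos[0]
    dz = user_pos[1] - other_pos[1]
    return math.hypot(dx, dz) < pref.radius_m

# A platform would run this check each frame and, for example, halt or
# fade out the approaching avatar rather than silently allow the intrusion.
print(violates_boundary((0.0, 0.0), (0.5, 0.5), BoundaryPreference()))  # True: ~0.71 m away
print(violates_boundary((0.0, 0.0), (2.0, 2.0), BoundaryPreference()))  # False: ~2.83 m away
```

The point of the sketch is that the boundary radius and the on/off switch are user-held settings, not platform-held ones, which is what distinguishes a meaningful choice from a default imposed on the user.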

In their 2025 review, Huang and colleagues focused on more domain-specific examples of personalization, systematically evaluating current developments in personalized smart immersive XR environments (PSI-XR; Huang et al., 2025). They identified personalization options spanning multiple domains, including e-commerce, entertainment and art, education, sports and training, healthcare, and avatar creation. Specifically, they note personalization opportunities such as games which incorporate user fears and responses, personalized healthcare virtual agents which respond and adapt to user inputs, and personalized assistance in sports training which operates through evaluation of user motion. Huang and colleagues also detail a number of data collection mechanisms and processing capabilities that can be used to enhance this personalization, including eye-tracking and blinking to determine user attention and interest; facial expression recognition to capture emotional states and perceived difficulty; natural language processing (NLP) to understand users’ intentions, feedback, and emotional state; and physiological signals to assess stress. Whilst all these data collection and processing methods provide an incredible ability to personalize and change the immersive experience of the user, there is little discussion with regard to the agency of the user in determining how, or whether, their experience is being mediated by imperceptible bodily cues. If users are unaware of the personalization occurring within their own XR experiences, this opens the door to significant asymmetrical power imbalances between the users and developers of the technology.

XR technology should enable users to receive a valuable and enjoyable XR experience, independent of their choices. This includes XR designers ensuring that only designated users have access to specific content, and that content is aligned with users’ needs and abilities (XRSI, n.d.). Thus, protected user groups may have different choices available from the outset. Other examples include vision-based personal settings, such as the ability to choose whether the surrounding world is “passed through” as the background of the XR application or the virtual environment fully envelopes the user. Socio-economic aspects of incorporating and repurposing physical or virtual property, as well as public or private spaces, need to be considered in this regard, to ensure the autonomy of non-users in XR experiences (Carter and Egliston, 2020).

It is essential to strike a delicate balance between offering users automated methods to protect their privacy, safety, and security, while also preserving their agency to make informed decisions. For instance, in a recent study by Fiani et al. on automated embodied moderation for safeguarding children in Social VR, parents, children, and experts in child psychology and online safety emphasized the critical importance of maintaining children’s agency when developing automated tools to combat harassment in social VR environments (Fiani et al., 2024). Ensuring that users have both the tools and the freedom to navigate XR environments on their terms is fundamental to fostering a safe, equitable, and empowering digital future.

Additionally, at the more extreme end of setting options is the ability to opt in to or opt out of different aspects of the technology, especially for non-users or bystanders. Whilst it is commonly accepted that opting out of particular features is likely to alter the immersive experience, e.g., choosing not to share detailed motion data may affect embodiment processes, innovative solutions should be developed to address legitimate user concerns whilst maximizing user experience (i.e., storing motion data locally, rather than transferring the data to a third party). Of note, whilst users of XR technologies may have consented to using the technology (whether actively or through passively accepting terms of service), bystanders have no control over others’ usage, despite the fact that they may be inadvertently surveilled or included in XR experiences (e.g., as a live canvas). Whilst similar concerns have arisen in the use of other Internet of Things technologies–e.g., smart doorbells tracking the movements of passersby–XR, and particularly AR, devices pose a more significant threat to bystander privacy by virtue of the unbounded social context in which AR devices may be used, and the power of the AR headset user to utilize bystander data in real-time (e.g., revealing insights about the bystander; O’Hagan et al., 2022).

Agency concerns not only what individuals can do when using XR technologies, but also who they can be. Self-presentation and virtual representation are key to promoting agency and autonomy, as they allow individuals to choose digital representations that enable authentic self-expression. In virtual worlds, avatars act as users’ interfaces, embodying expressions and movements, and commonly serving as an extension of the self (Manninen and Kujanpaa, 2007). As a result, individuals often develop strong emotional attachments to their avatars (e.g., Cork et al., 2025). Importantly, research on the Proteus effect has demonstrated how behavior and self-perception can be influenced by the characteristics and appearance of avatar representations (e.g., Yee and Bailenson, 2007). It is therefore psychologically important for users to have autonomy and agency over how they present themselves in virtual spaces.

Additionally, XR technologies offer considerable creativity in self-presentation, enabling experimentation with gender expression, the choice to conceal or disclose certain features, or the decision to make visible features or disabilities that are usually invisible (e.g., autism; Freeman and Maloney, 2021). The extent of customization in XR environments may vary depending on the context: for instance, workplaces might mandate that employees use realistic representations of themselves for professional integrity, whereas social platforms might encourage users to experiment more freely with their avatars. In either case, it is important that users are able to selectively convey their identity and navigate their own personal privacy boundaries.

One of the key challenges within discussions of consent and agency in XR technologies relates to situations where users may not be acting in their own best interests. For instance, users might turn to these immersive technologies as an escape from their reality, leading to problematic patterns of overuse or excessive reliance on virtual environments. This raises ethical questions about the responsibility of developers and regulators in protecting vulnerable users from potential harm. While it is important to respect individuals' choices and autonomy, there must also be safeguards in place to prevent exploitation and ensure that users are not encouraged to engage in self-destructive behaviors. This may involve incentivizing business models that are not dependent on maintaining attention. By attempting to align commercial interests with wider individual and societal benefits, it may be possible to establish a new blueprint for XR technologies that builds upon and learns from the challenges faced by Web 2.0 technologies. Importantly, balancing the freedom to explore XR experiences with protective measures against addiction and psychological harm is a complex but necessary aspect of the ongoing discourse on consent and agency in the digital age.

Principle 3: inclusive design

XR technologies are currently at a pivotal point, where new iterations of devices are novel and large design changes are being implemented rapidly, such as built-in face and eye tracking, or reduced form factors in smart glasses for augmented reality. This pivotal point presents an opportunity to design for inclusion and accessibility as a fundamental component, rather than as an adaptation added on later, as happened with devices such as smartphones (Mott et al., 2019). At its heart, accessibility is about creating options for users and anticipating their distinct needs, such that users do not need to generate their own solutions to participate in and benefit from XR experiences. For example, for users with vision impairments, presenting immersion options that rely instead on tactile or auditory feedback could allow them to participate in and use XR technology without feeling that their experience is sub-optimal, or that they are responsible for finding alternative options.

XR devices, particularly those targeting augmented or mixed realities, have incredible potential not only to be designed for inclusion and accessibility from the outset but also to act as general-purpose assistive devices. For example, XR can display live subtitles or speech bubbles as people around a deaf user communicate, or convert the real environment into a high-contrast version for users with low vision. Such features could transform daily experiences for users with disabilities, making the world more navigable and interactive for everyone. Furthermore, because each person’s XR is personal, individuals can have access to different levels of information depending on their individual needs. This is a tricky area for policymakers to navigate, as different people may require different amounts of data from others to meet their accessibility needs. In the deaf-user example, implementation would require voice recognition and real-time transcription from AI, but bystanders may not consent to such mass recording. It is worth mentioning that previous work has examined public preferences regarding recording for accessibility; this is generally accepted if the purpose is accessibility and only information that a sighted person would have access to is conveyed (Ahmed et al., 2018). In terms of practical implementation, this might take the form of opting in to allow one’s voice to be recognized specifically for the purpose of others’ accessibility, or having XR devices prompt for permission when a nearby individual requests audio transcription. A 2022 white paper on the ethics of XR development goes into more detail regarding the aforementioned inclusion designs and recommendations (Fox and Thornton, 2022).
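The opt-in arrangement described above could, in its simplest form, reduce to a two-condition permission check: audio is transcribed only when the wearer has a registered accessibility need and the nearby speaker has opted in. The following Python sketch is a hypothetical illustration; the consent registry and identifiers are assumptions made for the example, not part of any real device policy.

```python
# Speakers who have opted in to having their voice transcribed for
# others' accessibility (identifiers are hypothetical).
consent_registry = {"speaker_42"}

def may_transcribe(speaker_id: str, wearer_needs_captions: bool) -> bool:
    """Both conditions must hold before any audio is processed:
    the wearer has an accessibility need, and the speaker consented."""
    return wearer_needs_captions and speaker_id in consent_registry

print(may_transcribe("speaker_42", wearer_needs_captions=True))   # True
print(may_transcribe("speaker_99", wearer_needs_captions=True))   # False: no consent on record
```

In a deployed system the hard problems would be identity, revocation, and where the check runs (on-device versus in the cloud), but the policy shape, conjunction of need and consent, stays the same.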

While inclusion and accessibility often feature as key principles in other policy frameworks (e.g., Dick, 2021), they are not yet embedded as fundamental within design processes. For example, VR developers reported that they had no experience implementing accessibility features when designing a project, nor were they aware of relevant accessibility guidelines to follow when designing VR products (Zhao et al., 2019). This gap highlights a critical need for structured policies and educational initiatives to ensure developers are equipped to create inclusive technologies.

There are many antecedents to ensuring accessibility and inclusivity even before considering XR design, all of which are sociotechnological or socioeconomic issues in their own right, and beyond the XR focus of this present paper. However, it is worth briefly touching on the main ones, to acknowledge that a society cannot implement an inclusive and accessible XR without first addressing these issues. High-speed internet is a critical infrastructure for XR technology, and its availability is often taken for granted in many places. However, there are significant disparities in internet access worldwide. If XR becomes pervasive in the same way smartphones have, internet infrastructure will need to be upgraded in many regions. Without such upgrades, a power imbalance will emerge between areas with easy access to high-speed internet and those without. This imbalance is already present but will be exacerbated when technology modulates perception of the physical world. Current disparities will become more pronounced, affecting not just access to technology but also education, economic opportunities, and social inclusion. Device cost is another major factor, as XR devices, mobile computing, or computing in general is expensive and cannot be afforded by all. The cost of technology does tend to decrease over time. So, while the cost of brand new, state-of-the-art devices will be initially high, previous iterations of devices become cheaper and retain useful functionality and provide access to XR at a lower cost. This is a sociotechnical balance that exists today, but as with many of the examples in this document, when the dominate technology influences our actual perception of the world, any imbalance of access will be greatly exacerbated. Key to the inclusive design of XR technology is the need for equity and fairness. Diverse design teams and user-centered design approaches that involve diverse users and stakeholders are essential to reduce design bias. 
An inclusive XR technology that is fair and equitable has the potential to serve the broader population and enhance equity within society by providing additional support and connectivity to those who are presently underserved and/or marginalized. XR (both software and hardware) should be designed and implemented to suit everyone who chooses to use it. A key feature of the Principle of Inclusion is that it should be built into XR from the outset, rather than added post hoc, as is often the trend with emerging technology (Brulé et al., 2019; Botelho, 2021). Furthermore, certain technology is already required to participate actively in social norms, such as two-factor authentication and email, and to a lesser extent, social media (Robertson et al., 2023). Because XR is a more powerful technology, discrepancies in access to devices or software may create inequity. Therefore, policy should legislate for fair and equal access to XR and provide grants or support for communities that need it.

An inclusive XR technology must have cultural specificity. The XR research and development fields are predominantly M-WEIRD (Male, Western, Educated, Industrialized, Rich, and Democratic) (Henrich et al., 2010; Peck et al., 2021). XR is generally developed by M-WEIRD researchers and engineers and tested and evaluated on M-WEIRD participants (Linxen et al., 2021; Peck et al., 2020; Seaborn et al., 2023), despite this group representing a minority in terms of the global population. To make XR technology inclusive, it is important that policy frameworks encourage diverse representation in development teams and consider cultural contexts in design and implementation. This means actively involving individuals from varied backgrounds and ensuring that XR applications are adaptable to different cultural norms and practices.

Inclusive XR technology must conform to the highest standard of human rights. The Universal Declaration of Human Rights (UDHR), established by the United Nations, provides a common standard for individual rights and has been globally adopted and integrated into most major constitutions and treaties (Countries OHCHR, 2025). However, there is no complete consensus on human rights standards, leading to regional and global differences (Ahdanisa and Rothman, 2021). XR technologies, by increasing global interpersonal connections, could amplify these disparities. There is ongoing debate about whether XR requires its own declaration of human rights or if existing conventions can be applied (Charamba, 2022; Cobansoy Hizel, 2023). Therefore, XR policy should consider both the fundamental principles of the UDHR and the potential for varying implementations of these rights worldwide.

An inclusive XR technology must promote societal wellbeing and contribute to health, education, and quality of life for all. If XR becomes a pervasive form of media embedded in society, it is crucial that it enhances society and increases wellbeing across multiple sectors. As with smartphone technology, XR has the potential to influence societal wellbeing in both directions, and it is the role of policy to protect against harmful consequences. Like smartphones, XR can offer people greater access to exercise, health metrics, and meditation, and increase connectivity between loved ones (Fertleman et al., 2018; Navarro-Haro et al., 2017; Noah and Das, 2021), but it can also increase stress, cause negative experiences, and spread online toxicity (Lavoie et al., 2021; Wiederhold, 2022).

Inclusive XR technology must be democratic and include a wide range of voices and experiences in decision-making processes, including those from traditionally disadvantaged backgrounds and groups. Focus groups, participatory workshops and working groups, and public engagement activities can help ensure that the development and use of XR is democratic for all. Policy should provide frameworks for accountability to ensure that XR is for all and maintain transparent decision-making processes.

An inclusive XR technology must be collaborative. The development of XR technology should be a collaborative process, involving stakeholders from different domains, including government, industry, academia, and the public. Collaboration is key to responsible innovation and to ensure the usability of XR applications for all. Through collaborative design and development, stakeholders can address technical, ethical, and social challenges more effectively. For example, partnerships between industrial and educational institutions can lead to the development of applications that improve learning for students (Alnagrat et al., 2022), while collaboration with healthcare providers can result in applications that support patients and providers (Ahmad et al., 2023).

An inclusive XR technology should be designed for public utility. XR also has the potential to improve government-citizen connectivity and open space for greater collaboration in public services. For example, virtual and augmented reality are already used to help visualize planning for building projects (Ergun et al., 2019). Virtual reality is also already used for training and could be extended to real-time emergency skill training, such as basic first aid (Yigitbas et al., 2019). Policy can support XR development for public utility by prioritizing funding that serves the public; and by acting as an early adopter of XR technology, governments can steer the trajectory of development toward one that is accessible by design and maximize the positive benefits of a connected, pro-social XR community (Dick, 2021).

An inclusive XR technology must be non-discriminative and aim to provide equal access for developers and users, regardless of background, identity, or circumstances. In short, XR technology must be intersectional to work against matrices of domination in virtual worlds (Crenshaw, 1991). Non-immersive virtual worlds, such as social media, provide both access to wider communities and connection within groups but can also cause friction and conflict between groups (Verduyn et al., 2017). Policies have already been implemented to attempt to reduce online harm (Online Safety, 2023), but online harms are still present across a wide array of online interactions. Discrimination has been reported in existing immersive virtual worlds (Blackwell et al., 2019; Wiederhold, 2022), and as XR continues to grow there is the potential for more harms online.

An essential aspect of inclusive design in XR technologies is ensuring that vulnerable populations, such as children, are adequately protected and can safely navigate these environments. A study by Fiani et al. (2023) explores how children and their guardians perceive the use of a simulated embodied moderating system in social virtual reality. Their findings highlight the importance of designing moderation systems that not only protect children from harm but also respect their autonomy and agency and keep their guardians in the loop. There is a need for XR environments to include child-friendly features and moderation tools that are both effective and aligned with the values and expectations of their guardians.

Future-proofing

The purpose of these principles for XR policy is to help governments and policymakers get ahead of the trend and be proactive rather than reactive. Over the last three decades, each major development in dominant social technology (e.g., the consumer internet, mobile computing, AI) has created new and unique challenges that policy and law were not equipped to govern, owing to the pace of development. Because policy implementation has lagged behind the technology, harm often occurs to certain users before a safety net is in place, or alternatively, users feel that freedoms are being removed as policy belatedly comes into effect. With technologies such as XR and AI, which are orders of magnitude more powerful than mobile computing or the internet, it is critical that policy preempts these failures and legislates early, to help developers and users create XR that is trustworthy, inclusive, and agentic. As it is impossible to know exactly how the technical landscape will progress, these principles must be future-proof. We see three main components that bind and underpin these principles if they are to be future-proof. Each component is overarching and relevant to all the principles documented thus far, and without it, the foundations of each principle would be undermined.

XR must be sustainable. Interestingly, XR technologies have the potential to aid global sustainability by offering an option for remote participation in meetings, events, and other occasions that would otherwise require financially and environmentally costly international travel (Lo et al., 2024), while maintaining the benefits of in-person meetings in a way that current online meetings fail to provide (Nesher Shoshan and Wehrt, 2022). XR can also support sustainability education, allowing school classes to journey around the world without leaving the classroom (Prisille and Ellerbrake, 2020). However, the massive power consumption required to drive pervasive XR also has the potential to harm sustainability. Already, the increased use of AI technology is driving up power and water consumption (Georgiou et al., 2022; Schwartz et al., 2020). XR would likely increase the demand for such resources, due to video rendering and streaming and the likelihood of AI technologies being embedded in XR (Cai et al., 2022). Policy can help ensure that XR is sustainable by encouraging economical algorithms, down-rendering when full fidelity is not necessary, and supporting environmentally friendly infrastructure and renewable energy sources.
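To illustrate the kind of "economical algorithm" such policy might encourage, the sketch below shows a minimal down-rendering heuristic in Python. It is a sketch under stated assumptions: the function name, inputs, and thresholds are hypothetical and do not correspond to any real XR runtime API.

```python
# Hypothetical sketch: reduce render resolution when full fidelity is
# unnecessary (e.g., a near-static view or a low battery) to cut power
# draw. All names and thresholds are illustrative assumptions.

def choose_render_scale(head_motion_deg_s: float,
                        scene_dynamic: bool,
                        battery_fraction: float) -> float:
    """Return a resolution scale factor in (0, 1]."""
    scale = 1.0
    if head_motion_deg_s < 1.0 and not scene_dynamic:
        scale = 0.5              # near-static view: render at half resolution
    if battery_fraction < 0.2:
        scale = min(scale, 0.7)  # conserve energy when battery is low
    return scale
```

A runtime could apply this factor per frame, so a headset resting on a desk or showing a static scene would consume far less GPU power than one in active use.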

XR technology and development must have integrity. If everyday interactions happen through this technology, both remotely and in person, capturing data about bystanders in the process, it is critical that suppliers respect users and protect their data.

For XR policy to meet the demands of the developing technological landscape, it needs to be adaptable and flexible, without being overly generic or lacking specificity. This is an incredibly difficult balance to strike. Focusing on principles-led rather than rules-based regulation enables a level of flexibility that can adapt to changes within the sociotechnical system.

Actionable insights

To address the growing complexity of trust, inclusivity, and agency in XR technologies, we propose a set of actionable insights that span key stakeholder groups, including educators, policymakers, designers, industry leaders, users, and community networks. XR technologies must be understood as embedded within a broader sociotechnical system, where responsibility for their ethical development and deployment is shared. As XR technologies become more widely adopted, educators, innovators, and independent regulatory bodies need to work together to ensure that the technology is equitable. This involves designing for more than just the “average” user. Rather, a multifaceted approach to ingraining inclusive design should combine industry-led, public-led, policy-led, and community-led activism and initiatives. The XRGuild2 is one example of such an initiative, where XR developers (including students and academics) share values and ethical principles and discuss processes towards inclusive XR technologies. XRSI3 (Extended Reality Safety Intelligence) is a “non-profit organization developing a community-led approach establishing standards, requirements and policies for safe and inclusive XR ecosystems”. However, it is still unclear how these guidelines find their way into the research and development of XR technologies. Below, we outline implementation guidelines to realize inclusive, trustworthy, and socially responsible XR technologies.

Protocols and standards for privacy preservation and protection

At the core of trustworthy XR is privacy by design, bolstered by proactive design solutions including privacy-enhancing technologies (PETs) such as differential privacy, homomorphic encryption, and system-level rendering techniques that enable systems to produce 3D immersive environments while processing data locally. Apple Vision Pro’s R1 chip architecture demonstrates viable on-device computation that never transmits eye-tracking data externally. The XR Safety Initiative’s Privacy Framework Version 1.0 (XRSI, 2020) provides regulation-agnostic baseline standards incorporating General Data Protection Regulation (GDPR) and Children’s Online Privacy Protection Act (COPPA) requirements, offering a shared language for developers to transparently categorize and disclose sensitive data. Implementation metrics should track privacy-by-design certification rates, biometric data breach frequencies, and user consent comprehension scores.
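As a concrete illustration of one privacy-enhancing technique mentioned above, the following Python sketch adds Laplace noise to normalized gaze coordinates on-device, a simple form of local differential privacy. The function names, default epsilon, and sensitivity bound are illustrative assumptions, not any vendor’s actual pipeline.

```python
import math
import random

# Illustrative sketch of local differential privacy for gaze data:
# coordinates are perturbed on-device before any off-device use.
# Epsilon, sensitivity, and all names are assumptions for illustration.

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    # max(...) guards against log(0) in the vanishingly rare edge case
    return -scale * sign * math.log(max(1.0 - 2.0 * abs(u), 1e-300))

def privatize_gaze(x: float, y: float, epsilon: float = 1.0,
                   sensitivity: float = 1.0) -> tuple[float, float]:
    """Add Laplace(sensitivity / epsilon) noise to normalized gaze coords."""
    b = sensitivity / epsilon
    return x + laplace_noise(b), y + laplace_noise(b)
```

Smaller epsilon values yield stronger privacy at the cost of accuracy; a very large epsilon leaves the coordinates essentially unchanged.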

One type of regulation that may promote innovative design is focusing on XR’s business models. By regulating the reach or prominence of advertisements in XR, whilst simultaneously offering incentives for business models that promote societal benefits, it may be possible to financially incentivize technology companies to work with people and their best interests, rather than against them. For example, governments could provide tax incentives or credits to companies that develop XR technologies that enable users to personalize their content, or those with specific safety and accessibility features. Alternatively, independent grants or funding could be allocated to the research and development of user priorities and XR safety. Metrics used to measure the success of tax-based incentives could include the number and percentage of companies receiving tax benefits for safe XR technology development, or the total amount of tax benefits utilized per fiscal year. Additional economic metrics could include grants applied for or awarded, and the number of successful innovations that have stemmed from independent safety-oriented grants.

Mandated assessments, audits and disclosure requirements

Building on the XR Association’s 2023–2024 policy framework calling for flexible and accountable governance, mandated rigorous testing and safety assessments must address the unique risks and challenges posed by XR technologies. However, research analysing 11,923 VR app policies found significant disclosure gaps between the practices employed by technology companies and their stated policies (Zhan et al., 2024). Therefore, future implementation should require standardized biometric data disclosure templates with transparency ratings displayed prominently in app stores. Additional disclosure requirements could centre on “reality watermarks” for AR content, distinguishing between platform-generated, advertiser-placed, and user-created virtual objects, or authenticity markers for avatars, indicating human vs. AI control, real-time vs. pre-recorded motion, and enhanced vs. natural expressions (Linxen et al., 2021).
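One way such avatar authenticity markers could be represented is as a small structured disclosure record. The Python sketch below is a hypothetical illustration; the field names and allowed values are assumptions, not drawn from any existing standard.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical avatar authenticity disclosure record; field names and
# value vocabularies are illustrative assumptions, not a real standard.

@dataclass(frozen=True)
class AvatarAuthenticity:
    controller: str     # e.g., "human" or "ai"
    motion_source: str  # e.g., "real_time" or "pre_recorded"
    expressions: str    # e.g., "natural" or "enhanced"

    def to_disclosure(self) -> str:
        """Serialize for display alongside the avatar or in an app store."""
        return json.dumps(asdict(self), sort_keys=True)
```

For instance, `AvatarAuthenticity("ai", "pre_recorded", "enhanced").to_disclosure()` yields a compact JSON string that a client could render as a visible badge next to the avatar.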

Public-private partnerships

To further the evidence base behind ethical XR, collaboration is needed between technology companies, academia, and government agencies to develop best practices, guidelines, and research on safer XR technologies. This could include creating public databases of anonymized longitudinal health data from XR usage (similar to the UK Biobank) to enable independent and accurate research on the longer-term effects of technology use. These partnerships could involve funding initiatives for research into the impacts of XR on mental health and wellbeing. The Future of Privacy Forum’s 2024 XR privacy analysis and XRSI’s multi-stakeholder approach demonstrate viable partnership frameworks. The results from these collaborative initiatives would be used to develop evidence-based ethical guidelines. The success of these partnerships could be measured through the number of research outputs (e.g., papers, patents) resulting from the partnerships, or their impact outcomes.

Education and digital literacy

Inclusive awareness campaigns, fostering digital literacy and incentivizing users to make informed choices, are essential to both ethical XR development and safe XR adoption. Awareness campaigns should reach all users, including children, parents, teachers, adults, and possible bystanders, offering clearly communicated advice on XR risks and personal controls. These campaigns should teach users how they can personalize their experience to serve themselves, for example, by focusing on how to change settings to achieve one’s goals. This would empower citizens to make informed choices about how they can benefit from the technology on their own terms, rather than on the terms of technology companies or other users. Previous initiatives such as Consentful Tech4 have been successful in communicating complex technical recommendations and ideas in concise and educational formats. The success of digital literacy campaigns could be measured through involvement in citizen science and user participation research, adoption rates for consent tools, and brand loyalty metrics.

Industry-led diversity, equity, and inclusion (DEI) initiatives

DEI initiatives should be industry-led, working cooperatively with diverse groups from around the world and bringing them into the creation processes from the design phase to incorporate and reflect their perspectives and needs. Given the power and potential of XR, policy discourse should be democratized to engage broad publics in shaping the future of XR. For example, XRI5 (XR Inclusion) is a democratic, volunteer-based initiative led by a diverse and global group of XR professionals, diversity and inclusion experts, lawyers, human resource professionals, artists, researchers, and other passionate individuals. XRI gathers, organizes and shares key information and resources to support diversity, equity and inclusion in the XR industry. They work to measure the problems that exist, then develop strategies and initiatives to eliminate them.

It has been suggested that there is currently a lack of comprehensive knowledge surrounding accessibility requirements within design teams in industrial settings. We suggest mandated training to promote understanding of how XR technologies can be designed with awareness of accessibility needs from the beginning. Ideally, this training would encourage incorporating accessibility requirements which extend beyond just minimum standards, and in turn lead to the identification of innovations that benefit all users.

Independent governing body

Modeled on the European Union (EU) AI Act’s tiered risk approach, we suggest the introduction of an independent governing body to oversee the implementation and development of XR technologies and to establish a tiered risk classification for XR applications. Importantly, this body should consist of members from a diverse constituency and include both experts and non-experts to ensure that a diverse range of viewpoints is considered in the governance and development of XR. For example, the XR Association6 brings together experts, researchers, developers, and other key stakeholders across fields to collaborate on the future of the XR industry, including responsible development, best practices, and XR’s potential. In addition, they also work to educate partners, governments, researchers, and the public about XR, including efforts to anticipate and mitigate challenges in the responsible development and deployment of XR technology. However, the XR Association lacks regulatory authority; formalization similar to the Food and Drug Administration (FDA) for medical devices, or the Federal Communications Commission (FCC) for telecommunications technologies, is a key next step in XR development.

Collective and community action

In addition to formal educational campaigns, grass-roots activist efforts involving journalists, cultural workers, and artists are needed to raise general awareness of the technology’s potential and benefits and promote good practices. Such community action can establish social norms that define expectations for XR environments. Across Europe, the Realities in Transition (RiT) project brings together a creative and activist XR community through residencies, XR camps, open-source tools, and a collaborative white paper. These efforts nurture inclusive, sustainable, and socially impactful XR creation, expanding access to diverse voices in immersive storytelling. Beyond institutional initiatives, community-anchored projects grounded in place-based engagement, such as the Thamien Ohlone AR Tour, illustrate how XR can be deployed to empower Indigenous and marginalized communities to tell their own stories and assert narrative sovereignty through immersive, site-specific storytelling.

These initiatives help individuals access and use these new technologies safely, whilst supported by knowledgeable others who can assist them. Importantly, by focusing on community-led campaigns, individuals can become empowered to understand and campaign for what is in their best interests. For example, through setting out the appropriate expectations of XR environments, users may come to expect ownership of their data, or that companies will not use manipulative strategies, or that time-based controls are implemented by default. By supporting collective XR experiences, fostering multi-disciplinary collaboration, and promoting normative expectations around data ownership and non-manipulative systems, grassroots efforts can influence how XR is adopted and governed. Importantly, they help ensure that future users are informed and can thus demand inclusive, ethical controls as a prerequisite for widespread XR adoption.

In summary, fostering trust, inclusion, and agency in XR demands a multi-layered strategy: integrating privacy-preserving design and standards (such as the XRSI Privacy Framework), ensuring rigorous safety testing, building inclusive governance structures, enabling regulatory and economic incentives for ethical design, driving public education, and supporting community advocacy. Together, these initiatives form a resilient foundation for XR ecosystems aligned with democratic values, accessibility, and shared wellbeing.

Conclusion

XR technology is developing rapidly but is not yet widely adopted. XR, in combination with AI, has the potential to be enormously disruptive to the current technical status quo. Whilst AI regulation has received a great deal of attention over the past few years, XR regulation has fallen out of the regulatory spotlight. To help policymakers start considering suitable policy for XR, we gathered as a group of experts from academia, research, and industry to generate underlying principles for XR policy. Through group discussion, workshops, and review and reintegration, we generated three primary principles: trustworthy XR, agency, and inclusivity. These principles are held together by the underpinning shared components of ‘future-proofing’: sustainability, integrity, and adaptability. They offer a basis from which policymakers can commence discussion, and, drawing on previous policy implementation frameworks, we suggest several promising actions for future consideration. Exploring the creation of independent governing bodies, standardizing disclosure and privacy requirements, and fostering community collaborative groups may provide a strong framework for responsible, user-driven innovation. Carefully managed public-private partnerships can further support pro-social applications of XR. Looking toward the long term, developing clear protocols and procedures for privacy and protection will help ensure that rapid technological advancements are accounted for, while keeping policy agile enough to foster innovation and safeguard XR for all.

Author contributions

AC: Project administration, Visualization, Methodology, Conceptualization, Writing – review and editing, Writing – original draft. MR: Methodology, Writing – review and editing, Conceptualization, Writing – original draft, Visualization, Project administration. HL: Methodology, Conceptualization, Writing – review and editing, Writing – original draft. MK: Writing – original draft, Writing – review and editing, Methodology, Conceptualization. CK: Conceptualization, Writing – review and editing, Methodology, Writing – original draft. VK: Methodology, Conceptualization, Writing – original draft, Writing – review and editing. LR: Conceptualization, Methodology, Writing – original draft, Writing – review and editing. SP: Writing – original draft, Methodology, Writing – review and editing, Conceptualization. VS: Writing – review and editing, Writing – original draft, Methodology, Conceptualization. XP: Methodology, Writing – original draft, Conceptualization, Writing – review and editing. MP: Writing – review and editing, Writing – original draft, Methodology, Conceptualization. DF: Methodology, Project administration, Writing – review and editing, Funding acquisition, Writing – original draft, Conceptualization.

Funding

The author(s) declare that financial support was received for the research and/or publication of this article. This research was supported by MyWorld, funded by the United Kingdom Research and Innovation, Strength in Places Fund (SIPF00006/1) and EPSRC’s National Research Centre on Privacy, Harm Reduction and Adversarial Influence Online (REPHRAIN), grant number: EP/V011189/1.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

The author(s) declared that they were an editorial board member of Frontiers, at the time of submission. This had no impact on the peer review process and the final decision.

Generative AI statement

The author(s) declare that no Generative AI was used in the creation of this manuscript.

Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Footnotes

1www.xrguild.org

2www.xrguild.org

3www.xrsi.org

4www.consentfultech.io

5www.xrinclusion.org

6www.xra.org

References

Abraham, M., Khamis, M., and McGill, M. (2024). “Don’t record my private pARts: understanding the role of sensitive contexts and privacy perceptions in influencing attitudes towards everyday augmented reality sensor usage,” in 2024 IEEE international symposium on mixed and augmented reality (ISMAR) (IEEE), 749–758. doi:10.1109/ISMAR62088.2024.00090

Ahdanisa, D. S., and Rothman, S. B. (2021). Revisiting international human rights treaties: comparing Asian and Western efforts to improve human rights. SN Soc. Sci. 1 (1), 16–41. doi:10.1007/s43545-020-00018-0

Ahmad, H. F., Rafique, W., Rasool, R. U., Alhumam, A., Anwar, Z., and Qadir, J. (2023). Leveraging 6G, extended reality, and IoT big data analytics for healthcare: a review. Comput. Sci. Rev. 48. Elsevier Ireland Ltd. doi:10.1016/j.cosrev.2023.100558

Ahmed, T., Kapadia, A., Potluri, V., and Swaminathan, M. (2018). Up to a limit? Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2 (3), 1–27. doi:10.1145/3264899

Alnagrat, A., Che Ismail, R., Syed Idrus, S. Z., and Abdulhafith Alfaqi, R. M. (2022). A review of extended reality (XR) technologies in the future of human education: current trend and future opportunity. J. Hum. Centered Technol. 1 (2), 81–96. doi:10.11113/humentech.v1n2.27

Baughman, J. (2022). Enter the battleverse: China’s metaverse war. Mil. Cyber Aff. 5 (1). Available online at: https://digitalcommons.usf.edu/mca/vol5/iss1/2.

Blackwell, L., Ellison, N., Elliott-Deflo, N., and Schwartz, R. (2019). “Harassment in social VR: implications for design,” in 26th IEEE conference on virtual reality and 3D user interfaces, VR 2019 - proceedings, 854–855. doi:10.1109/VR.2019.8798165

Botelho, F. H. F. (2021). Accessibility to digital technology: virtual barriers, real opportunities. Assist. Technol. 33 (Suppl. 1), 27–34. doi:10.1080/10400435.2021.1945705

Brulé, E., Metatla, O., Spiel, K., Kharrufa, A., and Robinson, C. (2019). “Evaluating technologies with and for disabled children,” in Extended abstracts of the 2019 CHI conference on human factors in computing systems (New York, NY: Association for Computing Machinery), 1–6. doi:10.1145/3290607.3311757

Cai, Y., Llorca, J., Tulino, A. M., and Molisch, A. F. (2022). Compute- and data-intensive networks: the key to the metaverse. 2022 1st Int. Conf. 6G Netw. 6GNet 2022, 1–8. doi:10.1109/6GNet54646.2022.9830429

Carter, M., and Egliston, B. (2020). Ethical implications of emerging mixed reality technologies. doi:10.25910/5EE2F9608EC4D

Chalmers, D. J. (2022). Reality+: virtual worlds and the problems of philosophy. Penguin UK.

Charamba, K. (2022). Beyond the corporate responsibility to respect human rights in the dawn of a metaverse, 30. Miami, FL: University of Miami School of Law Institutional Repository. Available online at: https://repository.law.miami.edu/umiclr/vol30/iss1/5/.

Cobansoy Hizel, G. (2023). “Metaverse and human rights: do we need metaversal declaration of human rights?” in Stud. Big Data 133 (Springer Science and Business Media Deutschland GmbH), 219–229. doi:10.1007/978-981-99-4641-9_15

Colburn, B., Macpherson, F., Brown, D., Fearnley, L., Hodgson, C., and McDonnell, N. (2024). Policy and practice recommendations for augmented and mixed reality. doi:10.36399/gla.pubs.326686

Corbett, M., David-John, B., Shang, J., Hu, Y. C., and Ji, B. (2024). Securing bystander privacy in mixed reality while protecting the user experience. IEEE Secur. Priv. 22 (1), 33–42. doi:10.1109/MSEC.2023.3331649

Cork, A., Smith, L. G. E., Ellis, D., Stanton Fraser, D., and Joinson, A. (2025). Rethinking online harm: a psychological model of contextual vulnerability. doi:10.31234/OSF.IO/Z7RE2

Countries OHCHR (2025). Available online at: https://www.ohchr.org/en/countries.

Crenshaw, K. (1991). Mapping the margins: intersectionality, identity politics, and violence against women of color. Stanf. Law Rev. 43 (6), 1241–1299. doi:10.2307/1229039

Dick, E. (2021). Principles and policies to unlock the potential of AR/VR for equity and inclusion. Washington, DC: Information Technology and Innovation Foundation. Available online at: https://itif.org/publications/2021/06/01/principles-and-policies-unlock-potential-arvr-equity-and-inclusion/.

Ergun, O., Akln, S., Dino, I. G., and Surer, E. (2019). “Architectural design in virtual reality and mixed reality environments: a comparative analysis,” in 26th IEEE conference on virtual reality and 3D user interfaces, VR 2019 - proceedings, 914–915. doi:10.1109/VR.2019.8798180

Farmer, H., and Maister, L. (2017). Putting ourselves in another’s skin: using the plasticity of self-perception to enhance empathy and decrease prejudice. Soc. Justice Res. 30 (4), 323–354. doi:10.1007/s11211-017-0294-1

Fertleman, C., Aubugeau-Williams, P., Sher, C., Lim, A. N., Lumley, S., Delacroix, S., et al. (2018). A discussion of virtual reality as a new tool for training healthcare professionals. Front. Public Health 6 (FEB), 44. doi:10.3389/fpubh.2018.00044

Fiani, C., Bretin, R., Mcgill, M., and Khamis, M. (2023). “Big buddy: exploring child reactions and parental perceptions towards a simulated embodied moderating System for social virtual reality,” in Proceedings of the 22nd annual ACM interaction design and children conference, 1–13. doi:10.1145/3585088.3589374

Fiani, C., Bretin, R., MacDonald, S., Khamis, M., and McGill, M. (2024). ““Pikachu would electrocute people who are misbehaving”: expert, guardian and child perspectives on automated embodied moderators for safeguarding children in social virtual reality,” in Conference on human factors in computing Systems - proceedings. doi:10.1145/3613904.3642144

Fox, D., and Thornton, I. G. (2022). White paper - the IEEE global initiative on ethics of extended reality (XR) Report--Extended reality (XR) ethics and diversity, inclusion, and accessibility. Available online at: https://ieeexplore.ieee.org/document/9727122.

Freeman, G., and Maloney, D. (2021). Body, avatar, and me: the presentation and perception of self in social virtual reality. Proc. ACM Human-Computer Interact. 4 (CSCW3), 1–27. doi:10.1145/3432938

Freeman, D., Reeve, S., Robinson, A., Ehlers, A., Clark, D., Spanlang, B., et al. (2017). “Virtual reality in the assessment, understanding, and treatment of mental health disorders,” Psychol. Med., 47. Cambridge University Press, 2393–2400. doi:10.1017/S003329171700040X

Georgiou, S., Kechagia, M., Sharma, T., Sarro, F., and Zou, Y. (2022). “Green AI: do deep learning frameworks have different costs?,” in Proceedings of the 44th international conference on software engineering, 1082–1094. doi:10.1145/3510003.3510221

Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., and Toombs, A. L. (2018). “The dark (patterns) side of UX design,” in Proceedings of the 2018 CHI conference on human factors in computing systems, 1–14. doi:10.1145/3173574.3174108

Han, D.-I. D., Bergs, Y., and Moorhouse, N. (2022). Virtual reality consumer experience escapes: preparing for the metaverse. Virtual Real. 26 (4), 1443–1458. doi:10.1007/s10055-022-00641-7

Harborth, D., and Pape, S. (2021). Investigating privacy concerns related to mobile augmented reality Apps – a vignette based online experiment. Comput. Hum. Behav. 122, 106833. doi:10.1016/j.chb.2021.106833

Henrich, J., Heine, S., and Norenzayan, A. (2010). “Most people are not WEIRD,” Nature, 466. Nature Publishing Group, 29. doi:10.1038/466029a

Huang, N., Goswami, P., Sundstedt, V., Hu, Y., and Cheddad, A. (2025). Personalized smart immersive XR environments: a systematic literature review. Vis. Comput. 41, 8593–8626. doi:10.1007/s00371-025-03887-9

Kang, J., Baek, G. W., Lee, J. Y., Kwak, J., and Park, J. H. (2024). Advances in display technology: augmented reality, virtual reality, quantum dot-based light-emitting diodes, and organic light-emitting diodes. J. Inf. Disp. 25 (3), 219–234. doi:10.1080/15980316.2024.2350437

Krauss, V., Saeghe, P., Boden, A., Khamis, M., McGill, M., Gugenheimer, J., et al. (2024). What makes XR dark? Examining emerging dark patterns in augmented and virtual reality through expert Co-Design. ACM Trans. Computer-Human Interact. 31 (3), 1–39. doi:10.1145/3660340

Kroeger, F. (2020). “What is trust in technology? Conceptual bases, common pitfalls and the contribution of trust research,” in Trust and Technology Initiative. Available online at: https://www.trusttech.cam.ac.uk/perspectives/technology-humanity-society-democracy/what-trust-technology-conceptual-bases-common.

Lavoie, R., Main, K., King, C., and King, D. (2021). Virtual experience, real consequences: the potential negative emotional consequences of virtual reality gameplay. Virtual Real. 25 (1), 69–81. doi:10.1007/s10055-020-00440-y

Lee, H., Woo, D., and Yu, S. (2022). Virtual reality metaverse system supplementing remote education methods: based on aircraft maintenance simulation. Appl. Sci. Switz. 12 (5), 2667. doi:10.3390/app12052667

Linxen, S., Sturm, C., Brühlmann, F., Cassau, V., Opwis, K., and Reinecke, K. (2021). How WEIRD is CHI? Proc. 2021 CHI Conf. Hum. Factors Comput. Syst., 1–14. doi:10.1145/3411764.3445488

Lo, T. T. S., Chen, Y., Lai, T. Y., and Goodman, A. (2024). Phygital workspace: a systematic review in developing a new typological work environment using XR technology to reduce the carbon footprint. Front. Built Environ. 10, 1370423. doi:10.3389/fbuil.2024.1370423

Makamara, G., and Adolph, M. (2022). “A survey of extended reality (XR) standards,” in ITU kaleidoscope: extended reality – how to boost quality of experience and interoperability. IEEE, 1–11. doi:10.23919/ITUK56368.2022.10003040

Manninen, T., and Kujanpaa, T. (2007). The value of virtual assets - the role of game characters in MMOGs. Int. J. Bus. Sci. Appl. Manag. 2 (1), 21–33. doi:10.69864/ijbsam.2-1.8

McGill, M. (2021). The IEEE global initiative on ethics of extended reality (XR) Report--Extended reality (XR) and the erosion of anonymity and privacy. Available online at: https://ieeexplore.ieee.org/abstract/document/9619999.

Merriam-Webster (2025). “Trust,” in Merriam-webster.com dictionary. Available online at: https://www.merriam-webster.com/dictionary/trust (Accessed August 17, 2025).

Milgram, P., and Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst. E77-D (12), 1321–1329.

Miller, M. R., Herrera, F., Jun, H., Landay, J. A., and Bailenson, J. N. (2020). Personal identifiability of user tracking data during observation of 360-degree VR video. Sci. Rep. 10 (1), 17404–17410. doi:10.1038/s41598-020-74486-y

Mott, M., Cutrell, E., Gonzalez Franco, M., Holz, C., Ofek, E., Stoakley, R., et al. (2019). “Accessible by design: an opportunity for virtual reality,” in 2019 IEEE international symposium on mixed and augmented reality adjunct (ISMAR-adjunct). IEEE, 451–454.

Navarro-Haro, M. V., López-del-Hoyo, Y., Campos, D., Linehan, M. M., Hoffman, H. G., García-Palacios, A., et al. (2017). Meditation experts try virtual reality mindfulness: a pilot study evaluation of the feasibility and acceptability of virtual reality to facilitate mindfulness practice in people attending a mindfulness conference. PLOS ONE 12 (11), e0187777. doi:10.1371/journal.pone.0187777

Nesher Shoshan, H., and Wehrt, W. (2022). Understanding “zoom fatigue”: a mixed-method approach. Appl. Psychol. 71 (3), 827–852. doi:10.1111/apps.12360

Noah, N., and Das, S. (2021). Exploring evolution of augmented and virtual reality education space in 2020 through systematic literature review. Comput. Animat. Virtual Worlds 32 (3–4), e2020. doi:10.1002/cav.2020

Online Safety Act (2023). Available online at: https://www.legislation.gov.uk/ukpga/2023/50/enacted.

O’Hagan, J., Saeghe, P., Gugenheimer, J., Medeiros, D., Marky, K., Khamis, M., et al. (2022). Privacy-enhancing technology and everyday augmented reality. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 6 (4), 1–35. doi:10.1145/3569501

O’Hagan, J., Gugenheimer, J., Bonner, J., Mathis, F., and McGill, M. (2023). “Augmenting people, places and media: the societal harms posed by everyday augmented reality, and the case for perceptual human rights,” in Proceedings of the 22nd international conference on Mobile and ubiquitous multimedia, 225–235. doi:10.1145/3626705.3627782

Pahi, S., and Schroeder, C. (2023). Extended privacy for extended reality: XR technology has 99 problems and privacy is several of them. Notre Dame J. Emerg. Tech. 4, 1. doi:10.2139/ssrn.4202913

Papaioannou, T., Voinescu, A., Petrini, K., and Stanton Fraser, D. (2022). Efficacy and moderators of virtual reality for cognitive training in people with dementia and mild cognitive impairment: a systematic review and meta-analysis. J. Alzheimer’s Dis. 88, 1341–1370. doi:10.3233/JAD-210672

Peck, T. C., Sockol, L. E., and Hancock, S. M. (2020). Mind the gap: the underrepresentation of female participants and authors in virtual reality research. IEEE Trans. Vis. Comput. Graph. 26 (5), 1945–1954. doi:10.1109/TVCG.2020.2973498

Peck, T. C., McMullen, K. A., Quarles, J., Johnsen, K., Sandor, C., and Billinghurst, M. (2021). DiVRsify: break the cycle and develop VR for everyone. IEEE Comput. Graph. Appl. 41 (6), 133–142. doi:10.1109/MCG.2021.3113455

Petersen, I. T., Apfelbaum, K. S., and McMurray, B. (2024). Adapting open science and pre-registration to longitudinal research. Infant child Dev. 33 (1), e2315. doi:10.1002/icd.2315

Plopski, A., Hirzle, T., Norouzi, N., Qian, L., Bruder, G., and Langlotz, T. (2022). The eye in extended reality: a survey on gaze interaction and eye tracking in head-worn extended reality. ACM Comput. Surv. (CSUR) 55 (3), 1–39. doi:10.1145/3491207

Prisille, C., and Ellerbrake, M. (2020). Virtual reality (VR) and geography education: potentials of 360° ‘experiences’ in secondary schools. Wiesbaden: Springer VS, 321–332. doi:10.1007/978-3-658-30956-5_18

Ragan, E. D., Bowman, D. A., Kopper, R., Stinson, C., Scerbo, S., and McMahan, R. P. (2015). Effects of field of view and visual complexity on virtual reality training effectiveness for a visual scanning task. IEEE Trans. Vis. Comput. Graph. 21 (7), 794–807. doi:10.1109/TVCG.2015.2403312

Richardson, M., Cork, A. G., Stanton Fraser, D., Proulx, M. J., Pan, X., Krauß, V., et al. (2024). “Shaping the future: developing principles for policy recommendations for responsible innovation in virtual worlds,” in Extended abstracts of the CHI conference on human factors in computing systems (New York, NY: Association for Computing Machinery), 488, 1–6. doi:10.1145/3613905.3636306

Robertson, D. J., Malin, J., Martin, S., Butler, S. H., John, B., Graff, M., et al. (2023). Social media use: attitudes, “detox”, and craving in typical and frequent users. Technol. Mind, Behav. doi:10.31234/OSF.IO/29ZGF

Schwartz, R., Dodge, J., Smith, N. A., and Etzioni, O. (2020). Green AI. Commun. ACM 63 (12), 54–63. doi:10.1145/3381831

Seaborn, K., Barbareschi, G., and Chandra, S. (2023). “Not only WEIRD but “uncanny”? A systematic review of diversity in human–robot interaction research,” Int. J. Soc. Robotics, 15. Springer Science and Business Media B.V., 1841–1870. doi:10.1007/s12369-023-00968-4

Slater, M., Gonzalez-Liencres, C., Haggard, P., Vinkers, C., Gregory-Clarke, R., Jelley, S., et al. (2020). The ethics of realism in virtual and augmented reality. Front. Virtual Real. 1, 1. doi:10.3389/frvir.2020.00001

Stilgoe, J., Owen, R., and Macnaghten, P. (2013). Developing a framework for responsible innovation. Res. Policy 42 (9), 1568–1580. doi:10.1016/j.respol.2013.05.008

Syed, T. A., Siddiqui, M. S., Abdullah, H. B., Jan, S., Namoun, A., Alzahrani, A., et al. (2022). In-depth review of augmented reality: tracking technologies, development tools, AR displays, collaborative AR, and security concerns. Sensors 23 (1), 146. doi:10.3390/s23010146

Tricomi, P. P., Nenna, F., Pajola, L., Conti, M., and Gamberini, L. (2023). You can’t hide behind your headset: user profiling in augmented and virtual reality. IEEE Access 11, 9859–9875. doi:10.1109/ACCESS.2023.3240071

Tseng, W. J., Bonnail, E., McGill, M., Khamis, M., Lecolinet, E., Huron, S., et al. (2022). “The dark side of perceptual manipulations in virtual reality,” in Conference on human factors in computing Systems - proceedings. doi:10.1145/3491102.3517728

Verduyn, P., Ybarra, O., Résibois, M., Jonides, J., and Kross, E. (2017). Do social network sites enhance or undermine subjective Well-Being? A critical review. Soc. Issues Policy Rev. 11 (1), 274–302. doi:10.1111/sipr.12033

Wang, G., Badal, A., Jia, X., Maltz, J. S., Mueller, K., Myers, K. J., et al. (2022). Development of metaverse for intelligent healthcare. Nat. Mach. Intell. 4 (11), 922–929. doi:10.1038/s42256-022-00549-6

Wiederhold, B. K. (2022). Sexual harassment in the metaverse. Cyberpsychology, Behav. Soc. Netw. 25 (8), 479–480. doi:10.1089/cyber.2022.29253.editorial

Wilson, E., Ibragimov, A., Proulx, M. J., Tetali, S. D., Butler, K., and Jain, E. (2024). Privacy-preserving gaze data streaming in immersive interactive virtual reality: robustness and user experience. IEEE Trans. Vis. Comput. Graph. 30, 2257–2268. doi:10.1109/TVCG.2024.3372032

XRSI (2020). The XRSI privacy and safety framework. Available online at: https://xrsi.org/publication/the-xrsi-privacy-framework.

Yang, L. (2023). Recommendations for metaverse governance based on technical standards. Humanit. Soc. Sci. Commun. 10 (1), 253–10. doi:10.1057/s41599-023-01750-7

Yee, N., and Bailenson, J. (2007). The proteus effect: the effect of transformed self-representation on behavior. Hum. Commun. Res. 33 (3), 271–290. doi:10.1111/j.1468-2958.2007.00299.x

Yigitbas, E., Heindörfer, J., and Engels, G. (2019). A context-aware virtual reality first aid training application. ACM Int. Conf. Proceeding Ser., 885–888. doi:10.1145/3340764.3349525

Zhan, Y., Meng, Y., Zhou, L., Xiong, Y., Zhang, X., Ma, L., et al. (2024). “VPVet: vetting privacy policies of virtual reality apps,” in Proceedings of the 2024 on ACM SIGSAC conference on computer and communications security, 1746–1760.

Zhao, Y., Cutrell, E., Holz, C., Morris, M. R., Ofek, E., and Wilson, A. D. (2019). SeeingVR: a set of tools to make virtual reality more accessible to people with low vision. Conf. Hum. Factors Comput. Syst. - Proc., 1–14. doi:10.1145/3290605.3300341

Keywords: extended reality, principles for policy, policy recommendations, immersive technology, virtual reality

Citation: Cork AG, Richardson M, Lukosch HK, Khamis M, Katins C, Krauß V, Ruffin L, Papazoglakis S, Sanchez V, Pan X, Proulx MJ and Stanton Fraser D (2025) Shaping the future: principles for policy recommendations for responsible innovation in virtual worlds. Front. Virtual Real. 6:1645330. doi: 10.3389/frvir.2025.1645330

Received: 11 June 2025; Accepted: 05 September 2025;
Published: 06 October 2025.

Edited by:

Jonathan Giron, Reichman University, Israel

Reviewed by:

Weiya Chen, Huazhong University of Science and Technology, China
Jason Robert Rameshwar, The University of the West Indies St. Augustine, Trinidad and Tobago

Copyright © 2025 Cork, Richardson, Lukosch, Khamis, Katins, Krauß, Ruffin, Papazoglakis, Sanchez, Pan, Proulx and Stanton Fraser. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Mike Richardson, m.richardson@bath.ac.uk

These authors have contributed equally to this work and share first authorship

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.