
OPINION article

Front. Psychol., 02 November 2023
Sec. Emotion Science
This article is part of the Research Topic Exploring the Emotional Landscape: Cutting-Edge Technologies for Emotion Assessment and Elicitation

Apple Vision Pro: a new horizon in psychological research and therapy

Zhihui Zhang*, Lluis Giménez Mateu and Josep M. Fort
  • Escola Tècnica Superior d'Arquitectura de Barcelona, Universitat Politècnica de Catalunya, Barcelona, Spain

Virtual reality (VR) harbors immense potential for advancing psychological therapy and emotional research, despite presenting several challenges (Riva, 2022). The release of the Apple Vision Pro has opened new opportunities in psychological emotion research, particularly through its facial expression system, which enables facial emotion recognition within a virtual environment. In this opinion, we outline the new perspectives the Apple Vision Pro brings to psychological research and therapy by examining its multi-sensor technology, high-resolution displays, and remote meeting capabilities.

Multi-sensor advancements for VR facial emotion recognition

Measuring emotions in virtual reality (VR) has mainly relied on electroencephalogram (EEG) devices (Suhaimi et al., 2020; Pinilla et al., 2021). However, the conflict between the mobility inherent in VR experiences and the stillness required for reliable EEG measurements casts doubt on the viability of EEG-based emotion measurement in VR. These limitations have prompted the search for alternative, more reliable methods of emotion recognition. One such alternative emerged with the introduction of facial and eye-based recognition technologies, which promise a less obtrusive and more natural means of gauging emotional responses than EEG. A pivotal moment arrived with the debut of the Fove 0 in 2017, which pioneered the incorporation of eye tracking in VR and marked the onset of a new era in VR emotion recognition. Eye tracking provided a fresh perspective and an additional layer of data for understanding users' emotional responses within virtual environments. Since then, a plethora of devices have emerged, each with distinct features, capabilities, and limitations for research applications, as detailed in Table 1.

Table 1. Comparison of technical specifications and tracking capabilities of virtual reality devices.
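To make the role of eye tracking concrete, the following minimal Python sketch shows one common way an arousal proxy can be derived from pupil-diameter recordings of the kind eye-tracking headsets can log. The field names, baseline window, and percent-change formulation are illustrative assumptions, not a procedure described in this article.

```python
# Illustrative sketch: a simple pupil-based arousal proxy.
# Field names and the baseline-correction scheme are assumptions for demonstration.
from dataclasses import dataclass
from statistics import mean

@dataclass
class GazeSample:
    timestamp: float          # seconds since recording start
    pupil_diameter_mm: float  # averaged across both eyes
    fixation_id: int          # -1 while in a saccade

def arousal_index(samples: list[GazeSample], baseline_s: float = 5.0) -> float:
    """Percent change in mean pupil diameter relative to a resting baseline.

    Larger values are often read as higher arousal, although pupil size is
    also driven by luminance and must be controlled for in real studies.
    """
    baseline = [s.pupil_diameter_mm for s in samples if s.timestamp < baseline_s]
    task = [s.pupil_diameter_mm for s in samples if s.timestamp >= baseline_s]
    if not baseline or not task:
        raise ValueError("need samples both inside and after the baseline window")
    return 100.0 * (mean(task) - mean(baseline)) / mean(baseline)

# Example: a resting baseline, then a stimulus period with larger pupils.
samples = [GazeSample(t * 0.5, 3.0, 0) for t in range(10)] + \
          [GazeSample(5.0 + t * 0.5, 3.6, 1) for t in range(10)]
print(round(arousal_index(samples), 1))  # -> 20.0
```

In practice, pupil-based measures of this kind must also be corrected for luminance and gaze angle before any affective interpretation is attempted.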

However, reliance on eye tracking alone has shown intrinsic limitations, particularly in capturing the full spectrum of emotional and psychological reactions (Levitan et al., 2022). The intersection of facial recognition technology and VR is therefore gaining traction, aiming to provide a more nuanced understanding of emotional responses within virtual realms. Devices like the Meta Quest Pro have attempted to integrate facial recognition technologies but faced technical and applicative constraints, limiting their utility in comprehensive emotion recognition research. In contrast, the 2023 launch of the Apple Vision Pro represents a significant leap forward by combining high-fidelity sensor technology with facial recognition, unlocking new potential for emotion recognition in VR. Uniquely, the Apple Vision Pro utilizes individualized facial scans of each user, as opposed to the generalized facial simulation technology employed by other notable devices such as those from Meta and Vivo (Zhang et al., 2023). With an intricate array of 12 cameras and 5 sensors, the Apple Vision Pro enables nuanced tracking of facial expressions, providing a detailed framework for emotion detection. This robust sensor framework ensures accurate and personalized facial recognition and expression transmission, marking a substantial advancement in VR facial emotion recognition.
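As a rough illustration of how such expression data can feed emotion detection, the sketch below maps per-expression blend-shape weights (named here following the ARKit convention used on other Apple devices) onto coarse emotion scores with naive hand-written rules. It is an assumed, simplified pipeline for exposition, not Apple's algorithm or a validated classifier.

```python
# Illustrative sketch: rule-based mapping from blend-shape weights (0-1) to
# coarse emotion scores. The rules and weight names are illustrative only.
EMOTION_RULES = {
    "joy":      ["mouthSmileLeft", "mouthSmileRight", "cheekSquintLeft", "cheekSquintRight"],
    "surprise": ["browInnerUp", "eyeWideLeft", "eyeWideRight", "jawOpen"],
    "anger":    ["browDownLeft", "browDownRight", "noseSneerLeft", "noseSneerRight"],
    "sadness":  ["mouthFrownLeft", "mouthFrownRight", "browInnerUp"],
}

def emotion_scores(blend_shapes: dict[str, float]) -> dict[str, float]:
    """Average the weights of the blend shapes associated with each coarse emotion."""
    return {
        emotion: sum(blend_shapes.get(k, 0.0) for k in keys) / len(keys)
        for emotion, keys in EMOTION_RULES.items()
    }

# Example frame: a smile with slightly raised cheeks.
frame = {"mouthSmileLeft": 0.8, "mouthSmileRight": 0.75, "cheekSquintLeft": 0.4}
scores = emotion_scores(frame)
print(max(scores, key=scores.get))  # -> joy
```

A research pipeline would replace these hand-written rules with a model trained and validated against ground-truth affect ratings, but the input, per-expression weights sampled frame by frame, remains the same.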

Enhanced realism for experimental and therapeutic applications

Traditional psychological experiments often grapple with the challenge of ecological validity due to artificial laboratory settings. The Apple Vision Pro, with a resolution of 3680 × 3140 pixels, offers remarkable scene simulation capabilities, enabling more realistic and immersive environments for both experimental and therapeutic applications. This heightened realism, facilitated by the device's high-definition rendering, paves the way for more authentic and valid research outcomes. For instance, in exposure therapy, therapists can harness the Apple Vision Pro to simulate real-life scenarios in a controlled yet authentic setting, helping patients gradually confront and overcome their fears (Carlin et al., 1997; Krijn et al., 2004; Barrett et al., 2023). The enhanced resolution and field of view not only deepen users' immersion in highly detailed and broad virtual environments but also improve the ecological validity of psychological and emotional research within VR, heralding a promising trajectory for future innovations and applications in VR-based emotional and behavioral studies.
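Graded exposure of this kind is typically driven by a simple control loop over an anxiety hierarchy. The sketch below shows one hypothetical way such a loop could be scripted; the scene names, the use of subjective units of distress (SUDS), and the habituation threshold are placeholders for illustration, not a protocol taken from the cited studies.

```python
# Illustrative sketch: a minimal controller for graded VR exposure.
# Scene names and thresholds are hypothetical placeholders.
HIERARCHY = [
    "spider_in_sealed_jar_far",
    "spider_in_sealed_jar_near",
    "spider_on_table",
    "spider_walking_toward_hand",
]

def next_scene(current_index: int, suds: int, habituation_threshold: int = 30) -> int:
    """Advance to the next exposure level only once self-reported distress
    (SUDS, 0-100) falls below the habituation threshold; otherwise repeat."""
    if suds < habituation_threshold and current_index < len(HIERARCHY) - 1:
        return current_index + 1
    return current_index

# Example: after the second scene the patient reports SUDS of 25, so we advance.
index = next_scene(current_index=1, suds=25)
print(HIERARCHY[index])  # -> spider_on_table
```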

Potential of 3D rendering and facial expressions in telepsychotherapy

Telepsychotherapy has brought convenience to those unable to undergo face-to-face therapy. However, traditional methods such as video calls fall short in terms of realism and immersive experience (Kocsis and Yellowlees, 2018; Poletti et al., 2020; Rosen et al., 2020). The Apple Vision Pro has the potential to transform this domain. Its capability to render 3D scenes and characters, coupled with the realistic portrayal of facial expressions, offers immense potential for remote psychotherapy. Such detailed representation allows for a genuine, immersive therapeutic environment, providing a tangible interactive space for both therapists and patients. The realistic depiction of facial expressions, critical for psychotherapy, enables therapists to gauge patients' emotions and responses more accurately, thus potentially enhancing the effectiveness of remote therapeutic interventions (Wiederhold and Wiederhold, 2009).
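To indicate what transmitting such expressions might involve technically, the sketch below defines a hypothetical per-frame payload that a telepsychotherapy session could stream so the remote party's avatar can reproduce head pose and facial expression. The field names and the JSON-over-data-channel choice are assumptions for illustration, not a published protocol.

```python
# Illustrative sketch: a lightweight per-frame payload for streaming expression
# data in a remote therapy session. All field names are hypothetical.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ExpressionFrame:
    session_id: str
    sender: str                             # "therapist" or "patient"
    timestamp: float                        # seconds since the Unix epoch
    head_pose: tuple[float, float, float]   # yaw, pitch, roll in degrees
    blend_shapes: dict[str, float]          # expression weights in [0, 1]

def encode(frame: ExpressionFrame) -> bytes:
    """Serialize one frame to JSON for transport, e.g. over a WebRTC data channel."""
    return json.dumps(asdict(frame)).encode("utf-8")

frame = ExpressionFrame(
    session_id="demo-001",
    sender="patient",
    timestamp=time.time(),
    head_pose=(2.0, -5.5, 0.3),
    blend_shapes={"mouthSmileLeft": 0.6, "browInnerUp": 0.2},
)
print(len(encode(frame)), "bytes per frame")
```

Even a naive payload like this stays small enough to send at display frame rates, which is what makes expression streaming plausible alongside audio and 3D scene state.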

Confronting ethical and operational challenges

Although Apple's products are known for their maturity and stability, the Apple Vision Pro's high performance comes with a hefty price tag that puts it out of reach for many users and research groups. Additionally, the use of individualized facial scans presents two challenges: greater operational difficulty and longer setup times when conducting research experiments, and heightened privacy concerns compared with generic virtual avatars. Addressing these issues while capitalizing on the technological advances in VR emotion recognition remains a critical task for the field.

Conclusion

The Apple Vision Pro serves as a beacon of advancement in psychological research and therapy, integrating high-fidelity sensor technology and realistic 3D rendering. Looking ahead, further technological enhancements of Apple Vision Pro might include more precise emotion recognition algorithms and real-time data analytics, offering refined tools for understanding human emotions and behavior in virtual environments. The potential integration of these advancements could bolster telepsychotherapy and exposure therapy, among other therapeutic applications, providing more personalized and effective treatment solutions.

The technological prowess of Apple Vision Pro not only addresses current challenges in the field but also lays a foundation for future exploration. As the realms of technology and psychological sciences continue to merge, the potential for new, innovative methodologies in therapy and research expands. The Apple Vision Pro represents a significant stride toward a tech-driven era in psychological research and therapy, with its potential to unveil deeper insights into the human psyche and contribute to the betterment of mental health services worldwide.

Author contributions

ZZ: Conceptualization, Writing—original draft, Writing—review & editing. JF: Methodology, Supervision, Writing—review & editing. LG: Methodology, Supervision, Writing—review & editing.

Funding

The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Barrett, A., Pack, A., Monteiro, D., and Liang, H. N. (2023). Exploring the influence of audience familiarity on speaker anxiety and performance in virtual reality and real-life presentation contexts. Behav. Inf. Technol. 1–13. doi: 10.1080/0144929X.2023.2186145

Carlin, A. S., Hoffman, H. G., and Weghorst, S. (1997). Virtual reality and tactile augmentation in the treatment of spider phobia: a case report. Behav. Res. Ther. 35, 153–158. doi: 10.1016/S0005-7967(96)00085-X

Kocsis, B. J., and Yellowlees, P. (2018). Telepsychotherapy and the therapeutic relationship: principles, advantages, and case examples. Telemed. J. E. Health 24, 329–334. doi: 10.1089/tmj.2017.0088

Krijn, M., Emmelkamp, P. M., Olafsson, R. P., and Biemond, R. (2004). Virtual reality exposure therapy of anxiety disorders: a review. Clin. Psychol. Rev. 24, 259–281. doi: 10.1016/j.cpr.2004.04.001

Levitan, C. A., Rusk, I., Jonas-Delson, D., Lou, H., Kuzniar, L., Davidson, G., et al. (2022). Mask wearing affects emotion perception. i-Perception 13, 91. doi: 10.1177/20416695221107391

Pinilla, A., Garcia, J., Raffe, W., Voigt-Antons, J. N., Spang, R. P., and Möller, S. (2021). Affective visualization in virtual reality: an integrative review. Front. Virtual Real. 2, 630731. doi: 10.3389/frvir.2021.630731

Poletti, B., Tagini, S., Brugnera, A., Parolin, L., Pievani, L., Ferrucci, R., et al. (2020). Telepsychotherapy: a leaflet for psychotherapists in the age of COVID-19. A review of the evidence. Couns. Psychol. Q. 34, 345–347. doi: 10.1080/09515070.2020.1769557

Riva, G. (2022). Virtual reality in clinical psychology. Compr. Clin. Psychol. 91, 6. doi: 10.1016/B978-0-12-818697-8.00006-6

Rosen, C. S., Glassman, L. H., and Morland, L. A. (2020). Telepsychotherapy during a pandemic: a traumatic stress perspective. J. Psychother. Integr. 30, 174–187. doi: 10.1037/int0000221

Suhaimi, N. S., Mountstephens, J., and Teo, J. (2020). EEG-based emotion recognition: a state-of-the-art review of current trends and opportunities. Comput. Intell. Neurosci. 2020, 8875426. doi: 10.1155/2020/8875426

Wiederhold, B. K., and Wiederhold, M. D. (2009). A review of virtual reality as a psychotherapeutic tool. CyberPsychol. Behav. 1, 45–52. doi: 10.1089/cpb.1998.1.45

Zhang, Z., Fort, J. M., and Giménez Mateu, L. (2023). Facial expression recognition in virtual reality environments: challenges and opportunities. Front. Psychol. 14, 1280136. doi: 10.3389/fpsyg.2023.1280136

Keywords: Apple Vision Pro, virtual reality, telepsychotherapy, emotion recognition, psychological experiments

Citation: Zhang Z, Giménez Mateu L and Fort JM (2023) Apple Vision Pro: a new horizon in psychological research and therapy. Front. Psychol. 14:1280213. doi: 10.3389/fpsyg.2023.1280213

Received: 19 August 2023; Accepted: 17 October 2023;
Published: 02 November 2023.

Edited by:

Ivonne Castiblanco Jimenez, Polytechnic University of Turin, Italy

Reviewed by:

Elena Carlotta Olivetti, Polytechnic University of Turin, Italy

Copyright © 2023 Zhang, Giménez Mateu and Fort. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Zhihui Zhang, zhihui.zhang@upc.edu
