Sec. Data Visualization
Volume 2 - 2022 | https://doi.org/10.3389/fbinf.2022.873478
A Brave New World: Virtual Reality and Augmented Reality in Systems Biology
- 1Department of Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, United States
- 2Faculty of Natural Sciences and Engineering, Sabancı University, Istanbul, Turkey
- 3Precision Immunology Institute, Icahn School of Medicine at Mount Sinai, New York, NY, United States
How we interact with computer graphics has not changed significantly since their invention: we still view 2D text and images on flatscreens. Yet, recent advances in computing technology, internetworked devices and gaming are driving the design and development of new ideas in other modes of human-computer interfaces (HCIs). Virtual Reality (VR) technology uses computers and HCIs to create the feeling of immersion in a three-dimensional (3D) environment that contains interactive objects with a sense of spatial presence, where objects have a spatial location relative to, and independent of, the user. While this virtual environment does not necessarily match the real world, by creating the illusion of reality it helps users leverage the full range of human sensory capabilities. Augmented Reality (AR), in turn, superimposes virtual images onto the real world. Because humans learn the physical world through gradual sensory familiarization, these immersive visualizations enable gaining familiarity with biological systems not realizable in the physical world (e.g., allosteric regulatory networks within a protein or biomolecular pathways inside a cell). As VR/AR interfaces are anticipated to grow explosively in consumer markets, systems biologists will become increasingly immersed in their world. Here we introduce a brief history of VR/AR, their current roles in systems biology, and their advantages and disadvantages in augmenting user abilities. We next argue that in systems biology, VR/AR technologies will be most useful in visually exploring and communicating data; performing virtual experiments; and education/teaching. Finally, we discuss our perspective on future directions for VR/AR in systems biology.
We see the world in three dimensions (3D) because we have binocular vision: our left and right eyes see slightly different views of an object, a topic explored from Euclid's 3rd century BC Optics and physician Galen's 2nd century AD On the Use of the Different Parts of the Human Body to da Vinci's (1452–1519) Trattato della Pittura (Treatise on Painting). The first device to produce 3D effects exploited binocularity with two mirrors centered at 45°, each reflecting a specific image to one eye. Invented in the 1830s by Charles Wheatstone, it was called the reflecting mirror stereoscope, from the Greek stereos (solid) and skopein (to see). Conceptually, Virtual Reality (VR) was first described by computing pioneer Sutherland (Sutherland, 1965), who created arguably the first VR head-mounted display (HMD), a large and bulky device that had to be mounted to the ceiling and could cause severe injury if it fell on a user, earning it the nickname the Sword of Damocles (Sutherland, 1968). The term Virtual Reality itself was coined later, in the 1980s, when the Visual Programming Lab of computer scientist Jaron Lanier began producing the first commercially available VR headsets and gloves. Since then, efforts to integrate the human body naturally into the virtual experience have driven significant advances in VR and Augmented Reality (AR) human-computer interfaces (HCIs), which mainly consist of input devices, output displays, and various hardware and software components. We summarize these technologies in Table 1.
Notable among these efforts, CAVE Automatic Virtual Environments (CAVEs) were introduced in the 1990s (Cruz-Neira et al., 1992); they project stereoscopic images onto the walls and floor of a room-sized cube. A CAVE can be thought of as Star Trek's Holodeck (Chu and Quek, 2013), though the name refers to the metaphor of Plato's cave in the Republic (Plato and Shorey, 1930), where a philosopher contemplates perception, reality, and illusion. Inside a CAVE, users wear Liquid-Crystal Display (LCD) shutter glasses and a head-tracker and interact with objects using a wand-like device, gaining an immersion that engages the full range of human vision with a much wider field-of-view (FOV) and enhanced depth and shape perception. Later, for massive data, large ultra-high-resolution matrices of multiple displays (either monitors or projectors), called Powerwalls, were also developed (Papadopoulos et al., 2015). Powerwalls and CAVEs are attractive as collaborative environments in which several investigators can simultaneously interact with VR objects. Yet they are expensive to build, maintain, and upgrade, occupy considerable space, and require many displays and massive processing power, making them cost-prohibitive. Thus, with these technologies alone, VR never became popular enough commercially to go mainstream (Wohlgenannt et al., 2020).
Until recently, VR was even considered a “dead technology” (Slater and Sanchez-Vives, 2016). However, recent technological advances, especially in gaming products such as Oculus Rift (Menlo Park, CA), HTC VIVE (Taiwan), and PlayStation VR (San Mateo, CA), have finally given VR good enough performance at relatively low prices, creating a positive feedback loop between companies developing ever more advanced technologies and expanding consumer demand, which has led to a tremendous jump in VR technology (Wohlgenannt et al., 2020; Kugler, 2021). For context, commercial VR systems in 2016 required expensive and difficult setups, including an HMD headset, controllers, and sensors connected to an external high-end graphics computer. In contrast, some current-generation VR sets such as Oculus Quest 2 (Menlo Park, CA) and the HTC VIVE Focus series (Taiwan), generally referred to as stand-alone VR systems or all-in-one solutions, do not depend on any external computer system at all (Wohlgenannt et al., 2020), further increasing accessibility. Coupled with very recent developments such as the popular metaverse concept, which combines multiple elements of VR/AR and internet technologies to achieve an extended reality by blending the physical and digital worlds (Lee et al., 2021), we anticipate that VR/AR technologies will finally go mainstream.
VR/AR in Systems Biology
In systems biology, we often seek to provide new insights that weave data on molecules, pathways, cells and tissues to whole organisms, populations, and ecosystems at multiple time- and length-scales. Furthermore, new high-throughput -omics experimental techniques are producing massive and diverse multi-omics datasets (Gillette et al., 2020; Kalayci et al., 2020; Petralia et al., 2020; Marx, 2021; Satpathy et al., 2021), and detailed data capture is boosted further by advances in supercomputers and tracking sensor technologies. Parallel increases in processor speeds and data storage enable computational analyses of these data (Schadt et al., 2010). New or updated visualization technologies are needed to explore and communicate these datasets. While not new in scientific visualization (Bryson, 1996; Simpson et al., 2000), VR technologies offer novel avenues for meeting these unprecedented data communication needs. However, much remains to be answered about when and how to use VR/AR technologies in systems biology.
To help answer these questions for different use cases, we first provide a summary of the advantages and disadvantages of using VR technologies in Table 2. Notably, their greatest advantage is in providing unparalleled presence: the sense of being inside of and interacting with the virtual environment (Slater and Wilbur, 1997; Schubert et al., 2001; Sutcliffe et al., 2005). Therefore, we recommend utilizing VR/AR technologies when spatial presence and immersive interactivity with the content make a difference in addressing user needs. In 2D, user comprehension becomes limited once data exceed domain-specific type and size thresholds, leading to data occlusion. In such cases, immersion enables 3D navigation and provides the perceptive depth necessary to enhance comprehension. For example, for multi-dimensional data, user studies have reported significantly better performance in immersive 3D vs. 2D environments for certain analysis tasks (Etemadpour et al., 2013). As researchers working at the interface of biology and data visualization, based on our own experience in this domain and our observations of general trends in VR applications in biology, which we briefly summarize here, we anticipate their utilization in three main areas in systems biology: 1) visually exploring and communicating data; 2) performing virtual experiments; and 3) education/teaching. We discuss each below.
Visually Exploring and Communicating Data
Systems biology visualizations are generally abstract representations, far removed from real-world objects. For instance, cellular pathway graphs are often cartoon representations. Yet they suffice to help us understand the biological phenomena they represent. In VR environments, users interpret such abstractions as real objects, and the imagery has a lasting impact on our brains. Furthermore, we can interact with virtual objects in ways that are not possible in the real world, at multiple scales, from single molecules (Leinen et al., 2015), protein-drug complexes (Norrby et al., 2015), and biomolecular networks (Liluashvili et al., 2017) to organs (Mirhosseini et al., 2014); navigate through them (Bellmund et al., 2016); or observe their dynamics (Nakano et al., 2016). VR technologies thus open the door to studying diverse phenomena that involve 3D spatial reconstructions, from localizations and dynamics in the human brain (Calì et al., 2016) to interactions in retinal pathology (Aaker et al., 2011) and volumetric studies in digital pathology (Farahani et al., 2016; Liimatainen et al., 2021). More recently, single-molecule localization microscopy (SMLM) in immersive VR has been used to visualize biological structures as point clouds at the molecular scale (e.g., vLUME by Spark et al., 2020). Some of these recent SMLM tools can be further extended to other similarly multidimensional, spatially localized point clouds (e.g., the Genuage tool by Blanc et al., 2020).
In systems biology, the knowledge, visualization, and exploration of 3D structural data often play an important role. To explore and understand the function of macromolecules (proteins, RNA, and DNA) and their complexes from their 3D structural models, molecular graphics is shifting to VR (Tse et al., 2011; Nakano et al., 2016; Ratamero et al., 2018; Cassidy et al., 2020; Todd and Emsley, 2021). Some of these applications further integrate relevant structural knowledge with domain-specific (Norrby et al., 2015) or genomics datasets (Zhang et al., 2019). While some molecular viewers work in CAVE environments (Block et al., 2009), others utilize the latest HMDs (Leinen et al., 2015); activate commands through game engines such as Unity and gesture devices such as Kinect and Leap Motion (Probst and Reymond, 2018; Zhang et al., 2019), or even voice recognition (Sabir et al., 2013); or provide web-based VR without head-tracking or advanced interactions (Li et al., 2014; Cassidy et al., 2020). Parallel efforts are also ongoing to study the chemical fingerprints of DrugBank compounds in VR environments (Probst and Reymond, 2018).
VR is useful even when exploring abstract systems-level data that do not contain 3D localizations. For example, researchers often employ complex network visualizations, which may require many viewpoints due to clutter and navigation issues. Several studies suggest that stereoscopy alone (Ware and Mitchell, 2008; Greffard et al., 2014; Kwon et al., 2015; Kwon et al., 2016) or combined with rotation (Sollenberger and Milgram, 1993) or motion cues (Ware and Mitchell, 2008) enhances comprehension, presents graphs better than 2D displays (Sollenberger and Milgram, 1993), and enables low user error rates (Ware and Mitchell, 2008). For example, Supplementary Figure S1 shows a relatively large network in 2D, and Supplementary Video S1 shows it in 3D. While both are generated using the network visualization tool iCAVE (Liluashvili et al., 2017; Kalayci and Gümüş, 2018), within iCAVE users can further interactively explore the 3D or stereo versions from multiple perspectives. Visualizations from multiple perspectives reportedly make different aspects of a system more salient (Ellis et al., 1991). Similarly, the BigTop tool (Westreich et al., 2020) renders Manhattan plots from genome-wide association studies (GWAS) in 3D.
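To illustrate why the extra dimension helps untangle cluttered graphs, the sketch below implements a minimal 3D force-directed (spring) layout in plain numpy. This is a generic textbook-style algorithm under our own simplifying assumptions, not the layout method used by iCAVE or BigTop:

```python
import numpy as np

def spring_layout_3d(edges, n_nodes, iters=200, seed=0):
    """Minimal 3D force-directed (spring) layout.

    Repulsive forces act between all node pairs and attractive forces
    along edges; the extra z dimension gives crowded graphs room to
    untangle compared with a flat 2D embedding.
    """
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, size=(n_nodes, 3))
    k = (1.0 / n_nodes) ** (1 / 3)  # ideal edge length
    for step in range(iters):
        disp = np.zeros_like(pos)
        # repulsion between every pair of nodes
        diff = pos[:, None, :] - pos[None, :, :]
        dist = np.linalg.norm(diff, axis=-1) + 1e-9
        disp += (diff / dist[..., None] * (k**2 / dist)[..., None]).sum(axis=1)
        # attraction along edges
        for u, v in edges:
            d = pos[u] - pos[v]
            f = np.linalg.norm(d) / k
            disp[u] -= d * f
            disp[v] += d * f
        # cap the per-step movement, cooling over time
        t = 0.1 * (1 - step / iters)
        length = np.linalg.norm(disp, axis=1, keepdims=True) + 1e-9
        pos += disp / length * np.minimum(length, t)
    return pos

# toy graph: two triangles joined by a bridge edge
edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3), (2, 3)]
coords = spring_layout_3d(edges, n_nodes=6)
print(coords.shape)  # (6, 3)
```

The resulting coordinates would then be handed to a renderer; an immersive viewer adds head-tracked navigation around the embedding, which is where the comprehension gains reported in the user studies above come from.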
More recently, VR has been used to explore multidimensional -omics datasets in systems biology, including cytometry, transcriptomics, epigenomics, proteomics, and their combinations, in the form of abstract data clouds. For example, single-cell RNA sequencing data analysis often includes a dimensionality reduction step, in which cell populations are projected into 2D or 3D space to explore cellular heterogeneity. Visualizing such datasets in 3D can be more informative, as it decreases the possibility of collapsing similar cell types and clusters. A recent tool, CellexalVR, allows visual exploration and analysis of such dimensionality reduction plots and associated metadata in immersive VR (Legetth et al., 2021). Other tools for the same purpose include starmapVR (Yang et al., 2020), singlecellVR (Stein et al., 2021) and Theia (Bressan et al., 2021). These platforms often include additional modalities, such as on-the-fly clustering or visualization of dynamic changes in RNA velocity. While singlecellVR and starmapVR are web applications that enable visualizations using low-cost and easily available VR hardware such as Google Cardboard (Yang et al., 2020; Stein et al., 2021), CellexalVR offers GPU-accelerated performance and in-session, on-the-fly calculations. StarmapVR further enables simultaneous visualization of spatial transcriptomics data alongside matching histological images. We anticipate that in the near future we will witness further developments in VR tools for the visual exploration of spatial transcriptomics datasets.
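As a minimal, hypothetical illustration of the dimensionality reduction step that such tools render as 3D point clouds (not the actual pipeline of CellexalVR, starmapVR, or singlecellVR), the following numpy sketch projects a synthetic cells-by-genes expression matrix onto its top three principal components:

```python
import numpy as np

def pca_3d(X):
    """Project a cells-by-genes expression matrix onto its top three
    principal components: the kind of 3D embedding an immersive
    viewer would render as a point cloud."""
    Xc = X - X.mean(axis=0)  # center each gene
    # SVD of the centered matrix gives the principal axes in Vt
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:3].T     # per-cell scores in 3D

# synthetic data: two "cell populations" with shifted mean expression
rng = np.random.default_rng(42)
pop_a = rng.normal(0.0, 1.0, size=(100, 50))
pop_b = rng.normal(3.0, 1.0, size=(100, 50))
X = np.vstack([pop_a, pop_b])

embedding = pca_3d(X)
print(embedding.shape)  # (200, 3)
```

In practice these tools use nonlinear methods such as UMAP or t-SNE rather than plain PCA, but the output is the same shape: one 3D coordinate per cell, plus metadata (cluster labels, gene expression) mapped to color or size in the virtual scene.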
Performing Virtual Experiments
For research studies that cannot be performed in the real world, VR provides a safe, standardized, and reproducible environment that is as life-like as possible (Tarr and Warren, 2002). In addition, we can break the laws of optics and physics, or disconnect what the user's body senses in real life from the world the user is experiencing (Tarr and Warren, 2002). Researchers have used such VR properties to study, modify, or enhance behavior. For example, neural processes research linking biology and behavior in different species, from insects to humans (Ravassard et al., 2013; Aghajan et al., 2015; Acharya et al., 2016), has used VR to help understand the sensory cues that carry information about virtual worlds, or to refine the rules that link a subject's actions to changes in their world. In addition to helping us understand human behavior, such research can in turn inform how VR environments can be better designed for increased human engagement. For example, studies suggest that the socially networked nature of VR should be considered in tool design as the number of remote users in virtual spaces increases (Kruzan and Won, 2019; Jeong et al., 2021).
In systems biology, virtual experiments can help develop and improve scientific thinking skills. Virtual reformulations of experiments as games to be solved have already proven effective in tackling scientific problems. For example, the tools Foldit (Foldit, 2022) and Eterna (Eterna, 2022) have gamified the protein folding and RNA structure prediction problems (Das et al., 2019), and thereby enabled more individuals to perform virtual experiments by engaging the online gaming community, whose members may have little to no scientific knowledge. Yet these gamers have successfully solved real-world problems on relatively short time scales (Cooper et al., 2010; Khatib et al., 2011; Eiben et al., 2012; Horowitz et al., 2016; Koepnick et al., 2019). Similarly, the tool EyeWire has gamified the mapping of neural circuits in the brain to understand vision: players virtually map 3D neuron structures onto serial electron microscopy image data from animal brains (Das et al., 2019; EyeWire, 2022). This game has so far attracted more than 150,000 gamers, who have helped reveal six new neuron types and many undiscovered brain circuits (EyeWire Into the Brain, 2022). VR environments open exciting new possibilities for such gamification of virtual scientific problems, both for scientists with little coding experience and for non-scientists, and for communication between these two communities. We have already started to observe the first examples of such tools in computational chemistry: Shannon et al. have introduced a molecular dynamics VR game that encourages users to explore the reactivity of a specific chemical system (Shannon et al., 2021), and Kingsley et al. have presented an intuitive VR platform called Nanome, in which users explore and modify chemical structures collaboratively to work on structural biology and molecular drug design problems (Kingsley et al., 2019).
The next several years will witness an increasing number of tools that gamify virtual experiments in VR environments. Note that in online gaming communities, VR is increasingly blended with social media functionalities; gamified systems biology VR tools will thus likely need to incorporate such additional functionalities, which are critical for remote users.
Education/Teaching
VR environments can help learning in systems biology areas that involve complex 3D information (Salzman et al., 1999; Mikropoulos and Natsis, 2011), user interactivity, and/or high computational skills. For example, understanding protein structure traditionally involves physical modeling kits or projections of 3D structures into 2D. However, the ability to create, alter, and rotate a chemical structure in 3D in real time can make abstract concepts easier to understand (Limniou et al., 2008). Similarly, annotated 3D web-based anatomy atlases help teach complex structures such as artery networks or bronchial trees. Earlier interactive 3D renderings of these systems used desktops with standard screens because of the cost of VR processors and displays (Li et al., 2012), while later technologies have enabled stereoscopic immersive 3D with real-time interactivity (Kockro et al., 2007). Randomized user studies show that stereo-enhanced 3D tools are useful in learning anatomy and are well received by students (Kockro et al., 2015; de Faria et al., 2016). Similarly, integrating AR technology reportedly has a positive impact on student learning in biology (Weng et al., 2020).
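The real-time manipulation described above reduces to applying rigid transformations to coordinate sets each frame. As a minimal illustration (a generic sketch with made-up coordinates, not the method of any tool cited here), the following rotates a toy set of atomic positions about one axis and verifies that the molecule's geometry is preserved:

```python
import numpy as np

def rotate_z(coords, angle_deg):
    """Rotate a set of 3D points about the z axis: the core operation
    behind interactively turning a molecular model in a viewer."""
    t = np.radians(angle_deg)
    R = np.array([[np.cos(t), -np.sin(t), 0.0],
                  [np.sin(t),  np.cos(t), 0.0],
                  [0.0,        0.0,       1.0]])
    return coords @ R.T

# toy "molecule": four atom positions (hypothetical coordinates)
atoms = np.array([[0.0, 0.0, 0.0],
                  [1.5, 0.0, 0.0],
                  [0.0, 1.5, 0.0],
                  [0.0, 0.0, 1.5]])

rotated = rotate_z(atoms, 90.0)

# a rigid rotation must preserve every interatomic distance
def pairwise(c):
    return np.linalg.norm(c[:, None, :] - c[None, :, :], axis=-1)

print(np.allclose(pairwise(atoms), pairwise(rotated)))  # True
```

In an HMD or stereo setup the same transformation runs per frame per eye, driven by controller or head-tracking input; the distance-preservation check is exactly why a rotated model remains a faithful representation of the structure being taught.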
Recent technological advances in VR have substantially increased its potential utility in learning. Biological concepts currently constitute ∼5% of academic publications on VR (Morimoto and Ponton, 2021). Relatively popular educational platforms include those that simulate biological and chemical experiments within VR environments, such as VRLab Academy (United Kingdom), Labster (Denmark), and ClassVR (United Kingdom). Advances in gaming have expanded VR applications in education as well. For example, the tool Peppy provides a Unity-based VR gaming engine for understanding protein structures and their dynamics in undergraduate biochemistry classes (Doak et al., 2020). Similarly, the Pepblock Builder VR tool provides a gamified interface for researchers who lack the advanced computational skills required for protein design (Yallapragada et al., 2021). Many educational VR experiences that recreate biological systems exist on popular gaming platforms, such as The Body VR, where players move through the bloodstream to observe human cells and learn how organelles function (The Body VR LLC, 2016), and InCell VR, where players fight to stop a virus invasion in a human while learning about cell and organelle microstructures (Luden.io, 2015). We anticipate that gaming-based VR tools will similarly be developed for learning from multi-omics datasets at a systems level. Further research will then be necessary to understand the full potential impact of VR on learning systems biology.
The recent explosion in VR/AR technologies has coincided with extended work-from-home practices due to the coronavirus disease 2019 global pandemic. These developments have lowered resistance to virtual technologies and in fact created a pressing need for virtual, collaborative workspaces in research. Coupled with the explosive increase in datasets collected from multi-omics high-throughput experiments, VR/AR technologies offer attractive new opportunities for visual data exploration and communication in systems biology. However, to develop the most useful tools, systems biologists need to conduct their own user studies and become familiar with design practices for improved human perception within virtual spaces (Cleveland and McGill, 1987). With deeper understanding of the brain and visual perception, content creation and best practices will be established, and adoption will increase. Further technological improvements (higher frame rates; efficient information storage and rendering; increased data transfer at lower bit rates; game engines; graphics cards) will aid challenging visualizations such as dynamic networks or multi-scale systems, integrated with data annotations and on-the-fly calculations. Visualization outcomes will in turn guide future research protocols.
VR application development is already easier with HMDs and input devices that offer free, game engine-compatible software development kits. These render information from internetworked devices that collect and exchange data via sensors and network connections (Rose, 2014; Akyildiz et al., 2015; Open Hybrid, 2022) or from the integration of multiple technologies (e.g., healthcare in cyberspace (Rosen et al., 2016)). We anticipate that new technologies will further eliminate the discomforts and limitations of modern VR headsets, such as their bulkiness and weight. Towards this end, current studies include skin sensory interfaces, such as nanowire-based soft wearable human-machine interfaces (HMIs) (Wang et al., 2021), and thin, lightweight holographic optics with high-performance, sunglasses-like near-eye full-color displays, as developed by Facebook Reality Labs (Maimone and Wang, 2020). Such new technologies may further remove the barriers between the virtual and real worlds by eliminating headset use, thereby converging VR/AR (Kugler, 2021).
In summary, we are at a critical juncture for VR/AR use in systems biology, as these technologies are finally on the verge of going mainstream. We anticipate that the current trend towards utilizing low-cost VR/AR systems will continue. Still, for certain research areas, interactive 3D applications in Web3D will likely suffice. For some applications, AR will be preferable, as it allows users to overlay virtual data onto the real world in relatively simple set-ups such as smartphones, without the need for HMDs, and provides more control over their surroundings (Garcia-Bonete et al., 2019). VR will likely remain advantageous in applications that require better immersion and realism (Garcia-Bonete et al., 2019). Barriers to accessing VR/AR visualizations in systems biology will likely remain for researchers in underdeveloped countries, however, and will need to be addressed. At the same time, with the increasing trend towards gamification within VR environments (Shannon et al., 2021), barriers will decrease for scientists with little computational expertise, and for non-scientists, in conducting their own virtual experiments. As the user community grows and commercial VR/AR technologies expand, we expect the range of their systems biology applications to continue to grow as well, opening possibilities for significant advances in understanding and communicating disease-associated mechanisms, running virtual experiments, and education, and helping boost the development of new therapies. Of course, the best way to gauge possibilities is to explore them!
Data Availability Statement
The original contributions presented in the study are included in the article/Supplementary Material; further inquiries can be directed to the corresponding author.
Author Contributions
ZG conceptualized the study. ZG and BT wrote the original draft, edited, and revised the manuscript. ZG was responsible for supervision, project administration, and funding acquisition.
Funding
This study was supported by the Concern Foundation Conquer Cancer Now award and Cancer Moonshot R33 award # CA263705-01 to ZG.
Conflict of Interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher's Note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Supplementary Material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fbinf.2022.873478/full#supplementary-material
References
Aaker, G. D., Gracia, L., Myung, J. S., Borcherding, V., Banfelder, J. R., D'Amico, D. J., et al. (2011). Volumetric Three-Dimensional Reconstruction and Segmentation of Spectral-Domain OCT. Ophthalmic Surg. Lasers Imaging 42, S116–S120. doi:10.3928/15428877-20110627-11
Acharya, L., Aghajan, Z. M., Vuong, C., Moore, J. J., and Mehta, M. R. (2016). Causal Influence of Visual Cues on Hippocampal Directional Selectivity. Cell 164, 197–207. doi:10.1016/j.cell.2015.12.015
Aghajan, Z. M., Acharya, L., Moore, J. J., Cushman, J. D., Vuong, C., and Mehta, M. R. (2015). Impaired Spatial Selectivity and Intact Phase Precession in Two-Dimensional Virtual Reality. Nat. Neurosci. 18, 121–128. doi:10.1038/nn.3884
Balakrishnan, B., and Sundar, S. S. (2011). Where Am I? How Can I Get There? Impact of Navigability and Narrative Transportation on Spatial Presence. Human–Computer Interaction 26, 161–204. doi:10.1080/07370024.2011.601689
Blanc, T., el Beheiry, M., Caporal, C., Masson, J. B., and Hajj, B. (2020). Genuage: Visualize and Analyze Multidimensional Single-Molecule point Cloud Data in Virtual Reality. Nat. Methods 17, 1100–1102. doi:10.1038/s41592-020-0946-1
Block, J. N., Zielinski, D. J., Chen, V. B., Davis, I. W., Vinson, E. C., Brady, R., et al. (2009). KinImmerse: Macromolecular VR for NMR Ensembles. Source Code Biol. Med. 4, 3. doi:10.1186/1751-0473-4-3
Bressan, D., Mulvey, C. M., Qosaj, F., Becker, R., Grimaldi, F., Coffey, S., et al. (2021). Exploration and Analysis of Molecularly Annotated, 3D Models of Breast Cancer at Single-Cell Resolution Using Virtual Reality. bioRxiv [Preprint]. doi:10.1101/2021.06.28.448342
Calì, C., Baghabra, J., Boges, D. J., Holst, G. R., Kreshuk, A., Hamprecht, F. A., et al. (2016). Three-dimensional Immersive Virtual Reality for Studying Cellular Compartments in 3D Models from EM Preparations of Neural Tissues. J. Comp. Neurol. 524, 23–38. doi:10.1002/cne.23852
Cassidy, K. C., Šefčík, J., Raghav, Y., Chang, A., and Durrant, J. D. (2020). ProteinVR: Web-Based Molecular Visualization in Virtual Reality. Plos Comput. Biol. 16, e1007747. doi:10.1371/journal.pcbi.1007747
Chu, S. L., and Quek, F. (2013). “Information Holodeck: Thinking in Technology Ecologies,” in Human-Computer Interaction – INTERACT 2013 – Lecture Notes in Computer Science Book Series. Editors P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, and M. Winckler (Berlin, Heidelberg: Springer Berlin Heidelberg), Vol. 8117, 167–184. doi:10.1007/978-3-642-40483-2_12
Cleveland, W. S., and McGill, R. (1987). Graphical Perception: The Visual Decoding of Quantitative Information on Graphical Displays of Data. J. R. Statistical Society. Series A (General) 150, 192. doi:10.2307/2981473
Cummings, J. J., and Bailenson, J. N. (2016). How Immersive Is Enough? A Meta-Analysis of the Effect of Immersive Technology on User Presence. Media Psychol. 19, 272–309. doi:10.1080/15213269.2015.1015740
Das, R., Keep, B., Washington, P., and Riedel-Kruse, I. H. (2019). Scientific Discovery Games for Biomedical Research. Annu. Rev. Biomed. Data Sci. 2, 253–279. doi:10.1146/annurev-biodatasci-072018-021139
de Faria, J. W., Teixeira, M. J., de Moura Sousa Júnior, L., Otoch, J. P., and Figueiredo, E. G. (2016). Virtual and Stereoscopic Anatomy: when Virtual Reality Meets Medical Education. J. Neurosurg. 125, 1105–1111. doi:10.3171/2015.8.JNS141563
Doak, D. G., Denyer, G. S., Gerrard, J. A., Mackay, J. P., and Allison, J. R. (2020). Peppy: A Virtual Reality Environment for Exploring the Principles of Polypeptide Structure. Protein Sci. 29, 157–168. doi:10.1002/pro.3752
Eiben, C. B., Siegel, J. B., Bale, J. B., Cooper, S., Khatib, F., Shen, B. W., et al. (2012). Increased Diels-Alderase Activity through Backbone Remodeling Guided by Foldit Players. Nat. Biotechnol. 30, 190–192. doi:10.1038/nbt.2109
Ellis, S. R., Tharp, G. K., Grunwald, A. J., and Smith, S. (1991). Exocentric Judgements in Real Environments and Stereoscopic Displays. Proc. Hum. Factors Soc. Annu. Meet. 35, 1442–1446. doi:10.1177/154193129103502005
Etemadpour, R., Monson, E., and Linsen, L. (2013). “The Effect of Stereoscopic Immersive Environments on Projection-Based Multi-Dimensional Data Visualization,” in 2013 17th International Conference on Information Visualisation, London, United Kingdom, July 16–18, 2013 (IEEE), 389–397. doi:10.1109/IV.2013.51
Eterna (2022). Eterna Solve Puzzles Invent Medicine. Available at: https://eternagame.org (Accessed February 4, 2022).
EyeWire (2022). EyeWire A Game to Map the Brain. Available at: https://eyewire.org (Accessed February 4, 2022).
EyeWire Into the Brain (2022). Into the Brain about EyeWire. Available at: https://science.eyewire.org/about (Accessed February 4, 2022).
Farahani, N., Post, R., Duboy, J., Ahmed, I., Kolowitz, B. J., Krinchai, T., et al. (2016). Exploring Virtual Reality Technology and the Oculus Rift for the Examination of Digital Pathology Slides. J. Pathol. Inform. 7, 22. doi:10.4103/2153-3539.181766
Foldit (2022). Foldit Solve Puzzles for Science. Available at: https://fold.it (Accessed February 4, 2022).
Garcia-Bonete, M. J., Jensen, M., and Katona, G. (2019). A Practical Guide to Developing Virtual and Augmented Reality Exercises for Teaching Structural Biology. Biochem. Mol. Biol. Educ. 47, 16–24. doi:10.1002/bmb.21188
Gillette, M. A., Satpathy, S., Cao, S., Dhanasekaran, S. M., Vasaikar, S. V., Krug, K., et al. (2020). Proteogenomic Characterization Reveals Therapeutic Vulnerabilities in Lung Adenocarcinoma. Cell 182, 200–225.e35. doi:10.1016/j.cell.2020.06.013
Greffard, N., Picarougne, F., and Kuntz, P. (2014). “Beyond the Classical Monoscopic 3D in Graph Analytics: An Experimental Study of the Impact of Stereoscopy,” in 2014 IEEE VIS International Workshop on 3DVis (3DVis), Paris, France, November 9, 2014 (IEEE), 19–24. doi:10.1109/3DVis.2014.7160095
Horowitz, S., Koepnick, B., Martin, R., Tymieniecki, A., Winburn, A. A., Cooper, S., et al. (2016). Determining crystal Structures through Crowdsourcing and Coursework. Nat. Commun. 7, 12549. doi:10.1038/ncomms12549
Kalayci, S., Petralia, F., Wang, P., and Gümüş, Z. H. (2020). ProNetView-ccRCC: A Web-Based Portal to Interactively Explore Clear Cell Renal Cell Carcinoma Proteogenomics Networks. Proteomics 20, e2000043. doi:10.1002/pmic.202000043
Khatib, F., DiMaio, F., Cooper, S., Kazmierczyk, M., et al. (2011). Crystal Structure of a Monomeric Retroviral Protease Solved by Protein Folding Game Players. Nat. Struct. Mol. Biol. 18, 1175–1177. doi:10.1038/nsmb.2119
Kingsley, L. J., Brunet, V., Lelais, G., McCloskey, S., Milliken, K., Leija, E., et al. (2019). Development of a Virtual Reality Platform for Effective Communication of Structural Data in Drug Discovery. J. Mol. Graph. Model. 89, 234–241. doi:10.1016/j.jmgm.2019.03.010
Kockro, R. A., Amaxopoulou, C., Killeen, T., Wagner, W., Reisch, R., Schwandt, E., et al. (2015). Stereoscopic Neuroanatomy Lectures Using a Three-Dimensional Virtual Reality Environment. Ann. Anat. 201, 91–98. doi:10.1016/j.aanat.2015.05.006
Kockro, R. A., Stadie, A., Schwandt, E., Reisch, R., Charalampaki, C., Ng, I., et al. (2007). A Collaborative Virtual Reality Environment for Neurosurgical Planning and Training. Neurosurgery 61, 379–391. doi:10.1227/01.neu.0000303997.12645.26
Kwon, O.-H., Muelder, C., Lee, K., and Ma, K.-L. (2015). “Spherical Layout and Rendering Methods for Immersive Graph Visualization,” in 2015 IEEE Pacific Visualization Symposium (PacificVis), Hangzhou, China, April 14–17, 2015 (IEEE), 63–67. doi:10.1109/PACIFICVIS.2015.7156357
Kwon, O. H., Muelder, C., Lee, K., and Ma, K. L. (2016). A Study of Layout, Rendering, and Interaction Methods for Immersive Graph Visualization. IEEE Trans. Vis. Comput. Graph. 22, 1802–1815. doi:10.1109/TVCG.2016.2520921
Lee, L.-H., Braud, T., Zhou, P., Wang, L., Xu, D., Lin, Z., et al. (2021). All One Needs to Know about Metaverse: A Complete Survey on Technological Singularity, Virtual Ecosystem, and Research Agenda. arXiv [Preprint]. Available at: https://arxiv.org/abs/2110.05352.
Legetth, O., Rodhe, J., Lang, S., Dhapola, P., Wallergård, M., and Soneji, S. (2021). CellexalVR: A Virtual Reality Platform to Visualize and Analyze Single-Cell Omics Data. iScience 24, 103251. doi:10.1016/j.isci.2021.103251
Leinen, P., Green, M. F., Esat, T., Wagner, C., Tautz, F. S., and Temirov, R. (2015). Virtual Reality Visual Feedback for Hand-Controlled Scanning Probe Microscopy Manipulation of Single Molecules. Beilstein J. Nanotechnol. 6, 2148–2153. doi:10.3762/bjnano.6.220
Li, J., Nie, L., Li, Z., Lin, L., Tang, L., and Ouyang, J. (2012). Maximizing Modern Distribution of Complex Anatomical Spatial Information: 3D Reconstruction and Rapid Prototype Production of Anatomical Corrosion Casts of Human Specimens. Anat. Sci. Educ. 5, 330–339. doi:10.1002/ase.1287
Liimatainen, K., Latonen, L., Valkonen, M., Kartasalo, K., and Ruusuvuori, P. (2021). Virtual Reality for 3D Histology: Multi-Scale Visualization of Organs with Interactive Feature Exploration. BMC cancer 21, 1133. doi:10.1186/s12885-021-08542-9
Liluashvili, V., Kalayci, S., Fluder, E., Wilson, M., Gabow, A., and Gümüş, Z. H. (2017). iCAVE: an Open Source Tool for Visualizing Biomolecular Networks in 3D, Stereoscopic 3D and Immersive 3D. GigaScience 6, 1–13. doi:10.1093/gigascience/gix054
McIntire, J. P., and Liggett, K. K. (2014). “The (Possible) Utility of Stereoscopic 3D Displays for Information Visualization: The Good, the Bad, and the Ugly,” in 2014 IEEE VIS International Workshop on 3DVis (3DVis), Paris, France, November 9, 2014 (IEEE), 1–9. doi:10.1109/3DVis.2014.7160093
McKendrick, R., Parasuraman, R., Murtza, R., Formwalt, A., Baccus, W., Paczynski, M., et al. (2016). Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy. Front. Hum. Neurosci. 10, 216. doi:10.3389/fnhum.2016.00216
Mirhosseini, K., Sun, Q., Gurijala, K. C., Laha, B., and Kaufman, A. E. (2014). “Benefits of 3D Immersion for Virtual Colonoscopy,” in 2014 IEEE VIS International Workshop on 3DVis (3DVis), Paris, France, November 9, 2014 (IEEE), 75–79. doi:10.1109/3DVis.2014.7160105
Nakano, C. M., Moen, E., Byun, H. S., Ma, H., Newman, B., McDowell, A., et al. (2016). iBET: Immersive Visualization of Biological Electron-Transfer Dynamics. J. Mol. Graph. Model. 65, 94–99. doi:10.1016/j.jmgm.2016.02.009
Open Hybrid (2022). Open Hybrid. Available at: http://openhybrid.org (Accessed February 4, 2022).
Petralia, F., Tignor, N., Reva, B., Koptyra, M., Chowdhury, S., Rykunov, D., et al. (2020). Integrated Proteogenomic Characterization across Major Histological Types of Pediatric Brain Cancer. Cell 183, 1962–1985.e31. doi:10.1016/j.cell.2020.10.044
Ratamero, E. M., Bellini, D., Dowson, C. G., and Römer, R. A. (2018). Touching Proteins with Virtual Bare Hands: Visualizing Protein-Drug Complexes and Their Dynamics in Self-Made Virtual Reality Using Gaming Hardware. J. Comput. Aided Mol. Des. 32, 703–709. doi:10.1007/s10822-018-0123-0
Ravassard, P., Kees, A., Willers, B., Ho, D., Aharoni, D. A., Cushman, J., et al. (2013). Multisensory Control of Hippocampal Spatiotemporal Selectivity. Science 340, 1342–1346. doi:10.1126/science.1232655
Rosen, J. M., Kun, L., Mosher, R. E., Grigg, E., Merrell, R. C., Macedonia, C., et al. (2016). Cybercare 2.0: Meeting the Challenge of the Global Burden of Disease in 2030. Health Technol. (Berl) 6, 35–51. doi:10.1007/s12553-016-0132-8
Sabir, K., Stolte, C., Tabor, B., and O'Donoghue, S. I. (2013). “The Molecular Control Toolkit: Controlling 3D Molecular Graphics via Gesture and Voice,” in 2013 IEEE Symposium on Biological Data Visualization (BioVis), Atlanta, GA, October 13–14, 2013 (IEEE), 49–56. doi:10.1109/BioVis.2013.6664346
Salzman, M. C., Dede, C., Loftin, R. B., and Chen, J. (1999). A Model for Understanding How Virtual Reality Aids Complex Conceptual Learning. Presence: Teleoperators & Virtual Environments 8, 293–316. doi:10.1162/105474699566242
Satpathy, S., Krug, K., Jean Beltran, P. M., Savage, S. R., Petralia, F., Kumar-Sinha, C., et al. (2021). A Proteogenomic Portrait of Lung Squamous Cell Carcinoma. Cell 184, 4348–4371.e40. doi:10.1016/j.cell.2021.07.016
Schubert, T., Friedmann, F., and Regenbrecht, H. (2001). The Experience of Presence: Factor Analytic Insights. Presence: Teleoperators & Virtual Environments 10, 266–281. doi:10.1162/105474601300343603
Shannon, R. J., Deeks, H. M., Burfoot, E., Clark, E., Jones, A. J., Mulholland, A. J., et al. (2021). Exploring Human-Guided Strategies for Reaction Network Exploration: Interactive Molecular Dynamics in Virtual Reality as a Tool for Citizen Scientists. J. Chem. Phys. 155, 154106. doi:10.1063/5.0062517
Simpson, R. M., LaViola, J. J., Laidlaw, D. H., Forsberg, A. S., and van Dam, A. (2000). Immersive VR for Scientific Visualization: a Progress Report. IEEE Comput. Graph. Appl. 20, 26–52. doi:10.1109/38.888006
Slater, M., Linakis, V., Usoh, M., and Kooper, R. (1996). “Immersion, Presence and Performance in Virtual Environments,” in Proceedings of the ACM Symposium on Virtual Reality Software and Technology - VRST ’96 (New York, USA: ACM Press), 163–172. doi:10.1145/3304181.3304216
Slater, M., and Wilbur, S. (1997). A Framework for Immersive Virtual Environments (FIVE): Speculations on the Role of Presence in Virtual Environments. Presence: Teleoperators & Virtual Environments 6, 603–616. doi:10.1162/pres.1997.6.6.603
Spark, A., Kitching, A., Esteban-Ferrer, D., Handa, A., Carr, A. R., Needham, L. M., et al. (2020). vLUME: 3D Virtual Reality for Single-Molecule Localization Microscopy. Nat. Methods 17, 1097–1099. doi:10.1038/s41592-020-0962-1
Stein, D. F., Chen, H., Vinyard, M. E., Qin, Q., Combs, R. D., Zhang, Q., et al. (2021). singlecellVR: Interactive Visualization of Single-Cell Data in Virtual Reality. Front. Genet. 12, 764170. doi:10.3389/fgene.2021.764170
Sutherland, I. E. (1968). “A Head-Mounted Three Dimensional Display,” in Proceedings of the December 9-11, 1968, fall joint computer conference, part I on - AFIPS ’68 (Fall, part I) (New York, USA: ACM Press), 757–764. doi:10.1145/1476589.1476686
Tse, C.-M., Li, H., Leung, K.-S., Lee, K.-H., and Wong, M.-H. (2011). “Interactive Drug Design in Virtual Reality,” in 2011 15th International Conference on Information Visualisation, London, United Kingdom, July 13–15, 2011 (IEEE), 226–231. doi:10.1109/IV.2011.72
Wang, K., Yap, L. W., Gong, S., Wang, R., Wang, S. J., and Cheng, W. (2021). Nanowire-Based Soft Wearable Human-Machine Interfaces for Future Virtual and Augmented Reality Applications. Adv. Funct. Mater. 31, 2008347. doi:10.1002/adfm.202008347
Weng, C., Otanga, S., Christianto, S. M., and Chu, R. J.-C. (2020). Enhancing Students' Biology Learning by Using Augmented Reality as a Learning Supplement. J. Educ. Comput. Res. 58, 747–770. doi:10.1177/0735633119884213
Wirth, W., Hartmann, T., Böcking, S., Vorderer, P., Klimmt, C., Schramm, H., et al. (2007). A Process Model of the Formation of Spatial Presence Experiences. Media Psychol. 9, 493–525. doi:10.1080/15213260701283079
Yallapragada, V. V. B., Xu, T., Walker, S. P., Tabirca, S., and Tangney, M. (2021). Pepblock Builder VR - an Open-Source Tool for Gaming-Based Bio-Edutainment in Interactive Protein Design. Front. Bioeng. Biotechnol. 9, 674211. doi:10.3389/fbioe.2021.674211
Keywords: virtual reality, augmented reality, visualization design, 3D, immersive 3D, multi-omics visualization, CAVE, systems biology
Citation: Turhan B and Gümüş ZH (2022) A Brave New World: Virtual Reality and Augmented Reality in Systems Biology. Front. Bioinform. 2:873478. doi: 10.3389/fbinf.2022.873478
Received: 10 February 2022; Accepted: 02 March 2022;
Published: 06 April 2022.
Edited by: Sean O’Donoghue, Garvan Institute of Medical Research, Australia
Reviewed by: Jan Egger, University Hospital Essen, Germany
Copyright © 2022 Turhan and Gümüş. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Zeynep H. Gümüş, firstname.lastname@example.org