PERSPECTIVE article

Front. Bioinform., 06 April 2022
Sec. Data Visualization
Volume 2 - 2022 | https://doi.org/10.3389/fbinf.2022.873478

A Brave New World: Virtual Reality and Augmented Reality in Systems Biology

  • 1Department of Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, United States
  • 2Faculty of Natural Sciences and Engineering, Sabancı University, Istanbul, Turkey
  • 3Precision Immunology Institute, Icahn School of Medicine at Mount Sinai, New York, NY, United States

How we interact with computer graphics has not changed significantly since their invention: we still view 2D text and images on a flat screen. Yet recent advances in computing technology, internetworked devices and gaming are driving the design and development of new modes of human-computer interfaces (HCIs). Virtual Reality (VR) technology uses computers and HCIs to create the feeling of immersion in a three-dimensional (3D) environment that contains interactive objects with a sense of spatial presence, where objects have a spatial location relative to, and independent of, the user. While this virtual environment does not necessarily match the real world, by creating the illusion of reality it helps users leverage the full range of human sensory capabilities. Similarly, Augmented Reality (AR) superimposes virtual images onto the real world. Because humans learn the physical world through gradual sensory familiarization, these immersive visualizations enable users to gain familiarity with biological systems not realizable in the physical world (e.g., allosteric regulatory networks within a protein or biomolecular pathways inside a cell). As VR/AR interfaces are anticipated to expand explosively in consumer markets, systems biologists will become increasingly immersed in their world. Here we introduce a brief history of VR/AR, their current roles in systems biology, and their advantages and disadvantages in augmenting user abilities. We next argue that in systems biology, VR/AR technologies will be most useful in visually exploring and communicating data; performing virtual experiments; and education/teaching. Finally, we discuss our perspective on future directions for VR/AR in systems biology.

Background

We see the world in three dimensions (3D) because we have binocular vision: our left and right eyes see slightly different views of an object, a topic explored from Euclid’s 3rd century BC Optics and physician Galen’s 2nd century AD On the Use of the Different Parts of the Human Body to da Vinci’s (1452–1519) Trattato della Pittura (A Treatise on Painting). The first device that produced 3D effects exploited binocularity by using two mirrors angled at 45°, each reflecting a specific image to one eye. Invented in the 1830s by Charles Wheatstone, it was called the reflecting mirror stereoscope, from the Greek stereos (solid) and skopein (to view). Conceptually, Virtual Reality (VR) was first described by computing pioneer Sutherland (Sutherland, 1965), who created arguably the first VR head-mounted display (HMD)—a large, bulky device that had to be mounted to the ceiling and could cause severe injury if it fell on a user, earning it the nickname the Sword of Damocles (Sutherland, 1968). However, the term Virtual Reality itself was coined later, in the 1980s, when VPL Research, the company of computer scientist Jaron Lanier, began producing the first commercially available VR headsets and gloves. Since then, efforts to integrate the human body naturally into the virtual experience have driven significant advances in VR and Augmented Reality (AR) human-computer interfaces (HCIs), which mainly consist of input devices, output displays, and various hardware and software components. We summarize these technologies in Table 1.

TABLE 1. Input tracking and output display technologies in consumer-level VR/AR systems.

Notable within these efforts, CAVE Automatic Virtual Environments (CAVEs) were introduced in the 1990s (Cruz-Neira et al., 1992), projecting stereoscopic images onto the walls and floor of a room-sized cube. A CAVE can be thought of as Star Trek’s Holodeck (Chu and Quek, 2013), though the name refers to the metaphor of Plato’s cave in the Republic (Plato and Shorey, 1930), where a philosopher contemplates perception, reality, and illusion. Inside a CAVE, users wear Liquid-Crystal Display (LCD) shutter glasses and a head-tracker and interact with objects using a wand-like device; the resulting immersion exploits the full range of human vision, with a much wider field-of-view (FOV) and enhanced depth and shape perception. Later, for massive datasets, large ultra-high-resolution matrices of multiple displays (either monitors or projectors), called Powerwalls, were also developed (Papadopoulos et al., 2015). Powerwalls and CAVEs are attractive as collaborative environments in which several investigators can simultaneously interact with VR objects. Yet, they are expensive to build, maintain and upgrade, occupy considerable space, and require many displays and massive processing power, making them cost-prohibitive. Thus, with only these technologies available, VR never became popular enough commercially to go mainstream (Wohlgenannt et al., 2020).

Until recently, VR was even considered a “dead technology” (Slater and Sanchez-Vives, 2016). However, recent technological advances, especially in gaming products such as the Oculus Rift (Menlo Park, CA), HTC VIVE (Taiwan), and PlayStation VR (San Mateo, CA), have finally given VR good enough performance at relatively affordable prices. This has created a positive feedback loop between companies developing more advanced technologies and expanding consumer demand, leading to a tremendous jump in VR technology (Wohlgenannt et al., 2020; Kugler, 2021). For context, commercial VR systems in 2016 required expensive and difficult setups, including an HMD headset, controllers, and sensors connected to an external high-end graphics computer. In contrast, some current new-generation VR sets such as the Oculus Quest 2 (Menlo Park, CA) and HTC VIVE Focus series (Taiwan), generally referred to as stand-alone or all-in-one VR systems, do not depend on any external computer at all (Wohlgenannt et al., 2020), increasing accessibility further. Coupled with very recent developments such as the popular metaverse concept, which combines multiple elements of VR/AR and internet technologies to achieve an extended reality that blends the physical and digital worlds (Lee et al., 2021), we anticipate that VR/AR technologies will finally go mainstream.

VR/AR in Systems Biology

In systems biology, we often seek to provide new insights that weave data on molecules, pathways, cells and tissues to whole organisms, populations, and ecosystems across multiple time- and length-scales. Furthermore, new high-throughput omics experimental techniques are producing massive and diverse multi-omics datasets (Gillette et al., 2020; Kalayci et al., 2020; Petralia et al., 2020; Marx, 2021; Satpathy et al., 2021), and detailed data capture is boosted further by advances in supercomputers and tracking sensor technologies. Parallel increases in processor speeds and data storage enable computational analyses of these data (Schadt et al., 2010). New or updated visualization technologies are needed to explore and communicate these datasets. While not new to scientific visualization (Bryson, 1996; Simpson et al., 2000), VR technologies offer novel avenues for addressing these unprecedented data communication needs. However, many questions remain about when and how to use VR/AR technologies in systems biology.

To help answer these questions for different use cases, we first provide a summary of the advantages and disadvantages of using VR technologies in Table 2. Notably, their greatest advantage is in providing unparalleled presence—the sense of being inside and interacting with the virtual environment (Slater and Wilbur, 1997; Schubert et al., 2001; Sutcliffe et al., 2005). Therefore, we recommend utilizing VR/AR technologies when spatial presence and immersive interactivity with the content make a difference in addressing user needs. In 2D, user comprehension becomes limited above domain-specific thresholds of data type and size, as data points begin to occlude one another. In such cases, immersion enables 3D navigation and provides the perceptive depth necessary to enhance comprehension. For example, for multi-dimensional data, user studies have reported significantly better performance in immersive 3D vs. 2D environments for certain analysis tasks (Etemadpour et al., 2013). As researchers working at the interface of biology and data visualization, based on our own experience in this domain and on the general trends in VR applications in biology that we briefly summarize here, we anticipate their utilization in three main areas of systems biology, which we discuss below: 1) visually exploring and communicating data; 2) performing virtual experiments; and 3) education/teaching.

TABLE 2. Advantages and Disadvantages of VR technologies.

Visually Exploring and Communicating Data

Systems biology visualizations are generally abstract representations, far from real-world objects. For instance, cellular pathway graphs are often cartoon representations. Yet, they suffice to help us understand the biological phenomena they represent. In VR environments, users interpret such abstractions as real objects, and the imagery has a lasting impact on our brains. Furthermore, we can interact with virtual objects in ways that are not possible in the real world, at multiple scales from single molecules (Leinen et al., 2015), protein-drug complexes (Norrby et al., 2015) and biomolecular networks (Liluashvili et al., 2017) to organs (Mirhosseini et al., 2014); navigate through them (Bellmund et al., 2016); or observe their dynamics (Nakano et al., 2016). VR technologies thus open the door to studying diverse phenomena that involve 3D spatial reconstructions, from localizations and dynamics in the human brain (Calì et al., 2016) to interactions in retinal pathology (Aaker et al., 2011) and volumetric studies in digital pathology (Farahani et al., 2016; Liimatainen et al., 2021). More recently, immersive VR has been used to visualize single-molecule localization microscopy (SMLM) data as point clouds of biological structures at the molecular scale (e.g., vLUME by Spark et al., 2020). Some of these recent SMLM tools can be further extended to other similarly multidimensional, spatially localized point clouds (e.g., the Genuage tool by Blanc et al., 2020).
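
To make the point-cloud idea above concrete, here is a minimal sketch in Python that converts a table of 3D single-molecule localizations into an ASCII PLY point cloud, a generic format that many desktop and VR viewers can import. The input file name and its x/y/z column names are hypothetical, and this is not the import pipeline of vLUME or Genuage.

```python
# Sketch: convert a table of 3D single-molecule localizations (x, y, z)
# into an ASCII PLY point cloud that many desktop and VR viewers can import.
# The input file name and column names are hypothetical; vLUME and Genuage
# have their own import formats, which this does not reproduce.
import csv

def localizations_to_ply(csv_path: str, ply_path: str) -> None:
    points = []
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            points.append((float(row["x"]), float(row["y"]), float(row["z"])))

    with open(ply_path, "w") as out:
        # Minimal ASCII PLY header describing one vertex element with x/y/z floats
        out.write("ply\nformat ascii 1.0\n")
        out.write(f"element vertex {len(points)}\n")
        out.write("property float x\nproperty float y\nproperty float z\n")
        out.write("end_header\n")
        for x, y, z in points:
            out.write(f"{x} {y} {z}\n")

if __name__ == "__main__":
    localizations_to_ply("localizations.csv", "localizations.ply")
```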

In systems biology, the knowledge, visualization, and exploration of related 3D structural data often play an important role. To explore and understand the function of macromolecules (proteins, RNA, and DNA) and their complexes from their 3D structural models, molecular graphics is shifting to VR (Tse et al., 2011; Nakano et al., 2016; Ratamero et al., 2018; Cassidy et al., 2020; Todd and Emsley, 2021). Some of these applications further integrate relevant structural knowledge with domain-specific (Norrby et al., 2015) or genomics datasets (Zhang et al., 2019). While some molecular viewers work in CAVE environments (Block et al., 2009), others utilize the latest HMDs (Leinen et al., 2015), game engines such as Unity, gesture devices such as Kinect and Leap Motion (Probst and Reymond, 2018; Zhang et al., 2019), or even voice recognition (Sabir et al., 2013) to activate commands; still others provide web-based VR without head-tracking or advanced interactions (Li et al., 2014; Cassidy et al., 2020). Parallel efforts are also ongoing to study the chemical fingerprints of DrugBank compounds in VR environments (Probst and Reymond, 2018).
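
As a hedged illustration of preparing structural data for such viewers, the sketch below uses Biopython to read a PDB file and export atom names, elements, and Cartesian coordinates as JSON that a custom Unity- or WebXR-based viewer could consume. The file names and JSON schema are assumptions for illustration only; they do not correspond to the formats used by ProteinVR, Nanome, or the other tools cited above.

```python
# Sketch: extract atomic coordinates from a PDB file with Biopython and write
# them as JSON for import into a custom VR viewer (e.g., built in Unity or WebXR).
# The output schema here is illustrative only, not any published tool's format.
import json
from Bio.PDB import PDBParser  # pip install biopython

def pdb_to_json(pdb_path: str, json_path: str, structure_id: str = "model") -> None:
    parser = PDBParser(QUIET=True)
    structure = parser.get_structure(structure_id, pdb_path)

    atoms = []
    for atom in structure.get_atoms():
        x, y, z = atom.coord  # Cartesian coordinates in Angstroms (numpy array)
        atoms.append({
            "element": atom.element,
            "name": atom.get_name(),
            "x": float(x), "y": float(y), "z": float(z),
        })

    with open(json_path, "w") as out:
        json.dump({"id": structure_id, "atoms": atoms}, out)

if __name__ == "__main__":
    # Hypothetical input/output file names
    pdb_to_json("1ubq.pdb", "1ubq_atoms.json", structure_id="1ubq")
```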

VR is useful even when exploring abstract systems-level data that do not contain 3D localizations. For example, researchers often employ complex network visualizations, which may require many viewpoints due to clutter and navigation issues. Several studies suggest that stereoscopy alone (Ware and Mitchell, 2008; Greffard et al., 2014; Kwon et al., 2015; Kwon et al., 2016) or combined with rotation (Sollenberger and Milgram, 1993) or motion cues (Ware and Mitchell, 2008) enhances comprehension, presents graphs better than 2D displays (Sollenberger and Milgram, 1993), and keeps user error rates low (Ware and Mitchell, 2008). For example, Supplementary Figure S1 shows a relatively large network in 2D, and Supplementary Video S1 shows the same network in 3D. Both are generated using the network visualization tool iCAVE (Liluashvili et al., 2017; Kalayci and Gümüş, 2018), within which users can further interactively explore the 3D or stereoscopic versions from multiple perspectives. Visualizations from multiple perspectives reportedly make different aspects of a system more salient (Ellis et al., 1991). Similarly, the BigTop tool (Westreich et al., 2020) renders Manhattan plots from genome-wide association studies (GWAS) in 3D.
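
For readers who want to experiment with 3D network views, the sketch below computes a three-dimensional force-directed layout with networkx and writes node coordinates to a CSV file that a 3D or stereoscopic viewer could load. This is not iCAVE’s own pipeline; the edge-list input and the output format are illustrative assumptions.

```python
# Sketch: compute a 3D force-directed layout for a biomolecular network with
# networkx and export node coordinates for a 3D/stereoscopic viewer.
# This is not iCAVE's pipeline; input and output formats are illustrative.
import csv
import networkx as nx

def layout_network_3d(edge_list_path: str, out_csv: str, seed: int = 42) -> None:
    # Edge list file: one whitespace-separated "source target" pair per line
    graph = nx.read_edgelist(edge_list_path)

    # Fruchterman-Reingold force-directed layout in three dimensions
    positions = nx.spring_layout(graph, dim=3, seed=seed)

    with open(out_csv, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["node", "x", "y", "z"])
        for node, (x, y, z) in positions.items():
            writer.writerow([node, x, y, z])

if __name__ == "__main__":
    # Hypothetical file names
    layout_network_3d("network_edges.txt", "network_layout_3d.csv")
```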

More recently, VR has been used to explore multidimensional omics datasets in systems biology, including cytometry, transcriptomics, epigenomics, proteomics and their combinations, in the form of abstract data clouds. For example, single-cell RNA sequencing data analysis often includes a dimensionality reduction step, in which cell populations are projected into 2D or 3D space to explore cellular heterogeneity. Visualizing such datasets in 3D can be more informative, as it decreases the possibility of collapsing similar cell types and clusters. A recent tool, CellexalVR, allows visual exploration and analysis of such dimensionality reduction plots and associated metadata in immersive VR (Legetth et al., 2021). Other tools for the same purpose include starmapVR (Yang et al., 2020), singlecellVR (Stein et al., 2021) and Theia (Bressan et al., 2021). These platforms often include additional modalities such as on-the-fly clustering or visualization of dynamic changes in RNA velocity. While singlecellVR and starmapVR are web applications that enable visualizations using low-cost and easily available VR hardware such as Google Cardboard (Yang et al., 2020; Stein et al., 2021), CellexalVR offers GPU-accelerated performance and in-session, on-the-fly calculations. StarmapVR further enables simultaneous visualization of spatial transcriptomics data together with matching histological images. We anticipate that in the near future we will witness further developments in VR tools for the visual exploration of spatial transcriptomics datasets.
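
As a minimal sketch of the 3D embedding step described above, the following Python snippet uses scanpy, assuming a hypothetical AnnData file of raw counts, to compute a three-component UMAP and export per-cell coordinates and cluster labels for a generic VR point-cloud viewer. CellexalVR, singlecellVR and starmapVR provide their own data-preparation pipelines, which this does not reproduce.

```python
# Sketch: project single-cell RNA-seq data into a 3D UMAP embedding with scanpy
# and export the coordinates for a VR point-cloud viewer. File names are
# placeholders; dedicated VR tools ship their own converters.
import scanpy as sc

adata = sc.read_h5ad("pbmc_counts.h5ad")  # hypothetical AnnData file of raw counts

# Standard preprocessing: normalize, log-transform, select informative genes, reduce with PCA
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000, subset=True)
sc.pp.pca(adata, n_comps=50)

# Neighborhood graph, clustering, and a 3-component (rather than 2D) UMAP embedding
sc.pp.neighbors(adata, n_neighbors=15)
sc.tl.leiden(adata)
sc.tl.umap(adata, n_components=3)

# Export one row per cell: 3D coordinates plus cluster label, ready for a point-cloud viewer
coords = adata.obsm["X_umap"]
with open("umap3d_cells.csv", "w") as out:
    out.write("cell,x,y,z,cluster\n")
    for name, (x, y, z), cluster in zip(adata.obs_names, coords, adata.obs["leiden"]):
        out.write(f"{name},{x},{y},{z},{cluster}\n")
```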

Performing Virtual Experiments

For research studies that cannot be performed in the real world, VR provides a safe, standardized, and reproducible environment that is as life-like as possible (Tarr and Warren, 2002). In addition, we can break the laws of optics and physics, or disconnect what the user’s body senses in real life from the world the user is experiencing (Tarr and Warren, 2002). Researchers have been using such VR properties to study, modify or enhance behavior. For example, research on the neural processes that link biology and behavior in different species, from insects to humans (Ravassard et al., 2013; Aghajan et al., 2015; Acharya et al., 2016), has used VR to help understand the sensory cues that carry information about virtual worlds, or to refine the rules that link a subject’s actions to changes in their world. In addition to assisting our understanding of human behavior, such research can in turn inform how VR environments can be designed for increased human engagement. For example, studies suggest that the socially networked nature of VR should be considered in tool design as the number of remote users in virtual spaces increases (Kruzan and Won, 2019; Jeong et al., 2021).

In systems biology, virtual experiments can help develop and improve scientific thinking skills. Virtual reformulations of experiments in the form of games to be solved have already proven effective in tackling scientific problems. For example, the tools Foldit (Foldit, 2022) and Eterna (Eterna, 2022) have gamified the protein folding and RNA structure prediction problems (Das et al., 2019), and thereby enabled more individuals to perform virtual experiments by engaging the online gaming community, whose members may have little to no scientific background. Yet, these gamers have successfully solved real-world problems on relatively short time scales (Cooper et al., 2010; Khatib et al., 2011; Eiben et al., 2012; Horowitz et al., 2016; Koepnick et al., 2019). Similarly, the tool EyeWire has gamified mapping neural circuits in the brain to understand vision: players virtually map 3D neuron structures onto serial electron microscopy image data from animal brains (Das et al., 2019; EyeWire, 2022). This game has so far attracted more than 150,000 gamers, who have helped reveal six new neuron types and many previously undiscovered brain circuits (EyeWire Into the Brain, 2022). VR environments open exciting new possibilities for such gamification of virtual scientific problems, both for scientists with little coding experience and for non-scientists, as well as for communication between these two communities. We have already started to observe the first examples of such tools in computational chemistry: Shannon et al. introduced a molecular dynamics VR game that encourages users to explore the reactivity of a specific chemical system (Shannon et al., 2021), and Kingsley et al. presented an intuitive VR platform called Nanome, in which users collaboratively explore and modify chemical structures to work on structural biology and molecular drug design problems (Kingsley et al., 2019). The next several years will witness an increasing number of tools that gamify virtual experiments in VR environments. Note that in online gaming communities, VR is increasingly blended with social media functionalities; gamified systems biology VR tools will thus likely need to incorporate such additional functionalities, which are critical for remote users.

Education/Teaching

VR environments can help learning in systems biology areas that involve complex 3D information (Salzman et al., 1999; Mikropoulos and Natsis, 2011), user interactivity and/or high computational skill requirements. For example, understanding protein structure traditionally involves physical modeling kits or projections of 3D structures into 2D. However, the ability to create, alter, and rotate a chemical structure in real time in 3D can make it easier to understand abstract concepts (Limniou et al., 2008). Similarly, annotated 3D web-based anatomy atlases help teach complex structures such as artery networks or bronchial trees. Earlier interactive 3D renderings of these systems used desktops with standard screens due to the costs of VR processors and displays (Li et al., 2012), while later technologies have enabled stereoscopic immersive 3D with real-time interactivity (Kockro et al., 2007). Randomized user studies show that stereo-enhanced 3D tools are useful in learning anatomy and are well received by students (Kockro et al., 2015; de Faria et al., 2016). Similarly, integrating AR technology reportedly has a positive impact on student learning in biology (Weng et al., 2020).

Recent technological advances in VR have substantially increased its potential utility in learning. Biological concepts currently constitute ∼5% of academic publications on VR (Morimoto and Ponton, 2021). Relatively popular educational platforms include those that simulate biological and chemical experiments within VR environments, such as VRLab Academy (United Kingdom), Labster (Denmark), and ClassVR (United Kingdom). Advances in gaming have expanded VR applications in education as well. For example, the tool Peppy provides a Unity-based VR gaming engine for understanding protein structures and their dynamics in undergraduate biochemistry classes (Doak et al., 2020). Similarly, the Pepblock Builder VR tool provides a gamified interface for researchers who lack the advanced computational skills required for protein design (Yallapragada et al., 2021). Many educational VR experiences exist on popular gaming platforms that recreate biological systems, such as The Body VR, where players move through the bloodstream to observe human cells and learn how organelles function (The Body VR LLC, 2016), and InCell VR, where players fight to stop a virus invasion in a human while learning about cell and organelle microstructures (Luden.io, 2015). We anticipate that gaming-based VR tools will similarly be developed for learning about multi-omics datasets at the systems level. Further research will then be necessary to understand the full potential impact of VR in learning systems biology.

Discussion

The recent explosion in VR/AR technologies has coincided with extended work-from-home practices due to the coronavirus disease 2019 global pandemic. These developments have lowered resistance to virtual technologies and in fact created a pressing need for virtual, collaborative workspaces in research. Coupled with the explosive increase in datasets collected from multi-omics high-throughput experiments, VR/AR technologies offer attractive new opportunities for visual data exploration and communication in systems biology. However, to develop the most useful tools, systems biologists need to conduct their own user studies and become familiar with design practices that improve human perception within virtual spaces (Cleveland and McGill, 1987). With a deeper understanding of the brain and visual perception, content creation and best practices will be established, and adoption will increase. Further technological improvements (higher frame rates; efficient information storage and rendering; increased data transfer at lower bit rates; game engines; graphics cards) will aid challenging visualizations such as dynamic networks or multi-scale systems, integrated with data annotations and on-the-fly calculations. Visualization outcomes will in turn guide future research protocols.

VR application development is already easier with HMDs and input devices that offer game engine-compatible free software development kits. These can render information from internetworked devices that collect and exchange data through sensors and network connections (Rose, 2014; Akyildiz et al., 2015; Open Hybrid, 2022) or from integrations of multiple technologies (e.g., healthcare in cyberspace (Rosen et al., 2016)). We anticipate that new technologies will further eliminate the discomforts and limitations of modern VR headsets, such as their bulkiness and weight. Towards this end, current studies include skin sensory interfaces, such as nanowire-based soft wearable human-machine interfaces (Wang et al., 2021), and thin, lightweight holographic optics for high-performance, sunglasses-like, near-eye full-color displays, as developed by Facebook Reality Labs (Maimone and Wang, 2020). Such new technologies may further remove the barriers between the virtual and real worlds by eliminating headset use, thereby converging VR and AR (Kugler, 2021).

In summary, we are currently at a critical juncture for VR/AR use in systems biology, as these technologies are finally on the verge of going mainstream. We anticipate that the current trend towards utilizing low-cost VR/AR systems will continue. Still, for certain research areas, interactive 3D applications in Web3D will likely suffice. For some applications, AR will be preferable, as it allows users to overlay virtual data onto the real world using relatively simple setups such as smartphones, without the need for HMDs, and gives them more control over their surroundings (Garcia-Bonete et al., 2019). VR will likely remain advantageous in applications that require better immersion and realism (Garcia-Bonete et al., 2019). However, barriers to accessing VR/AR visualizations in systems biology will likely remain for researchers from underdeveloped countries and will need to be addressed. At the same time, with the increasing trend towards gamification within VR environments (Shannon et al., 2021), barriers will decrease for scientists with little computational expertise and for non-scientists to conduct their own virtual experiments. As the user community grows and commercial VR/AR technologies expand, we expect the range of their systems biology applications to continue to grow as well, opening possibilities for significant advances in understanding and communicating disease-associated mechanisms, running virtual experiments, and education, and helping to boost the development of new therapies. Of course, the best way to gauge possibilities is to explore them!

Data Availability Statement

The original contributions presented in the study are included in the article/Supplementary Material; further inquiries can be directed to the corresponding author.

Author Contributions

ZG conceptualized the study. ZG and BT wrote the original draft and edited and revised the manuscript. ZG was responsible for supervision, project administration, and funding acquisition.

Funding

This study was supported by the Concern Foundation Conquer Cancer Now award and Cancer Moonshot R33 award # CA263705-01 to ZG.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fbinf.2022.873478/full#supplementary-material

References

Aaker, G. D., Gracia, L., Myung, J. S., Borcherding, V., Banfelder, J. R., D'Amico, D. J., et al. (2011). Volumetric Three-Dimensional Reconstruction and Segmentation of Spectral-Domain OCT. Ophthalmic Surg. Lasers Imaging 42, S116–S120. doi:10.3928/15428877-20110627-11

Acharya, L., Aghajan, Z. M., Vuong, C., Moore, J. J., and Mehta, M. R. (2016). Causal Influence of Visual Cues on Hippocampal Directional Selectivity. Cell 164, 197–207. doi:10.1016/j.cell.2015.12.015

Aghajan, Z. M., Acharya, L., Moore, J. J., Cushman, J. D., Vuong, C., and Mehta, M. R. (2015). Impaired Spatial Selectivity and Intact Phase Precession in Two-Dimensional Virtual Reality. Nat. Neurosci. 18, 121–128. doi:10.1038/nn.3884

Akyildiz, I., Pierobon, M., Balasubramaniam, S., and Koucheryavy, Y. (2015). The Internet of Bio-Nano Things. IEEE Commun. Mag. 53, 32–40. doi:10.1109/MCOM.2015.7060516

Balakrishnan, B., and Sundar, S. S. (2011). Where Am I? How Can I Get There? Impact of Navigability and Narrative Transportation on Spatial Presence. Human–Computer Interaction 26, 161–204. doi:10.1080/07370024.2011.601689

Bellmund, J. L., Deuker, L., Navarro Schröder, T., and Doeller, C. F. (2016). Grid-cell Representations in Mental Simulation. eLife 5, e17089. doi:10.7554/eLife.17089

Blanc, T., el Beheiry, M., Caporal, C., Masson, J. B., and Hajj, B. (2020). Genuage: Visualize and Analyze Multidimensional Single-Molecule point Cloud Data in Virtual Reality. Nat. Methods 17, 1100–1102. doi:10.1038/s41592-020-0946-1

Block, J. N., Zielinski, D. J., Chen, V. B., Davis, I. W., Vinson, E. C., Brady, R., et al. (2009). KinImmerse: Macromolecular VR for NMR Ensembles. Source Code Biol. Med. 4, 3. doi:10.1186/1751-0473-4-3

Bressan, D., Mulvey, C. M., Qosaj, F., Becker, R., Grimaldi, F., Coffey, S., et al. (2021). Exploration and Analysis of Molecularly Annotated, 3D Models of Breast Cancer at Single-Cell Resolution Using Virtual Reality. bioRxiv [Preprint]. doi:10.1101/2021.06.28.448342

Bryson, S. (1996). Virtual Reality in Scientific Visualization. Commun. ACM 39, 62–71. doi:10.1145/229459.229467

Calì, C., Baghabra, J., Boges, D. J., Holst, G. R., Kreshuk, A., Hamprecht, F. A., et al. (2016). Three-dimensional Immersive Virtual Reality for Studying Cellular Compartments in 3D Models from EM Preparations of Neural Tissues. J. Comp. Neurol. 524, 23–38. doi:10.1002/cne.23852

Cassidy, K. C., Šefčík, J., Raghav, Y., Chang, A., and Durrant, J. D. (2020). ProteinVR: Web-Based Molecular Visualization in Virtual Reality. Plos Comput. Biol. 16, e1007747. doi:10.1371/journal.pcbi.1007747

Chu, S. L., and Quek, F. (2013). “Information Holodeck: Thinking in Technology Ecologies,” in Human-Computer Interaction – INTERACT 2013 – Lecture Notes in Computer Science Book Series. Editors P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, and M. Winckler (Berlin, Heidelberg: Springer Berlin Heidelberg), Vol. 8117, 167–184. doi:10.1007/978-3-642-40483-2_12

Cleveland, W. S., and McGill, R. (1987). Graphical Perception: The Visual Decoding of Quantitative Information on Graphical Displays of Data. J. R. Statistical Society. Series A (General) 150, 192. doi:10.2307/2981473

Cooper, S., Khatib, F., Treuille, A., Barbero, J., Lee, J., Beenen, M., et al. (2010). Predicting Protein Structures with a Multiplayer Online Game. Nature 466, 756–760. doi:10.1038/nature09304

Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. v., and Hart, J. C. (1992). The CAVE: Audio Visual Experience Automatic Virtual Environment. Commun. ACM 35, 64–72. doi:10.1145/129888.129892

Cummings, J. J., and Bailenson, J. N. (2016). How Immersive Is Enough? A Meta-Analysis of the Effect of Immersive Technology on User Presence. Media Psychol. 19, 272–309. doi:10.1080/15213269.2015.1015740

Das, R., Keep, B., Washington, P., and Riedel-Kruse, I. H. (2019). Scientific Discovery Games for Biomedical Research. Annu. Rev. Biomed. Data Sci. 2, 253–279. doi:10.1146/annurev-biodatasci-072018-021139

de Faria, J. W., Teixeira, M. J., de Moura Sousa Júnior, L., Otoch, J. P., and Figueiredo, E. G. (2016). Virtual and Stereoscopic Anatomy: when Virtual Reality Meets Medical Education. J. Neurosurg. 125, 1105–1111. doi:10.3171/2015.8.JNS141563

Doak, D. G., Denyer, G. S., Gerrard, J. A., Mackay, J. P., and Allison, J. R. (2020). Peppy: A Virtual Reality Environment for Exploring the Principles of Polypeptide Structure. Protein Sci. 29, 157–168. doi:10.1002/pro.3752

Eiben, C. B., Siegel, J. B., Bale, J. B., Cooper, S., Khatib, F., Shen, B. W., et al. (2012). Increased Diels-Alderase Activity through Backbone Remodeling Guided by Foldit Players. Nat. Biotechnol. 30, 190–192. doi:10.1038/nbt.2109

Ellis, S. R., Tharp, G. K., Grunwald, A. J., and Smith, S. (1991). Exocentric Judgements in Real Environments and Stereoscopic Displays. Proc. Hum. Factors Soc. Annu. Meet. 35, 1442–1446. doi:10.1177/154193129103502005

Etemadpour, R., Monson, E., and Linsen, L. (2013). “The Effect of Stereoscopic Immersive Environments on Projection-Based Multi-Dimensional Data Visualization,” in 2013 17th International Conference on Information Visualisation, London, United Kingdom, July 16–18, 2013 (IEEE), 389–397. doi:10.1109/IV.2013.51

Eterna (2022). Eterna Solve Puzzles Invent Medicine. Available at: https://eternagame.org (Accessed February 4, 2022).

EyeWire (2022). EyeWire A Game to Map the Brain. Available at: https://eyewire.org (Accessed February 4, 2022).

EyeWire Into the Brain (2022). Into the Brain about EyeWire. Available at: https://science.eyewire.org/about (Accessed February 4, 2022).

Farahani, N., Post, R., Duboy, J., Ahmed, I., Kolowitz, B. J., Krinchai, T., et al. (2016). Exploring Virtual Reality Technology and the Oculus Rift for the Examination of Digital Pathology Slides. J. Pathol. Inform. 7, 22. doi:10.4103/2153-3539.181766

Foldit (2022). Foldit Solve Puzzles for Science. Available at: https://fold.it (Accessed February 4, 2022).

Garcia-Bonete, M. J., Jensen, M., and Katona, G. (2019). A Practical Guide to Developing Virtual and Augmented Reality Exercises for Teaching Structural Biology. Biochem. Mol. Biol. Educ. 47, 16–24. doi:10.1002/bmb.21188

Gillette, M. A., Satpathy, S., Cao, S., Dhanasekaran, S. M., Vasaikar, S. V., Krug, K., et al. (2020). Proteogenomic Characterization Reveals Therapeutic Vulnerabilities in Lung Adenocarcinoma. Cell 182, 200–225.e35. doi:10.1016/j.cell.2020.06.013

Greffard, N., Picarougne, F., and Kuntz, P. (2014). “Beyond the Classical Monoscopic 3D in Graph Analytics: An Experimental Study of the Impact of Stereoscopy,” in 2014 IEEE VIS International Workshop on 3DVis (3DVis), Paris, France, November 9, 2014 (IEEE), 19–24. doi:10.1109/3DVis.2014.7160095

Horowitz, S., Koepnick, B., Martin, R., Tymieniecki, A., Winburn, A. A., Cooper, S., et al. (2016). Determining crystal Structures through Crowdsourcing and Coursework. Nat. Commun. 7, 12549. doi:10.1038/ncomms12549

Jeong, D. C., Kim, S. S. Y., Xu, J. J., and Miller, L. C. (2021). Protean Kinematics: A Blended Model of VR Physics. Front. Psychol. 12, 705170. doi:10.3389/fpsyg.2021.705170

Kalayci, S., and Gümüş, Z. H. (2018). Exploring Biological Networks in 3D, Stereoscopic 3D, and Immersive 3D with iCAVE. Curr. Protoc. Bioinformatics 61, 8.26.1–8.27.26. doi:10.1002/cpbi.47

Kalayci, S., Petralia, F., Wang, P., and Gümüş, Z. H. (2020). ProNetView-ccRCC: A Web-Based Portal to Interactively Explore Clear Cell Renal Cell Carcinoma Proteogenomics Networks. Proteomics 20, e2000043. doi:10.1002/pmic.202000043

Khatib, F., DiMaio, F., Cooper, S., Kazmierczyk, M., et al. (2011). Crystal Structure of a Monomeric Retroviral Protease Solved by Protein Folding Game Players. Nat. Struct. Mol. Biol. 18, 1175–1177. doi:10.1038/nsmb.2119

Kingsley, L. J., Brunet, V., Lelais, G., McCloskey, S., Milliken, K., Leija, E., et al. (2019). Development of a Virtual Reality Platform for Effective Communication of Structural Data in Drug Discovery. J. Mol. Graph Model. 89, 234–241. doi:10.1016/j.jmgm.2019.03.010

Kockro, R. A., Amaxopoulou, C., Killeen, T., Wagner, W., Reisch, R., Schwandt, E., et al. (2015). Stereoscopic Neuroanatomy Lectures Using a Three-Dimensional Virtual Reality Environment. Ann. Anat. 201, 91–98. doi:10.1016/j.aanat.2015.05.006

Kockro, R. A., Stadie, A., Schwandt, E., Reisch, R., Charalampaki, C., Ng, I., et al. (2007). A Collaborative Virtual Reality Environment for Neurosurgical Planning and Training. Neurosurgery 61, 379–391. doi:10.1227/01.neu.0000303997.12645.26

Koepnick, B., Flatten, J., Husain, T., Ford, A., Silva, D. A., Bick, M. J., et al. (2019). De Novo protein Design by Citizen Scientists. Nature 570, 390–394. doi:10.1038/s41586-019-1274-4

Kruzan, K. P., and Won, A. S. (2019). Embodied Well-Being through Two media Technologies: Virtual Reality and Social media. New Media Soc. 21, 1734–1749. doi:10.1177/1461444819829873

Kugler, L. (2021). The State of Virtual Reality Hardware. Commun. ACM 64, 15–16. doi:10.1145/3441290

Kwon, O.-H., Muelder, C., Lee, K., and Ma, K.-L. (2015). “Spherical Layout and Rendering Methods for Immersive Graph Visualization,” in 2015 IEEE Pacific Visualization Symposium (PacificVis), Hangzhou, China, April 14–17, 2015 (IEEE), 63–67. doi:10.1109/PACIFICVIS.2015.7156357

Kwon, O. H., Muelder, C., Lee, K., and Ma, K. L. (2016). A Study of Layout, Rendering, and Interaction Methods for Immersive Graph Visualization. IEEE Trans. Vis. Comput. Graph 22, 1802–1815. doi:10.1109/TVCG.2016.2520921

Lee, K. M. (2004). Presence, Explicated. Commun. Theor. 14, 27–50. doi:10.1111/j.1468-2885.2004.tb00302.x

Lee, L.-H., Braud, T., Zhou, P., Wang, L., Xu, D., Lin, Z., et al. (2021). All One Needs to Know about Metaverse: A Complete Survey on Technological Singularity, Virtual Ecosystem, and Research Agenda. [Preprint] https://arxiv.org/abs/2110.05352.

Legetth, O., Rodhe, J., Lang, S., Dhapola, P., Wallergård, M., and Soneji, S. (2021). CellexalVR: A Virtual Reality Platform to Visualize and Analyze Single-Cell Omics Data. iScience 24, 103251. doi:10.1016/j.isci.2021.103251

Leinen, P., Green, M. F., Esat, T., Wagner, C., Tautz, F. S., and Temirov, R. (2015). Virtual Reality Visual Feedback for Hand-Controlled Scanning Probe Microscopy Manipulation of Single Molecules. Beilstein J. Nanotechnol 6, 2148–2153. doi:10.3762/bjnano.6.220

Li, H., Leung, K. S., Nakane, T., and Wong, M. H. (2014). Iview: an Interactive WebGL Visualizer for Protein-Ligand Complex. BMC Bioinformatics 15, 56. doi:10.1186/1471-2105-15-56

Li, J., Nie, L., Li, Z., Lin, L., Tang, L., and Ouyang, J. (2012). Maximizing Modern Distribution of Complex Anatomical Spatial Information: 3D Reconstruction and Rapid Prototype Production of Anatomical Corrosion Casts of Human Specimens. Anat. Sci. Educ. 5, 330–339. doi:10.1002/ase.1287

Liimatainen, K., Latonen, L., Valkonen, M., Kartasalo, K., and Ruusuvuori, P. (2021). Virtual Reality for 3D Histology: Multi-Scale Visualization of Organs with Interactive Feature Exploration. BMC cancer 21, 1133. doi:10.1186/s12885-021-08542-9

Liluashvili, V., Kalayci, S., Fluder, E., Wilson, M., Gabow, A., and Gümüs, Z. H. (2017). iCAVE: an Open Source Tool for Visualizing Biomolecular Networks in 3D, Stereoscopic 3D and Immersive 3D. GigaScience 6, 1–13. doi:10.1093/gigascience/gix054

Limniou, M., Roberts, D., and Papadopoulos, N. (2008). Full Immersive Virtual Environment CAVETM in Chemistry Education. Comput. Educ. 51, 584–593. doi:10.1016/j.compedu.2007.06.014

Luden.io. (2015). InCell VR [Video Game]. Luden.io (Accessed February 4, 2022).

Maimone, A., and Wang, J. (2020). Holographic Optics for Thin and Lightweight Virtual Reality. ACM Trans. Graph. 39, 67:1–67:14. doi:10.1145/3386569.3392416

Marx, V. (2021). Method of the Year: Spatially Resolved Transcriptomics. Nat. Methods 18, 9–14. doi:10.1038/s41592-020-01033-y

McIntire, J. P., and Liggett, K. K. (2014). “The (Possible) Utility of Stereoscopic 3D Displays for Information Visualization: The Good, the Bad, and the Ugly,” in 2014 IEEE VIS International Workshop on 3DVis (3DVis), Paris, France, November 9, 2014 (IEEE), 1–9. doi:10.1109/3DVis.2014.7160093

McKendrick, R., Parasuraman, R., Murtza, R., Formwalt, A., Baccus, W., Paczynski, M., et al. (2016). Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy. Front. Hum. Neurosci. 10, 216. doi:10.3389/fnhum.2016.00216

Mikropoulos, T. A., and Natsis, A. (2011). Educational Virtual Environments: A Ten-Year Review of Empirical Research (1999-2009). Comput. Educ. 56, 769–780. doi:10.1016/j.compedu.2010.10.020

Mirhosseini, K., Sun, Q., Gurijala, K. C., Laha, B., and Kaufman, A. E. (2014). “Benefits of 3D Immersion for Virtual Colonoscopy,” in IEEE VIS International Workshop on 3DVis (3DVis) (IEEE), 75–79. doi:10.1109/3DVis.2014.7160105

Morimoto, J., and Ponton, F. (2021). Virtual Reality in Biology: Could We Become Virtual Naturalists. Evo Edu Outreach 14, 7. doi:10.1186/s12052-021-00147-x

Nakano, C. M., Moen, E., Byun, H. S., Ma, H., Newman, B., McDowell, A., et al. (2016). iBET: Immersive Visualization of Biological Electron-Transfer Dynamics. J. Mol. Graph Model. 65, 94–99. doi:10.1016/j.jmgm.2016.02.009

Norrby, M., Grebner, C., Eriksson, J., and Boström, J. (2015). Molecular Rift: Virtual Reality for Drug Designers. J. Chem. Inf. Model. 55, 2475–2484. doi:10.1021/acs.jcim.5b00544

Open Hybrid (2022). Open Hybrid. Available at: http://openhybrid.org (Accessed February 4, 2022).

Papadopoulos, C., Petkov, K., Kaufman, A. E., and Mueller, K. (2015). The Reality Deck – an Immersive Gigapixel Display. IEEE Comput. Graph Appl. 35, 33–45. doi:10.1109/MCG.2014.80

Petralia, F., Tignor, N., Reva, B., Koptyra, M., Chowdhury, S., Rykunov, D., et al. (2020). Integrated Proteogenomic Characterization across Major Histological Types of Pediatric Brain Cancer. Cell 183, 1962–1985.e31. doi:10.1016/j.cell.2020.10.044

Plato, and Shorey, P. (1930). The Republic/Plato; with an English Translation by Paul Shorey. London: W. Heinemann.

Probst, D., and Reymond, J. L. (2018). Exploring DrugBank in Virtual Reality Chemical Space. J. Chem. Inf. Model. 58, 1731–1735. doi:10.1021/acs.jcim.8b00402

Ratamero, E. M., Bellini, D., Dowson, C. G., and Römer, R. A. (2018). Touching Proteins with Virtual Bare Hands : Visualizing Protein-Drug Complexes and Their Dynamics in Self-Made Virtual Reality Using Gaming Hardware. J. Comput. Aided Mol. Des. 32, 703–709. doi:10.1007/s10822-018-0123-0

Ravassard, P., Kees, A., Willers, B., Ho, D., Aharoni, D. A., Cushman, J., et al. (2013). Multisensory Control of Hippocampal Spatiotemporal Selectivity. Science 340, 1342–1346. doi:10.1126/science.1232655

Renner, R. S., Velichkovsky, B. M., and Helmert, J. R. (2013). The Perception of Egocentric Distances in Virtual Environments - A Review. ACM Comput. Surv. 46, 1–40. doi:10.1145/2543581.2543590

Rose, D. (2014). Enchanted Objects - Design, Human Desire and the Internet of Things. New York: Scribner.

Rosen, J. M., Kun, L., Mosher, R. E., Grigg, E., Merrell, R. C., Macedonia, C., et al. (2016). Cybercare 2.0: Meeting the challenge of the Global burden of Disease in 2030. Health Technol. (Berl) 6, 35–51. doi:10.1007/s12553-016-0132-8

Sabir, K., Stolte, C., Tabor, B., and O'Donoghue, S. I. (2013). “The Molecular Control Toolkit: Controlling 3D Molecular Graphics via Gesture and Voice,” in 2013 IEEE Symposium on Biological Data Visualization (BioVis), Atlanta, GA, October 13–14, 2013 (IEEE), 49–56. doi:10.1109/BioVis.2013.6664346

Salzman, M. C., Dede, C., Loftin, R. B., and Chen, J. (1999). A Model for Understanding How Virtual Reality Aids Complex Conceptual Learning. Presence: Teleoperators & Virtual Environments 8, 293–316. doi:10.1162/105474699566242

Satpathy, S., Krug, K., Jean Beltran, P. M., Savage, S. R., Petralia, F., Kumar-Sinha, C., et al. (2021). A Proteogenomic Portrait of Lung Squamous Cell Carcinoma. Cell 184, 4348–4371.e40. doi:10.1016/j.cell.2021.07.016

Schadt, E. E., Linderman, M. D., Sorenson, J., Lee, L., and Nolan, G. P. (2010). Computational Solutions to Large-Scale Data Management and Analysis. Nat. Rev. Genet. 11, 647–657. doi:10.1038/nrg2857

Schubert, T., Friedmann, F., and Regenbrecht, H. (2001). The Experience of Presence: Factor Analytic Insights. Presence: Teleoperators & Virtual Environments 10, 266–281. doi:10.1162/105474601300343603

Shannon, R. J., Deeks, H. M., Burfoot, E., Clark, E., Jones, A. J., Mulholland, A. J., et al. (2021). Exploring Human-Guided Strategies for Reaction Network Exploration: Interactive Molecular Dynamics in Virtual Reality as a Tool for Citizen Scientists. J. Chem. Phys. 155, 154106. doi:10.1063/5.0062517

Simpson, R. M., LaViola, J. J., Laidlaw, D. H., Forsberg, A. S., and van Dam, A. (2000). Immersive VR for Scientific Visualization: a Progress Report. IEEE Comput. Grap. Appl. 20, 26–52. doi:10.1109/38.888006

Slater, M., Linakis, V., Usoh, M., and Kooper, R. (1996). “Immersion, Presence and Performance in Virtual Environments,” in Proceedings of the ACM Symposium on Virtual Reality Software and Technology - VRST ’96 (New York, USA: ACM Press), 163–172. doi:10.1145/3304181.3304216

Slater, M., and Sanchez-Vives, M. v. (2016). Enhancing Our Lives with Immersive Virtual Reality. Front. Robot. AI 3, 74. doi:10.3389/frobt.2016.00074

Slater, M., and Wilbur, S. (1997). A Framework for Immersive Virtual Environments (FIVE): Speculations on the Role of Presence in Virtual Environments. Presence: Teleoperators & Virtual Environments 6, 603–616. doi:10.1162/pres.1997.6.6.603

Sollenberger, R. L., and Milgram, P. (1993). Effects of Stereoscopic and Rotational Displays in a Three-Dimensional Path-Tracing Task. Hum. Factors 35, 483–499. doi:10.1177/001872089303500306

Spark, A., Kitching, A., Esteban-Ferrer, D., Handa, A., Carr, A. R., Needham, L. M., et al. (2020). vLUME: 3D Virtual Reality for Single-Molecule Localization Microscopy. Nat. Methods 17, 1097–1099. doi:10.1038/s41592-020-0962-1

Stein, D. F., Chen, H., Vinyard, M. E., Qin, Q., Combs, R. D., Zhang, Q., et al. (2021). singlecellVR: Interactive Visualization of Single-Cell Data in Virtual Reality. Front. Genet. 12, 764170. doi:10.3389/fgene.2021.764170

Sutcliffe, A., Gault, B., and Shin, J.-E. (2005). Presence, Memory and Interaction in Virtual Environments. Int. J. Human-Computer Stud. 62, 307–327. doi:10.1016/j.ijhcs.2004.11.010

Sutherland, I. E. (1965). “The Ultimate Display,” in Proceedings of the IFIP Congress, New York, NY, May 24–29, 1965 (London: Macmillan), 506–508.

Sutherland, I. E. (1968). “A Head-Mounted Three Dimensional Display,” in Proceedings of the December 9-11, 1968, fall joint computer conference, part I on - AFIPS ’68 (Fall, part I) (New York, USA: ACM Press), 757. doi:10.1145/1476589.1476686

Tarr, M. J., and Warren, W. H. (2002). Virtual Reality in Behavioral Neuroscience and beyond. Nat. Neurosci. 5, 1089–1092. doi:10.1038/nn948

The Body VR LLC. (2016). The Body VR: Journey Inside a Cell [Video Game]. The Body VR LLC (Accessed February 4, 2022).

Todd, H., and Emsley, P. (2021). Development and Assessment of CootVR, a Virtual Reality Computer Program for Model Building. Acta Cryst. Sect D Struct. Biol. 77, 19–27. doi:10.1107/S2059798320013625

Tse, C.-M., Li, H., Leung, K.-S., Lee, K.-H., and Wong, M.-H. (2011). “Interactive Drug Design in Virtual Reality,” in 2011 15th International Conference on Information Visualisation, London, United Kingdom, July 13–15, 2011 (IEEE), 226–231. doi:10.1109/IV.2011.72

Wang, K., Yap, L. W., Gong, S., Wang, R., Wang, S. J., and Cheng, W. (2021). Nanowire‐Based Soft Wearable Human-Machine Interfaces for Future Virtual and Augmented Reality Applications. Adv. Funct. Mater. 31, 2008347. doi:10.1002/adfm.202008347

Ware, C., and Mitchell, P. (2008). Visualizing Graphs in Three Dimensions. ACM Trans. Appl. Percept. 5, 1–15. doi:10.1145/1279640.1279642

Weng, C., Otanga, S., Christianto, S. M., and Chu, R. J.-C. (2020). Enhancing Students' Biology Learning by Using Augmented Reality as a Learning Supplement. J. Educ. Comput. Res. 58, 747–770. doi:10.1177/0735633119884213

Westreich, S. T., Nattestad, M., and Meyer, C. (2020). BigTop: a Three-Dimensional Virtual Reality Tool for GWAS Visualization. BMC Bioinformatics 21, 39. doi:10.1186/s12859-020-3373-5

Wirth, W., Hartmann, T., Böcking, S., Vorderer, P., Klimmt, C., Schramm, H., et al. (2007). A Process Model of the Formation of Spatial Presence Experiences. Media Psychol. 9, 493–525. doi:10.1080/15213260701283079

Wohlgenannt, I., Simons, A., and Stieglitz, S. (2020). Virtual Reality. Bus Inf. Syst. Eng. 62, 455–461. doi:10.1007/s12599-020-00658-9

Yallapragada, V. V. B., Xu, T., Walker, S. P., Tabirca, S., and Tangney, M. (2021). Pepblock Builder VR - an Open-Source Tool for Gaming-Based Bio-Edutainment in Interactive Protein Design. Front. Bioeng. Biotechnol. 9, 674211. doi:10.3389/fbioe.2021.674211

Yang, A., Yao, Y., Fang, X., Li, J., Xia, Y., Kwok, C. S. M., et al. (2020). starmapVR: Immersive Visualisation of Single Cell Spatial Omic Data. bioRxiv [Preprint]. doi:10.1101/2020.09.01.277079

Zhang, J. F., Paciorkowski, A. R., Craig, P. A., and Cui, F. (2019). BioVR: a Platform for Virtual Reality Assisted Biological Data Integration and Visualization. BMC Bioinformatics 20, 78. doi:10.1186/s12859-019-2666-z

Keywords: virtual reality, augmented reality, visualization design, 3D, immersive 3D, multi-omics visualization, CAVE, systems biology

Citation: Turhan B and Gümüş ZH (2022) A Brave New World: Virtual Reality and Augmented Reality in Systems Biology. Front. Bioinform. 2:873478. doi: 10.3389/fbinf.2022.873478

Received: 10 February 2022; Accepted: 02 March 2022;
Published: 06 April 2022.

Edited by:

Sean O’Donoghue, Garvan Institute of Medical Research, Australia

Reviewed by:

Jan Egger, University Hospital Essen, Germany

Copyright © 2022 Turhan and Gümüş. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Zeynep H. Gümüş, zeynep.gumus@mssm.edu
