PERSPECTIVE article

Front. Bioinform., 06 April 2022

Sec. Data Visualization

Volume 2 - 2022 | https://doi.org/10.3389/fbinf.2022.873478

A Brave New World: Virtual Reality and Augmented Reality in Systems Biology

  • 1. Department of Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, United States

  • 2. Faculty of Natural Sciences and Engineering, Sabancı University, Istanbul, Turkey

  • 3. Precision Immunology Institute, Icahn School of Medicine at Mount Sinai, New York, NY, United States


Abstract

How we interact with computer graphics has changed little since the invention of the flatscreen: we still view 2D text and images. Yet recent advances in computing technology, internetworked devices, and gaming are driving the design and development of new modes of human-computer interfaces (HCIs). Virtual Reality (VR) technology uses computers and HCIs to create the feeling of immersion in a three-dimensional (3D) environment that contains interactive objects with a sense of spatial presence, where objects have a spatial location relative to, and independent of, the user. While this virtual environment does not necessarily match the real world, by creating the illusion of reality it helps users leverage the full range of human sensory capabilities. Similarly, Augmented Reality (AR) superimposes virtual images onto the real world. Because humans learn the physical world through gradual sensory familiarization, these immersive visualizations make it possible to gain familiarity with biological systems in ways not realizable in the physical world (e.g., allosteric regulatory networks within a protein or biomolecular pathways inside a cell). As VR/AR interfaces are anticipated to expand explosively in consumer markets, systems biologists will become increasingly immersed in their world. Here we introduce a brief history of VR/AR, their current roles in systems biology, and their advantages and disadvantages in augmenting user abilities. We next argue that in systems biology, VR/AR technologies will be most useful in visually exploring and communicating data, performing virtual experiments, and education/teaching. Finally, we discuss our perspective on future directions for VR/AR in systems biology.

Background

We see the world in three dimensions (3D) because we have binocular vision: our left and right eyes see slightly different views of an object, a topic explored from Euclid's 3rd-century-BC Optics and physician Galen's 2nd-century-AD On the Use of the Different Parts of the Human Body to da Vinci's (1452–1519) Trattato della Pittura (Treatise on Painting). The first device to produce 3D effects exploited binocularity by using two mirrors angled at 45° to reflect a specific image into each eye. Invented in the 1830s by Charles Wheatstone, it was called the reflecting mirror stereoscope, from the Greek stereos (solid) and skopein (to look at). Conceptually, Virtual Reality (VR) was first described by computing pioneer Sutherland (Sutherland, 1965), who created arguably the first VR head-mounted display (HMD): a large and bulky device that had to be mounted to the ceiling and could cause severe injury if it fell on a user, earning it the nickname the Sword of Damocles (Sutherland, 1968). The term Virtual Reality itself was coined later, in the 1980s, when the VPL Research company of computer scientist Jaron Lanier began producing the first commercially available VR headsets and gloves. Since then, efforts to integrate the human body naturally into the virtual experience have driven significant advances in VR and Augmented Reality (AR) human-computer interfaces (HCIs), which mainly consist of input devices, output displays, and various hardware and software components. We summarize these technologies in Table 1.

TABLE 1

INPUT TRACKERS
Track user position ((x, y, z) coordinates) and orientation (yaw, pitch, roll angles), or a subset of these, as users move about. "User position" may refer to body movements, head rotations, or gestures. Tracking information is fed back to the computer for real-time display updates (e.g., via a 3D mouse (wand) or data gloves).
Types
Trackers generally use 3D computer vision, sync pulses and laser lines, or inertial measurement units to recognize movements, gestures, and positioning for intuitive tracking.
  • Full-body tracking: e.g., Microsoft Azure Kinect DK (Redmond, WA), Sony PlayStation Camera (San Mateo, CA), OptiTrack (Corvallis, OR), Intel RealSense Depth Cameras (Santa Clara, CA), OpenCV OAK (Palo Alto, CA), HTC Vive Tracker (Taiwan)
  • Hand and arm motion tracking: e.g., Sony PlayStation Move (San Mateo, CA), Sony DualSense Controller (San Mateo, CA), UltraLeap Trackers (United Kingdom)
  • Eye and facial tracking: e.g., Tobii Face Trackers (Sweden), VIVE Facial Tracker (Taiwan)
Important Features
  • Update rate: how many times per second position and orientation estimates are sent to the computer
  • Latency: the time between recording the user's position and orientation and the corresponding display response
  • Accuracy and resolution of the measured position and orientation
  • Range: the volume within which position and orientation can be measured
  • Ease of use, size, and weight
OUTPUT DISPLAYS
Display a three-dimensional virtual world (usually stereoscopic) from the user's eye positions, creating the perception that the virtual scene is independent of user movements.
Types
  • HMD systems: high resolution (pixel density), high update rate, and low-latency head-tracking. Render only the user's view instead of the entire scene by placing small high-resolution displays directly in front of the user's eyes, creating immersion at a fraction of the cost of larger displays. e.g., HTC Vive (Taiwan); Microsoft HoloLens (Redmond, WA); Oculus Rift and Quest (Menlo Park, CA); Sony PlayStation VR (San Mateo, CA); Valve Index (Bellevue, WA); HP Reverb G2 (Palo Alto, CA)
  • Commodity 3D TVs/stereo-enabled computers with low-cost 3D glasses and software: low cost, but limited to the screen area, whereas the human eye can utilize a much larger field-of-view (FOV)
  • Spatial displays: e.g., Sony Spatial Reality Display (San Diego, CA), Acer ConceptD 7 SpatialLabs (Taiwan)
  • Mobile devices: e.g., iOS devices with ARKit (Cupertino, CA), Android devices with ARCore (Mountain View, CA)
HARDWARE AND SOFTWARE
Manage input/output; analyze incoming data; compute and render 3D graphics based on input tracker feedback.

Input tracking and output display technologies in consumer-level VR/AR systems.
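As a concrete illustration of the orientation quantities listed in Table 1, the sketch below (plain NumPy, not tied to any particular headset SDK; function and variable names are ours) composes a tracker's reported yaw, pitch, and roll angles into the rotation matrix a renderer would apply at each real-time display update:

```python
import numpy as np

def tracker_orientation_to_matrix(yaw, pitch, roll):
    """Compose yaw (about Z), pitch (about Y), and roll (about X), in radians,
    into a single 3x3 rotation matrix R = Rz @ Ry @ Rx."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

# A renderer would apply R to scene-relative vectors once per display update:
R = tracker_orientation_to_matrix(np.pi / 2, 0.0, 0.0)  # a 90-degree head turn (yaw)
forward = np.array([1.0, 0.0, 0.0])
view = R @ forward  # the user's new viewing direction, approx. (0, 1, 0)
```

The update rate and latency features in the table bound how often, and how quickly after head motion, this matrix can be recomputed and used to redraw the scene.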

Notable within these efforts, CAVE Automatic Virtual Environments (CAVEs) were introduced in the 1990s (Cruz-Neira et al., 1992), projecting stereoscopic images onto the walls and floor of a room-sized cube. A CAVE can be thought of as Star Trek's Holodeck (Chu and Quek, 2013), though the name refers to the metaphor of Plato's cave in the Republic (Plato and Shorey, 1930), where a philosopher contemplates perception, reality, and illusion. Inside a CAVE, users wear Liquid-Crystal Display (LCD) shutter glasses and a head-tracker and interact with objects using a wand-like device, gaining immersion that utilizes the full range of human vision, with a much wider field-of-view (FOV) and enhanced depth and shape perception. Later on, for massive data, large ultra-high-resolution matrices of multiple displays (either monitors or projectors), called Powerwalls, were also developed (Papadopoulos et al., 2015). Powerwalls and CAVEs are attractive as collaborative environments in which several investigators can simultaneously interact with VR objects. Yet they are expensive to build, maintain, and upgrade; occupy considerable space; and require many displays and massive processing power, making them cost-prohibitive. Thus, with only these technologies available, VR could not become popular enough to go mainstream (Wohlgenannt et al., 2020).

Until recently, VR was even considered a "dead technology" (Slater and Sanchez-Vives, 2016). However, recent technological advances, especially in gaming products such as the Oculus Rift (Menlo Park, CA), HTC VIVE (Taiwan), and PlayStation VR (San Mateo, CA), have finally given VR good-enough performance at relatively low prices, creating a positive feedback loop between companies developing more advanced technologies and expanding consumer demand, which has led to a tremendous jump in VR technology (Wohlgenannt et al., 2020; Kugler, 2021). For context, commercial VR systems in 2016 required expensive and difficult setups, including an HMD headset, controllers, and sensors connected to an external high-end graphics computer. In contrast, some current-generation VR sets, such as the Oculus Quest 2 (Menlo Park, CA) and the HTC VIVE Focus series (Taiwan), generally referred to as stand-alone or all-in-one VR systems, do not depend on any external computer at all (Wohlgenannt et al., 2020), further increasing accessibility. Coupled with very recent developments such as the popular metaverse concept, which combines multiple elements of VR/AR and internet technologies to achieve an extended reality that blends the physical and digital worlds (Lee et al., 2021), we anticipate that VR/AR technologies will finally go mainstream.

VR/AR in Systems Biology

In systems biology, we often seek to provide new insights that weave data on molecules, pathways, cells, and tissues to whole organisms, populations, and ecosystems across multiple time- and length-scales. Furthermore, new high-throughput -omics experimental techniques are producing massive and diverse multi-omics datasets (Gillette et al., 2020; Kalayci et al., 2020; Petralia et al., 2020; Marx, 2021; Satpathy et al., 2021), and detailed data capture is boosted further by advances in supercomputers and tracking sensor technologies. Parallel increases in processor speeds and data storage enable computational analyses of these data (Schadt et al., 2010). New or updated visualization technologies are needed to explore and communicate these datasets. While not new to scientific visualization (Bryson, 1996; Simpson et al., 2000), VR technologies offer novel avenues for meeting these unprecedented data-communication needs. However, much remains to be answered about when and how to use VR/AR technologies in systems biology.

To help answer these questions for different use cases, we first summarize the advantages and disadvantages of VR technologies in Table 2. Notably, their greatest advantage is in providing unparalleled presence: the sense of being inside of, and interacting with, the virtual environment (Slater and Wilbur, 1997; Schubert et al., 2001; Sutcliffe et al., 2005). We therefore recommend utilizing VR/AR technologies when spatial presence and immersive interactivity with the content make a difference in addressing user needs. In 2D, user comprehension can degrade above domain-specific thresholds of data type and size, as data begin to occlude one another. In such cases, immersion enables 3D navigation and provides the perceptive depth needed to enhance comprehension. For example, for multi-dimensional data, user studies have reported significantly better performance in immersive 3D vs. 2D environments for certain analysis tasks (Etemadpour et al., 2013). As researchers working at the interface of biology and data visualization, based on our own experience in this domain and our observations of general trends in VR applications in biology, we anticipate utilization in three main areas of systems biology: 1) visually exploring and communicating data; 2) performing virtual experiments; and 3) education/teaching. We discuss each below.

TABLE 2

ADVANTAGES
1. Unparalleled presence: the sense of being inside the virtual environment (Slater and Wilbur, 1997; Schubert et al., 2001; Sutcliffe et al., 2005). Increased by:
  a. User movements and interactions (Balakrishnan and Sundar, 2011), which enhance learning (Cummings and Bailenson, 2016) and working-memory-based performance (McKendrick et al., 2016)
  b. User tracking, stereoscopy, and wider FOV (significant increase) (Cummings and Bailenson, 2016)
  c. Overall immersion (moderate increase) (Cummings and Bailenson, 2016)
  d. Image quality and resolution (low increase) (Lee, 2004)
  e. Update rate (potentially effective but less studied)
2. Stereoscopic depth cues are beneficial in exploring complex 3D and higher-dimensional (3D+) data (Slater et al., 1996; Greffard et al., 2014; McIntire and Liggett, 2014); 2D visualizations cannot provide spatial depth cues (McIntire and Liggett, 2014). Increased peripheral vision provides a rich set of cues, permitting more natural and thus quicker interactive exploration of high-dimensional and dynamic data
3. Full user attention, as users cannot second-screen VR (Wirth et al., 2007)
4. Complete stimuli control within a safe, standardized, and reproducible environment, offering unprecedented opportunities for studies that cannot be performed in the real world (Tarr and Warren, 2002)
5. Moving beyond keyboard/mouse: supports natural modes of navigation and gestures, such as flying and grasping
6. Users can investigate multiple regions or explore correlations between data that are nearby in space-time without clutter, updated in real time based on their movements and views
7. Collaborative environment: colleagues at multiple locations can meet, make eye contact (via their avatars), and explore objects together. As big data often involves many researchers at remote locations, this can help tremendously in team efforts
DISADVANTAGES
1. HCI learning curves can pose barriers for new/occasional users (e.g., HMDs limit the ability to interact with a mouse/keyboard)
2. User isolation from the surroundings can cause a lack of situational awareness (imagine wearing an HMD on the subway). While see-through headsets can solve such problems, they are awkward to wear in public
3. Some users may experience motion sickness/nausea, though increasing peripheral vision greatly reduces these effects
4. Users may experience neck pain and stress when wearing headsets for long periods
5. Some technical and conceptual challenges remain to be addressed to improve presence:
  a. Improved distance perception (Renner et al., 2013)
  b. Quick updates with low latency
  c. Real-time delivery with high performance
  d. Rapid feedback on user actions at specific locations
6. VR is unnecessary for systems that are simple and small, where additional depth cues are neither needed, vital, nor useful

Advantages and Disadvantages of VR technologies.

Visually Exploring and Communicating Data

Systems biology visualizations are generally abstract representations, far from real-world objects. For instance, cellular pathway graphs are often cartoon representations. Yet they suffice in helping us understand the biological phenomena they represent. In VR environments, users interpret such abstractions as real objects, and the imagery has a lasting impact on our brains. Furthermore, we can interact with virtual objects in ways not possible in the real world at multiple scales, from single molecules (Leinen et al., 2015), protein-drug complexes (Norrby et al., 2015), and biomolecular networks (Liluashvili et al., 2017) to organs (Mirhosseini et al., 2014); navigate through them (Bellmund et al., 2016); or follow their dynamics (Nakano et al., 2016). VR technologies thus open the door to diverse phenomena that involve 3D spatial reconstructions, from localizations and dynamics in the human brain (Calì et al., 2016) to interactions in retinal pathology (Aaker et al., 2011) and volumetric studies in digital pathology (Farahani et al., 2016; Liimatainen et al., 2021). More recently, single-molecule localization microscopy (SMLM) in immersive VR has been used to visualize biological structures as point clouds at the molecular scale (e.g., vLUME by Spark et al., 2020). Some of these recent SMLM tools can be further extended to other similarly multidimensional, spatially localized point clouds (e.g., the Genuage tool by Blanc et al., 2020).

In systems biology, the knowledge, visualization, and exploration of related 3D structural data often play an important role. To explore and understand the function of macromolecules (proteins, RNA, and DNA) and their complexes from their 3D structural models, molecular graphics is shifting to VR (Tse et al., 2011; Nakano et al., 2016; Ratamero et al., 2018; Cassidy et al., 2020; Todd and Emsley, 2021). Some of these applications further integrate relevant structural knowledge with domain-specific (Norrby et al., 2015) or genomics datasets (Zhang et al., 2019). While some molecular viewers work in CAVE environments (Block et al., 2009), others utilize the latest HMDs (Leinen et al., 2015), game engines such as Unity, gesture devices such as Kinect and Leap Motion (Probst and Reymond, 2018; Zhang et al., 2019), or even voice recognition (Sabir et al., 2013) to activate commands; still others provide web-based VR without head-tracking or advanced interactions (Li et al., 2014; Cassidy et al., 2020). Parallel efforts are also ongoing to study the chemical fingerprints of DrugBank compounds in VR environments (Probst and Reymond, 2018).

VR is useful even when exploring abstract systems-level data that do not contain 3D localizations. For example, researchers often employ complex network visualizations, which may require many viewpoints due to clutter and navigation issues. Several studies suggest that stereoscopy alone (Ware and Mitchell, 2008; Greffard et al., 2014; Kwon et al., 2015; Kwon et al., 2016), or combined with rotation (Sollenberger and Milgram, 1993) or motion cues (Ware and Mitchell, 2008), enhances comprehension, presents graphs better than 2D displays (Sollenberger and Milgram, 1993), and keeps user error rates low (Ware and Mitchell, 2008). For example, Supplementary Figure S1 shows a relatively large network in 2D, and Supplementary Video S1 shows it in 3D. While both were generated using the network visualization tool iCAVE (Liluashvili et al., 2017; Kalayci and Gümüş, 2018), within iCAVE users can further interactively explore the 3D or stereo versions from multiple perspectives. Visualizations from multiple perspectives reportedly make different aspects of a system more salient (Ellis et al., 1991). Similarly, the BigTop tool (Westreich et al., 2020) renders Manhattan plots from genome-wide association studies (GWAS) in 3D.
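To illustrate the kind of 3D coordinates such network tools compute before rendering (a toy NumPy spring-embedder, not the actual layout algorithm of iCAVE or any tool cited here), the sketch below spreads a small graph into 3D positions that a stereoscopic viewer could then display:

```python
import numpy as np

def force_directed_3d(edges, n_nodes, iters=200, seed=0):
    """Toy 3D spring-embedder: edge springs pull neighbors together while
    pairwise repulsion pushes all nodes apart. Returns (n_nodes, 3) coords."""
    rng = np.random.default_rng(seed)
    pos = rng.standard_normal((n_nodes, 3))
    adj = np.zeros((n_nodes, n_nodes), bool)
    for u, v in edges:
        adj[u, v] = adj[v, u] = True
    k = (1.0 / n_nodes) ** (1 / 3)                      # target edge length
    for step in range(iters):
        delta = pos[:, None, :] - pos[None, :, :]       # pairwise offsets
        dist = np.linalg.norm(delta, axis=-1) + 1e-9
        rep = k**2 / dist**2                            # repulsion, all pairs
        att = np.where(adj, dist / k, 0.0)              # attraction, edges only
        force = ((rep - att)[..., None] * delta / dist[..., None]).sum(axis=1)
        pos += 0.01 * force * (1 - step / iters)        # cooling schedule
    return pos

# Small example graph: a 4-cycle plus one chord
coords = force_directed_3d([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)], n_nodes=4)
```

The point is that a 3D embedding gives the stereo display genuine depth to exploit; the extra dimension, combined with interactive rotation, is what relieves the clutter and occlusion that plague dense 2D network drawings.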

More recently, VR has been used to explore multidimensional -omics datasets in systems biology, including cytometry, transcriptomics, epigenomics, proteomics, and their combinations, in the form of abstract data clouds. For example, single-cell RNA sequencing data analysis often includes a dimensionality-reduction step, in which cell populations are projected into 2D or 3D space to explore cellular heterogeneity. Visualizing such datasets in 3D can be more informative, as it decreases the chance of collapsing similar cell types and clusters onto one another. A recent tool, CellexalVR, allows visual exploration and analysis of such dimensionality-reduction plots and associated metadata in immersive VR (Legetth et al., 2021). Other tools for the same purpose include starmapVR (Yang et al., 2020), singlecellVR (Stein et al., 2021), and Theia (Bressan et al., 2021). These platforms often include additional modalities, such as on-the-fly clustering or visualization of dynamic changes in RNA velocity. While singlecellVR and starmapVR are web applications that enable visualization with low-cost, easily available VR hardware such as Google Cardboard (Yang et al., 2020; Stein et al., 2021), CellexalVR offers GPU-accelerated performance and in-session, on-the-fly calculations. StarmapVR further enables simultaneous visualization of spatial transcriptomics data alongside matching histological images. We anticipate that in the near future we will witness further developments in VR tools for the visual exploration of spatial transcriptomics datasets.
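As a minimal illustration of the dimensionality-reduction step these tools build on (a plain NumPy PCA on synthetic counts; this is not the pipeline of CellexalVR or any other cited tool, and the function names are ours), the sketch below projects a cells-by-genes expression matrix down to one 3D point per cell, i.e., the data cloud a VR viewer would render:

```python
import numpy as np

def project_cells_3d(expr):
    """Project a cells x genes expression matrix onto its first three
    principal components, yielding one 3D point per cell for display."""
    X = np.log1p(expr)          # simple variance-stabilizing transform
    X = X - X.mean(axis=0)      # center each gene
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:3].T         # (n_cells, 3) embedding

# Synthetic demo: two "cell populations" with shifted expression levels
rng = np.random.default_rng(1)
expr = np.vstack([rng.poisson(2.0, (50, 100)),
                  rng.poisson(8.0, (50, 100))]).astype(float)
cloud = project_cells_3d(expr)  # 100 cells as 3D points
```

In this toy example the two populations separate along the first principal component; retaining a third component is exactly what gives an immersive viewer extra room to keep similar clusters from collapsing onto each other.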

Performing Virtual Experiments

For research studies that cannot be performed in the real world, VR provides a safe, standardized, and reproducible environment that is as life-like as possible (Tarr and Warren, 2002). In addition, we can break the laws of optics and physics, or disconnect the real world sensed by the user's body from the world the user is experiencing (Tarr and Warren, 2002). Researchers have been using such VR properties to study, modify, or enhance behavior. For example, research into the neural processes linking biology and behavior in different species, from insects to humans (Ravassard et al., 2013; Aghajan et al., 2015; Acharya et al., 2016), has used VR to help understand the sensory cues that carry information about virtual worlds, or to refine the rules linking a subject's actions to changes in their world. Beyond assisting in understanding human behavior, such research can in turn inform how VR environments should be designed for increased human engagement. For example, studies suggest that the socially networked nature of VR should be considered in tool design, as the number of remote users in virtual spaces increases (Kruzan and Won, 2019; Jeong et al., 2021).

In systems biology, virtual experiments can help develop and improve scientific thinking skills. Virtual reformulations of experiments as games to be solved have already proven effective in tackling scientific problems. For example, the tools Foldit (Foldit, 2022) and Eterna (Eterna, 2022) have gamified the protein folding and RNA structure prediction problems (Das et al., 2019), enabling more individuals to perform virtual experiments by engaging the online gaming community, whose members may have little to no scientific background. Yet these gamers have successfully solved real-world problems on relatively short time scales (Cooper et al., 2010; Khatib et al., 2011; Eiben et al., 2012; Horowitz et al., 2016; Koepnick et al., 2019). Similarly, the tool EyeWire has gamified the mapping of neural circuits in the brain to understand vision: players virtually map 3D neuron structures onto serial electron microscopy images of animal brains (Das et al., 2019; EyeWire, 2022). This game has so far attracted more than 150 K gamers, who have helped reveal six new neuron types and many previously undiscovered brain circuits (EyeWire Into the Brain, 2022). VR environments open exciting new possibilities for such gamification of virtual scientific problems, both for scientists with little coding experience and for non-scientists, and for communication between these two communities. We have already started to see the first such tools in computational chemistry: Shannon et al. introduced a molecular dynamics VR game that encourages users to explore the reactivity of a specific chemical system (Shannon et al., 2021), and Kingsley et al. presented an intuitive VR platform called Nanome, in which users collaboratively explore and modify chemical structures to work on structural biology and molecular drug design problems (Kingsley et al., 2019). The next several years will witness an increasing number of tools that gamify virtual experiments in VR environments. Note that in online gaming communities, VR is increasingly blended with social media functionality; gamified systems biology VR tools will therefore likely need to incorporate such additional functionality, which is critical for remote users.

Education/Teaching

VR environments can help learning in systems biology areas that involve complex 3D information (Salzman et al., 1999; Mikropoulos and Natsis, 2011), user-interactivity and/or high computational skills. For example, understanding protein structure traditionally involves physical modeling kits or projections of the 3D structures into 2D. However, the ability to create, alter, and rotate a chemical structure in real time in 3D can make it easier to understand abstract concepts (Limniou et al., 2008). Similarly, annotated 3D web-based anatomy atlases help teach complex structures such as artery networks or bronchial trees. Earlier interactive 3D-renderings of these systems used desktops with standard screens due to the costs of VR processors and displays (Li et al., 2012), while later technologies have enabled stereoscopic immersive 3D with real-time interactivity (Kockro et al., 2007). Randomized user studies show that stereo-enhanced 3D-tools are useful in learning anatomy and are well-received by students (Kockro et al., 2015; de Faria et al., 2016). Similarly, integrating AR technology reportedly has positive impact on student learning in biology (Weng et al., 2020).

Recent technological advances in VR have substantially increased its potential utility in learning. Biological concepts currently constitute ∼5% of academic publications on VR (Morimoto and Ponton, 2021). Relatively popular educational platforms include those that simulate biological and chemical experiments within VR environments, such as VRLab Academy (United Kingdom), Labster (Denmark), and ClassVR (United Kingdom). Advances in gaming have expanded VR applications in education as well. For example, the tool Peppy provides a Unity-based VR gaming engine for understanding protein structures and their dynamics in undergraduate biochemistry classes (Doak et al., 2020). Similarly, the Pepblock Builder VR tool provides a gamified interface for researchers who lack the advanced computational skills required for protein design (Yallapragada et al., 2021). Many educational VR experiences on popular gaming platforms recreate biological systems, such as The Body VR, where players move through the bloodstream to observe human cells and learn how organelles function (The Body VR LLC, 2016), and InCell VR, where players race to stop a virus invasion in a human while learning about cell and organelle microstructures (Luden.io, 2015). We anticipate that gaming-based VR tools will similarly be developed for learning about multi-omics datasets at a systems level. Further research will then be necessary to understand the full potential impact of VR on learning systems biology.

Discussion

The recent explosion in VR/AR technologies has coincided with extended work-from-home practices due to the coronavirus disease 2019 global pandemic. These developments have lowered resistance to virtual technologies and, in fact, created a pressing need for virtual, collaborative workspaces in research. Coupled with the explosive increase in datasets collected from multi-omics high-throughput experiments, VR/AR technologies offer attractive new opportunities for visual data exploration and communication in systems biology. However, to develop the most useful tools, systems biologists need to conduct their own user studies and become familiar with design practices for improved human perception within virtual spaces (Cleveland and McGill, 1987). With a deeper understanding of the brain and visual perception, content creation and best practices will be established, and adoption will increase. Further technological improvements (higher frame rates; efficient information storage and rendering; increased data transfer at lower bit rates; game engines; graphics cards) will aid challenging visualizations such as dynamic networks or multi-scale systems, integrated with data annotations and on-the-fly calculations. Visualization outcomes will in turn guide future research protocols.

VR application development is already easier with HMDs and input devices that offer game-engine-compatible free software development kits. These render information from internetworked devices that collect and exchange data through sensors and network connections (Rose, 2014; Akyildiz et al., 2015; Open Hybrid, 2022) or from integrating multiple technologies (e.g., healthcare in cyberspace (Rosen et al., 2016)). We anticipate that new technologies will further eliminate the discomforts and limitations of modern VR headsets, such as their bulkiness and weight. Toward this end, current studies include skin sensory interfaces, such as nanowire-based soft wearable human-machine interfaces (Wang et al., 2021), and thin, lightweight holographic optics with high-performance, sunglasses-like, near-eye full-color displays, as developed by Facebook Reality Labs (Maimone and Wang, 2020). Such new technologies may further remove the barriers between the virtual and real worlds by eliminating headset use, thereby converging VR and AR (Kugler, 2021).

In summary, we are currently at a critical juncture for VR/AR use in systems biology, as these technologies are finally on the verge of going mainstream. We anticipate that the current trend toward low-cost VR/AR systems will continue. Still, for certain research areas, interactive 3D applications in Web3D will likely suffice. For some applications, AR will be preferable, as it allows users to overlay virtual data onto the real world with relatively simple setups such as smartphones, without the need for HMDs, and gives them more control of their surroundings (Garcia-Bonete et al., 2019). VR will likely remain advantageous in applications that require better immersion and realism (Garcia-Bonete et al., 2019). Barriers to accessing VR/AR visualizations in systems biology will likely remain for researchers from underdeveloped countries, however, and these will need to be addressed. At the same time, with the increasing trend toward gamification within VR environments (Shannon et al., 2021), barriers will decrease for scientists with limited computational expertise and for non-scientists conducting their own virtual experiments. As the user community grows and commercial VR/AR technologies expand, we expect the range of systems biology applications to continue growing, opening possibilities for significant advances in understanding and communicating disease-associated mechanisms, running virtual experiments, and education, and helping boost the development of new therapies. Of course, the best way to gauge the possibilities is to explore them!

Statements

Data availability statement

The original contributions presented in the study are included in the article/Supplementary Material, further inquiries can be directed to the corresponding author.

Author contributions

ZG conceptualized the study, ZG and BT wrote the original draft, edited, and revised the manuscript, ZG was responsible for supervision, project administration, and funding acquisition.

Funding

This study was supported by the Concern Foundation Conquer Cancer Now award and Cancer Moonshot R33 award # CA263705-01 to ZG.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fbinf.2022.873478/full#supplementary-material

References

1. Aaker, G. D., Gracia, L., Myung, J. S., Borcherding, V., Banfelder, J. R., D'Amico, D. J., et al. (2011). Volumetric Three-Dimensional Reconstruction and Segmentation of Spectral-Domain OCT. Ophthalmic Surg. Lasers Imaging 42, S116–S120. doi: 10.3928/15428877-20110627-11

2. Acharya, L., Aghajan, Z. M., Vuong, C., Moore, J. J., Mehta, M. R. (2016). Causal Influence of Visual Cues on Hippocampal Directional Selectivity. Cell 164, 197–207. doi: 10.1016/j.cell.2015.12.015

3. Aghajan, Z. M., Acharya, L., Moore, J. J., Cushman, J. D., Vuong, C., Mehta, M. R. (2015). Impaired Spatial Selectivity and Intact Phase Precession in Two-Dimensional Virtual Reality. Nat. Neurosci. 18, 121–128. doi: 10.1038/nn.3884

4. Akyildiz, I., Pierobon, M., Balasubramaniam, S., Koucheryavy, Y. (2015). The Internet of Bio-Nano Things. IEEE Commun. Mag. 53, 32–40. doi: 10.1109/MCOM.2015.7060516

5. Balakrishnan, B., Sundar, S. S. (2011). Where Am I? How Can I Get There? Impact of Navigability and Narrative Transportation on Spatial Presence. Human–Computer Interaction 26, 161–204. doi: 10.1080/07370024.2011.601689

6. Bellmund, J. L., Deuker, L., Navarro Schröder, T., Doeller, C. F. (2016). Grid-cell Representations in Mental Simulation. eLife 5, e17089. doi: 10.7554/eLife.17089

7. Blanc, T., el Beheiry, M., Caporal, C., Masson, J. B., Hajj, B. (2020). Genuage: Visualize and Analyze Multidimensional Single-Molecule Point Cloud Data in Virtual Reality. Nat. Methods 17, 1100–1102. doi: 10.1038/s41592-020-0946-1

8. Block, J. N., Zielinski, D. J., Chen, V. B., Davis, I. W., Vinson, E. C., Brady, R., et al. (2009). KinImmerse: Macromolecular VR for NMR Ensembles. Source Code Biol. Med. 4, 3. doi: 10.1186/1751-0473-4-3

9. Bressan, D., Mulvey, C. M., Qosaj, F., Becker, R., Grimaldi, F., Coffey, S., et al. (2021). Exploration and Analysis of Molecularly Annotated, 3D Models of Breast Cancer at Single-Cell Resolution Using Virtual Reality. bioRxiv [Preprint]. doi: 10.1101/2021.06.28.448342

10. Bryson, S. (1996). Virtual Reality in Scientific Visualization. Commun. ACM 39, 62–71. doi: 10.1145/229459.229467

11. Calì, C., Baghabra, J., Boges, D. J., Holst, G. R., Kreshuk, A., Hamprecht, F. A., et al. (2016). Three-Dimensional Immersive Virtual Reality for Studying Cellular Compartments in 3D Models from EM Preparations of Neural Tissues. J. Comp. Neurol. 524, 23–38. doi: 10.1002/cne.23852

12. Cassidy, K. C., Šefčík, J., Raghav, Y., Chang, A., Durrant, J. D. (2020). ProteinVR: Web-Based Molecular Visualization in Virtual Reality. PLoS Comput. Biol. 16, e1007747. doi: 10.1371/journal.pcbi.1007747

13. Chu, S. L., Quek, F. (2013). "Information Holodeck: Thinking in Technology Ecologies," in Human-Computer Interaction – INTERACT 2013, Lecture Notes in Computer Science, Vol. 8117. Editors P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, M. Winckler (Berlin, Heidelberg: Springer), 167–184. doi: 10.1007/978-3-642-40483-2_12

14. Cleveland, W. S., McGill, R. (1987). Graphical Perception: The Visual Decoding of Quantitative Information on Graphical Displays of Data. J. R. Stat. Soc. Ser. A (General) 150, 192–229. doi: 10.2307/2981473

15. Cooper, S., Khatib, F., Treuille, A., Barbero, J., Lee, J., Beenen, M., et al. (2010). Predicting Protein Structures with a Multiplayer Online Game. Nature 466, 756–760. doi: 10.1038/nature09304

16. Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., Hart, J. C. (1992). The CAVE: Audio Visual Experience Automatic Virtual Environment. Commun. ACM 35, 64–72. doi: 10.1145/129888.129892

17. Cummings, J. J., Bailenson, J. N. (2016). How Immersive Is Enough? A Meta-Analysis of the Effect of Immersive Technology on User Presence. Media Psychol. 19, 272–309. doi: 10.1080/15213269.2015.1015740

18. Das, R., Keep, B., Washington, P., Riedel-Kruse, I. H. (2019). Scientific Discovery Games for Biomedical Research. Annu. Rev. Biomed. Data Sci. 2, 253–279. doi: 10.1146/annurev-biodatasci-072018-021139

19. de Faria, J. W., Teixeira, M. J., de Moura Sousa Júnior, L., Otoch, J. P., Figueiredo, E. G. (2016). Virtual and Stereoscopic Anatomy: When Virtual Reality Meets Medical Education. J. Neurosurg. 125, 1105–1111. doi: 10.3171/2015.8.JNS141563

20. Doak, D. G., Denyer, G. S., Gerrard, J. A., Mackay, J. P., Allison, J. R. (2020). Peppy: A Virtual Reality Environment for Exploring the Principles of Polypeptide Structure. Protein Sci. 29, 157–168. doi: 10.1002/pro.3752

21. Eiben, C. B., Siegel, J. B., Bale, J. B., Cooper, S., Khatib, F., Shen, B. W., et al. (2012). Increased Diels-Alderase Activity through Backbone Remodeling Guided by Foldit Players. Nat. Biotechnol. 30, 190–192. doi: 10.1038/nbt.2109

22. Ellis, S. R., Tharp, G. K., Grunwald, A. J., Smith, S. (1991). Exocentric Judgements in Real Environments and Stereoscopic Displays. Proc. Hum. Factors Soc. Annu. Meet. 35, 1442–1446. doi: 10.1177/154193129103502005

23. Etemadpour, R., Monson, E., Linsen, L. (2013). "The Effect of Stereoscopic Immersive Environments on Projection-Based Multi-Dimensional Data Visualization," in 2013 17th International Conference on Information Visualisation, London, United Kingdom, July 16–18, 2013 (IEEE), 389–397. doi: 10.1109/IV.2013.51

24. Eterna (2022). Eterna: Solve Puzzles, Invent Medicine. Available at: https://eternagame.org (Accessed February 4, 2022).

25. EyeWire (2022). EyeWire: A Game to Map the Brain. Available at: https://eyewire.org (Accessed February 4, 2022).

26. EyeWire (2022). Into the Brain: About EyeWire. Available at: https://science.eyewire.org/about (Accessed February 4, 2022).

27. Farahani, N., Post, R., Duboy, J., Ahmed, I., Kolowitz, B. J., Krinchai, T., et al. (2016). Exploring Virtual Reality Technology and the Oculus Rift for the Examination of Digital Pathology Slides. J. Pathol. Inform. 7, 22. doi: 10.4103/2153-3539.181766

28. Foldit (2022). Foldit: Solve Puzzles for Science. Available at: https://fold.it (Accessed February 4, 2022).

29. Garcia-Bonete, M. J., Jensen, M., Katona, G. (2019). A Practical Guide to Developing Virtual and Augmented Reality Exercises for Teaching Structural Biology. Biochem. Mol. Biol. Educ. 47, 16–24. doi: 10.1002/bmb.21188

30. Gillette, M. A., Satpathy, S., Cao, S., Dhanasekaran, S. M., Vasaikar, S. V., Krug, K., et al. (2020). Proteogenomic Characterization Reveals Therapeutic Vulnerabilities in Lung Adenocarcinoma. Cell 182, 200–225.e35. doi: 10.1016/j.cell.2020.06.013

31. Greffard, N., Picarougne, F., Kuntz, P. (2014). "Beyond the Classical Monoscopic 3D in Graph Analytics: An Experimental Study of the Impact of Stereoscopy," in 2014 IEEE VIS International Workshop on 3DVis (3DVis), Paris, France, November 9, 2014 (IEEE), 19–24. doi: 10.1109/3DVis.2014.7160095

32. Horowitz, S., Koepnick, B., Martin, R., Tymieniecki, A., Winburn, A. A., Cooper, S., et al. (2016). Determining Crystal Structures through Crowdsourcing and Coursework. Nat. Commun. 7, 12549. doi: 10.1038/ncomms12549

33. Jeong, D. C., Kim, S. S. Y., Xu, J. J., Miller, L. C. (2021). Protean Kinematics: A Blended Model of VR Physics. Front. Psychol. 12, 705170. doi: 10.3389/fpsyg.2021.705170

34. Kalayci, S., Gümüş, Z. H. (2018). Exploring Biological Networks in 3D, Stereoscopic 3D, and Immersive 3D with iCAVE. Curr. Protoc. Bioinformatics 61, 8.26.1–8.27.26. doi: 10.1002/cpbi.47

35. Kalayci, S., Petralia, F., Wang, P., Gümüş, Z. H. (2020). ProNetView-ccRCC: A Web-Based Portal to Interactively Explore Clear Cell Renal Cell Carcinoma Proteogenomics Networks. Proteomics 20, e2000043. doi: 10.1002/pmic.202000043

36. Khatib, F., DiMaio, F., Cooper, S., Kazmierczyk, M., et al. (2011). Crystal Structure of a Monomeric Retroviral Protease Solved by Protein Folding Game Players. Nat. Struct. Mol. Biol. 18, 1175–1177. doi: 10.1038/nsmb.2119

37. Kingsley, L. J., Brunet, V., Lelais, G., McCloskey, S., Milliken, K., Leija, E., et al. (2019). Development of a Virtual Reality Platform for Effective Communication of Structural Data in Drug Discovery. J. Mol. Graph. Model. 89, 234–241. doi: 10.1016/j.jmgm.2019.03.010

38. Kockro, R. A., Amaxopoulou, C., Killeen, T., Wagner, W., Reisch, R., Schwandt, E., et al. (2015). Stereoscopic Neuroanatomy Lectures Using a Three-Dimensional Virtual Reality Environment. Ann. Anat. 201, 91–98. doi: 10.1016/j.aanat.2015.05.006

39. Kockro, R. A., Stadie, A., Schwandt, E., Reisch, R., Charalampaki, C., Ng, I., et al. (2007). A Collaborative Virtual Reality Environment for Neurosurgical Planning and Training. Neurosurgery 61, 379–391. doi: 10.1227/01.neu.0000303997.12645.26

40. Koepnick, B., Flatten, J., Husain, T., Ford, A., Silva, D. A., Bick, M. J., et al. (2019). De Novo Protein Design by Citizen Scientists. Nature 570, 390–394. doi: 10.1038/s41586-019-1274-4

41. Kruzan, K. P., Won, A. S. (2019). Embodied Well-Being through Two Media Technologies: Virtual Reality and Social Media. New Media Soc. 21, 1734–1749. doi: 10.1177/1461444819829873

42. Kugler, L. (2021). The State of Virtual Reality Hardware. Commun. ACM 64, 15–16. doi: 10.1145/3441290

43. Kwon, O.-H., Muelder, C., Lee, K., Ma, K.-L. (2015). "Spherical Layout and Rendering Methods for Immersive Graph Visualization," in 2015 IEEE Pacific Visualization Symposium (PacificVis), Hangzhou, China, April 14–17, 2015 (IEEE), 63–67. doi: 10.1109/PACIFICVIS.2015.7156357

44. Kwon, O.-H., Muelder, C., Lee, K., Ma, K.-L. (2016). A Study of Layout, Rendering, and Interaction Methods for Immersive Graph Visualization. IEEE Trans. Vis. Comput. Graph. 22, 1802–1815. doi: 10.1109/TVCG.2016.2520921

45. Lee, K. M. (2004). Presence, Explicated. Commun. Theor. 14, 27–50. doi: 10.1111/j.1468-2885.2004.tb00302.x

46. Lee, L.-H., Braud, T., Zhou, P., Wang, L., Xu, D., Lin, Z., et al. (2021). All One Needs to Know about Metaverse: A Complete Survey on Technological Singularity, Virtual Ecosystem, and Research Agenda. arXiv [Preprint]. Available at: https://arxiv.org/abs/2110.05352.

47. Legetth, O., Rodhe, J., Lang, S., Dhapola, P., Wallergård, M., Soneji, S. (2021). CellexalVR: A Virtual Reality Platform to Visualize and Analyze Single-Cell Omics Data. iScience 24, 103251. doi: 10.1016/j.isci.2021.103251

48. Leinen, P., Green, M. F., Esat, T., Wagner, C., Tautz, F. S., Temirov, R. (2015). Virtual Reality Visual Feedback for Hand-Controlled Scanning Probe Microscopy Manipulation of Single Molecules. Beilstein J. Nanotechnol. 6, 2148–2153. doi: 10.3762/bjnano.6.220

49. Li, H., Leung, K. S., Nakane, T., Wong, M. H. (2014). iview: An Interactive WebGL Visualizer for Protein-Ligand Complex. BMC Bioinformatics 15, 56. doi: 10.1186/1471-2105-15-56

50. Li, J., Nie, L., Li, Z., Lin, L., Tang, L., Ouyang, J. (2012). Maximizing Modern Distribution of Complex Anatomical Spatial Information: 3D Reconstruction and Rapid Prototype Production of Anatomical Corrosion Casts of Human Specimens. Anat. Sci. Educ. 5, 330–339. doi: 10.1002/ase.1287

51. Liimatainen, K., Latonen, L., Valkonen, M., Kartasalo, K., Ruusuvuori, P. (2021). Virtual Reality for 3D Histology: Multi-Scale Visualization of Organs with Interactive Feature Exploration. BMC Cancer 21, 1133. doi: 10.1186/s12885-021-08542-9

52. Liluashvili, V., Kalayci, S., Fluder, E., Wilson, M., Gabow, A., Gümüş, Z. H. (2017). iCAVE: An Open Source Tool for Visualizing Biomolecular Networks in 3D, Stereoscopic 3D and Immersive 3D. GigaScience 6, 1–13. doi: 10.1093/gigascience/gix054

53. Limniou, M., Roberts, D., Papadopoulos, N. (2008). Full Immersive Virtual Environment CAVE™ in Chemistry Education. Comput. Educ. 51, 584–593. doi: 10.1016/j.compedu.2007.06.014

54. Luden.io (2015). InCell VR [Video Game]. Luden.io (Accessed February 4, 2022).

55. Maimone, A., Wang, J. (2020). Holographic Optics for Thin and Lightweight Virtual Reality. ACM Trans. Graph. 39, 67:1–67:14. doi: 10.1145/3386569.3392416

56. Marx, V. (2021). Method of the Year: Spatially Resolved Transcriptomics. Nat. Methods 18, 9–14. doi: 10.1038/s41592-020-01033-y

57. McIntire, J. P., Liggett, K. K. (2014). "The (Possible) Utility of Stereoscopic 3D Displays for Information Visualization: The Good, the Bad, and the Ugly," in 2014 IEEE VIS International Workshop on 3DVis (3DVis), Paris, France, November 9, 2014 (IEEE), 1–9. doi: 10.1109/3DVis.2014.7160093

58. McKendrick, R., Parasuraman, R., Murtza, R., Formwalt, A., Baccus, W., Paczynski, M., et al. (2016). Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy. Front. Hum. Neurosci. 10, 216. doi: 10.3389/fnhum.2016.00216

59. Mikropoulos, T. A., Natsis, A. (2011). Educational Virtual Environments: A Ten-Year Review of Empirical Research (1999-2009). Comput. Educ. 56, 769–780. doi: 10.1016/j.compedu.2010.10.020

60. Mirhosseini, K., Sun, Q., Gurijala, K. C., Laha, B., Kaufman, A. E. (2014). "Benefits of 3D Immersion for Virtual Colonoscopy," in 2014 IEEE VIS International Workshop on 3DVis (3DVis) (IEEE), 75–79. doi: 10.1109/3DVis.2014.7160105

61. Morimoto, J., Ponton, F. (2021). Virtual Reality in Biology: Could We Become Virtual Naturalists? Evo. Edu. Outreach 14, 7. doi: 10.1186/s12052-021-00147-x

62. Nakano, C. M., Moen, E., Byun, H. S., Ma, H., Newman, B., McDowell, A., et al. (2016). iBET: Immersive Visualization of Biological Electron-Transfer Dynamics. J. Mol. Graph. Model. 65, 94–99. doi: 10.1016/j.jmgm.2016.02.009

63. Norrby, M., Grebner, C., Eriksson, J., Boström, J. (2015). Molecular Rift: Virtual Reality for Drug Designers. J. Chem. Inf. Model. 55, 2475–2484. doi: 10.1021/acs.jcim.5b00544

64. Open Hybrid (2022). Open Hybrid. Available at: http://openhybrid.org (Accessed February 4, 2022).

65. Papadopoulos, C., Petkov, K., Kaufman, A. E., Mueller, K. (2015). The Reality Deck - an Immersive Gigapixel Display. IEEE Comput. Graph. Appl. 35, 33–45. doi: 10.1109/MCG.2014.80

66. Petralia, F., Tignor, N., Reva, B., Koptyra, M., Chowdhury, S., Rykunov, D., et al. (2020). Integrated Proteogenomic Characterization across Major Histological Types of Pediatric Brain Cancer. Cell 183, 1962–1985.e31. doi: 10.1016/j.cell.2020.10.044

67. Plato (1930). The Republic. With an English Translation by Paul Shorey. London: W. Heinemann.

68. Probst, D., Reymond, J. L. (2018). Exploring DrugBank in Virtual Reality Chemical Space. J. Chem. Inf. Model. 58, 1731–1735. doi: 10.1021/acs.jcim.8b00402

69. Ratamero, E. M., Bellini, D., Dowson, C. G., Römer, R. A. (2018). Touching Proteins with Virtual Bare Hands: Visualizing Protein-Drug Complexes and Their Dynamics in Self-Made Virtual Reality Using Gaming Hardware. J. Comput. Aided Mol. Des. 32, 703–709. doi: 10.1007/s10822-018-0123-0

70. Ravassard, P., Kees, A., Willers, B., Ho, D., Aharoni, D. A., Cushman, J., et al. (2013). Multisensory Control of Hippocampal Spatiotemporal Selectivity. Science 340, 1342–1346. doi: 10.1126/science.1232655

71. Renner, R. S., Velichkovsky, B. M., Helmert, J. R. (2013). The Perception of Egocentric Distances in Virtual Environments - A Review. ACM Comput. Surv. 46, 1–40. doi: 10.1145/2543581.2543590

72. Rose, D. (2014). Enchanted Objects: Design, Human Desire and the Internet of Things. New York: Scribner.

73. Rosen, J. M., Kun, L., Mosher, R. E., Grigg, E., Merrell, R. C., Macedonia, C., et al. (2016). Cybercare 2.0: Meeting the Challenge of the Global Burden of Disease in 2030. Health Technol. (Berl.) 6, 35–51. doi: 10.1007/s12553-016-0132-8

74. Sabir, K., Stolte, C., Tabor, B., O'Donoghue, S. I. (2013). "The Molecular Control Toolkit: Controlling 3D Molecular Graphics via Gesture and Voice," in 2013 IEEE Symposium on Biological Data Visualization (BioVis), Atlanta, GA, October 13–14, 2013 (IEEE), 49–56. doi: 10.1109/BioVis.2013.6664346

75. Salzman, M. C., Dede, C., Loftin, R. B., Chen, J. (1999). A Model for Understanding How Virtual Reality Aids Complex Conceptual Learning. Presence: Teleoperators & Virtual Environments 8, 293–316. doi: 10.1162/105474699566242

76. Satpathy, S., Krug, K., Jean Beltran, P. M., Savage, S. R., Petralia, F., Kumar-Sinha, C., et al. (2021). A Proteogenomic Portrait of Lung Squamous Cell Carcinoma. Cell 184, 4348–4371.e40. doi: 10.1016/j.cell.2021.07.016

77. Schadt, E. E., Linderman, M. D., Sorenson, J., Lee, L., Nolan, G. P. (2010). Computational Solutions to Large-Scale Data Management and Analysis. Nat. Rev. Genet. 11, 647–657. doi: 10.1038/nrg2857

78. Schubert, T., Friedmann, F., Regenbrecht, H. (2001). The Experience of Presence: Factor Analytic Insights. Presence: Teleoperators & Virtual Environments 10, 266–281. doi: 10.1162/105474601300343603

79. Shannon, R. J., Deeks, H. M., Burfoot, E., Clark, E., Jones, A. J., Mulholland, A. J., et al. (2021). Exploring Human-Guided Strategies for Reaction Network Exploration: Interactive Molecular Dynamics in Virtual Reality as a Tool for Citizen Scientists. J. Chem. Phys. 155, 154106. doi: 10.1063/5.0062517

80. Simpson, R. M., LaViola, J. J., Laidlaw, D. H., Forsberg, A. S., van Dam, A. (2000). Immersive VR for Scientific Visualization: A Progress Report. IEEE Comput. Graph. Appl. 20, 26–52. doi: 10.1109/38.888006

81. Slater, M., Linakis, V., Usoh, M., Kooper, R. (1996). "Immersion, Presence and Performance in Virtual Environments," in Proceedings of the ACM Symposium on Virtual Reality Software and Technology - VRST '96 (New York, NY: ACM Press), 163–172. doi: 10.1145/3304181.3304216

82. Slater, M., Sanchez-Vives, M. V. (2016). Enhancing Our Lives with Immersive Virtual Reality. Front. Robot. AI 3, 74. doi: 10.3389/frobt.2016.00074

83. Slater, M., Wilbur, S. (1997). A Framework for Immersive Virtual Environments (FIVE): Speculations on the Role of Presence in Virtual Environments. Presence: Teleoperators & Virtual Environments 6, 603–616. doi: 10.1162/pres.1997.6.6.603

84. Sollenberger, R. L., Milgram, P. (1993). Effects of Stereoscopic and Rotational Displays in a Three-Dimensional Path-Tracing Task. Hum. Factors 35, 483–499. doi: 10.1177/001872089303500306

85. Spark, A., Kitching, A., Esteban-Ferrer, D., Handa, A., Carr, A. R., Needham, L. M., et al. (2020). vLUME: 3D Virtual Reality for Single-Molecule Localization Microscopy. Nat. Methods 17, 1097–1099. doi: 10.1038/s41592-020-0962-1

86. Stein, D. F., Chen, H., Vinyard, M. E., Qin, Q., Combs, R. D., Zhang, Q., et al. (2021). singlecellVR: Interactive Visualization of Single-Cell Data in Virtual Reality. Front. Genet. 12, 764170. doi: 10.3389/fgene.2021.764170

87. Sutcliffe, A., Gault, B., Shin, J.-E. (2005). Presence, Memory and Interaction in Virtual Environments. Int. J. Human-Computer Stud. 62, 307–327. doi: 10.1016/j.ijhcs.2004.11.010

88. Sutherland, I. E. (1965). "The Ultimate Display," in Proceedings of the IFIP Congress, New York, NY, May 24–29, 1965 (London: Macmillan), 506–508.

89. Sutherland, I. E. (1968). "A Head-Mounted Three Dimensional Display," in Proceedings of the December 9-11, 1968, Fall Joint Computer Conference, Part I - AFIPS '68 (New York, NY: ACM Press), 757. doi: 10.1145/1476589.1476686

90. Tarr, M. J., Warren, W. H. (2002). Virtual Reality in Behavioral Neuroscience and Beyond. Nat. Neurosci. 5, 1089–1092. doi: 10.1038/nn948

91. The Body VR LLC (2016). The Body VR: Journey Inside a Cell [Video Game]. The Body VR LLC (Accessed February 4, 2022).

92. Todd, H., Emsley, P. (2021). Development and Assessment of CootVR, a Virtual Reality Computer Program for Model Building. Acta Cryst. Sect. D Struct. Biol. 77, 19–27. doi: 10.1107/S2059798320013625

93. Tse, C.-M., Li, H., Leung, K.-S., Lee, K.-H., Wong, M.-H. (2011). "Interactive Drug Design in Virtual Reality," in 2011 15th International Conference on Information Visualisation, London, United Kingdom, July 13–15, 2011 (IEEE), 226–231. doi: 10.1109/IV.2011.72

94. Wang, K., Yap, L. W., Gong, S., Wang, R., Wang, S. J., Cheng, W. (2021). Nanowire-Based Soft Wearable Human-Machine Interfaces for Future Virtual and Augmented Reality Applications. Adv. Funct. Mater. 31, 2008347. doi: 10.1002/adfm.202008347

95. Ware, C., Mitchell, P. (2008). Visualizing Graphs in Three Dimensions. ACM Trans. Appl. Percept. 5, 1–15. doi: 10.1145/1279640.1279642

96. Weng, C., Otanga, S., Christianto, S. M., Chu, R. J.-C. (2020). Enhancing Students' Biology Learning by Using Augmented Reality as a Learning Supplement. J. Educ. Comput. Res. 58, 747–770. doi: 10.1177/0735633119884213

97. Westreich, S. T., Nattestad, M., Meyer, C. (2020). BigTop: A Three-Dimensional Virtual Reality Tool for GWAS Visualization. BMC Bioinformatics 21, 39. doi: 10.1186/s12859-020-3373-5

98. Wirth, W., Hartmann, T., Böcking, S., Vorderer, P., Klimmt, C., Schramm, H., et al. (2007). A Process Model of the Formation of Spatial Presence Experiences. Media Psychol. 9, 493–525. doi: 10.1080/15213260701283079

99. Wohlgenannt, I., Simons, A., Stieglitz, S. (2020). Virtual Reality. Bus. Inf. Syst. Eng. 62, 455–461. doi: 10.1007/s12599-020-00658-9

100. Yallapragada, V. V. B., Xu, T., Walker, S. P., Tabirca, S., Tangney, M. (2021). Pepblock Builder VR - an Open-Source Tool for Gaming-Based Bio-Edutainment in Interactive Protein Design. Front. Bioeng. Biotechnol. 9, 674211. doi: 10.3389/fbioe.2021.674211

101. Yang, A., Yao, Y., Fang, X., Li, J., Xia, Y., Kwok, C. S. M., et al. (2020). starmapVR: Immersive Visualisation of Single Cell Spatial Omic Data. bioRxiv [Preprint]. doi: 10.1101/2020.09.01.277079

102. Zhang, J. F., Paciorkowski, A. R., Craig, P. A., Cui, F. (2019). BioVR: A Platform for Virtual Reality Assisted Biological Data Integration and Visualization. BMC Bioinformatics 20, 78. doi: 10.1186/s12859-019-2666-z

Keywords

virtual reality, augmented reality, visualization design, 3D, immersive 3D, multi-omics visualization, CAVE, systems biology

Citation

Turhan B and Gümüş ZH (2022) A Brave New World: Virtual Reality and Augmented Reality in Systems Biology. Front. Bioinform. 2:873478. doi: 10.3389/fbinf.2022.873478

Received

10 February 2022

Accepted

02 March 2022

Published

06 April 2022

Volume

2 - 2022

Edited by

Sean O’Donoghue, Garvan Institute of Medical Research, Australia

Reviewed by

Jan Egger, University Hospital Essen, Germany

Copyright

*Correspondence: Zeynep H. Gümüş,

This article was submitted to Data Visualization, a section of the journal Frontiers in Bioinformatics
