Event Abstract

Virtual Brains: Using immersive virtual reality to enhance neuroscience research and instruction

  • 1 University of Wisconsin - Madison, Department of Psychology, United States
  • 2 University of Wisconsin, Madison, Wisconsin Institute for Discovery, United States

Introduction

Virtual Reality (VR) provides novel and exciting opportunities for research and education. It can immerse people in three-dimensional environments that are otherwise inaccessible, such as faraway galaxies, the human brain, or a single molecule. Navigating through such virtual environments can provide natural, compelling ways to educate students about the inner workings of these complex systems. There are, however, multiple barriers to adopting VR for educational purposes. First, novice users can report poor, uncompelling experiences due to a perceived lack of immersion. Second, users often have difficulty navigating and remaining oriented in virtual environments. Here we offer solutions that will facilitate the development of VR-based instructional methods.

Solving the lack of immersion

The first part of the talk explores the hypothesis that novice users experience a lack of immersion because they poorly recruit sensory signals in VR. In our natural environment, many signals inform us about the position of objects relative to the self, such as binocular disparity and motion parallax. Unfortunately, our understanding of how these signals contribute to visual perception is limited, because typical psychophysical experiments eliminate these cues by presenting stimuli on flat computer displays and fixing head position. Understanding how these cues contribute to perception in VR, where users are free to move around, will help make the VR experience more immersive. To test our hypothesis, we used the Oculus Rift, a VR head-mounted display with head-tracking functionality. This display allows us to present visual stimuli under naturalistic viewing conditions while maintaining tight experimental control. We designed '3D Pong', a video game-inspired perceptual experiment. We asked users to adjust the position of a paddle so that it would intercept a ball moving in a random direction in 3D space.
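As an illustration, the core trial logic of such a paddle-interception task can be sketched as follows. This is a hypothetical sketch, not the authors' implementation: the paddle-plane distance, launch cone, and hit tolerance are assumptions for illustration only.

```python
import random
import math

# Illustrative sketch of one '3D Pong' trial (assumed parameters, not the authors' code).
PADDLE_PLANE_Z = 1.0   # distance (m) from launch point to the paddle plane (assumed)
PADDLE_RADIUS = 0.1    # hit tolerance (m) around the ball's arrival point (assumed)

def launch_ball():
    """Return a random 3D unit velocity within a cone around the z axis,
    so the ball always travels toward the paddle plane (vz > 0)."""
    theta = random.uniform(0, 2 * math.pi)   # random direction around the axis
    phi = random.uniform(0, math.pi / 4)     # random tilt away from the axis
    return (math.sin(phi) * math.cos(theta),
            math.sin(phi) * math.sin(theta),
            math.cos(phi))

def arrival_point(velocity):
    """Where the ball crosses the paddle plane, extrapolated along a straight
    path -- the quantity the participant must infer once the ball is occluded."""
    vx, vy, vz = velocity
    t = PADDLE_PLANE_Z / vz
    return (vx * t, vy * t)

def score_trial(paddle_xy, velocity):
    """Hit if the adjusted paddle lies within PADDLE_RADIUS of the arrival point;
    the error term is the adjustment error reported per trial."""
    ax, ay = arrival_point(velocity)
    px, py = paddle_xy
    error = math.hypot(px - ax, py - ay)
    return error <= PADDLE_RADIUS, error

# One trial: launch, let the participant set the paddle, then score it.
velocity = launch_ball()
hit, error = score_trial((0.0, 0.0), velocity)
```

In the feedback condition described below, the `hit` outcome would drive the visual and auditory 'hit'/'miss' signal; in the no-feedback condition only `error` would be recorded for analysis.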
In one condition, participants were provided with binocular disparity but not motion parallax cues; in a second condition, both cues were available. To prevent users from simply adjusting paddle position at the last moment, we removed the ball from view well before it reached the paddle. We found that many novice users did not take full advantage of the available signals: they showed poor perceptual performance, suggesting low sensitivity to binocular disparity and motion parallax (Figure 1). Moreover, repeated exposure did little to improve performance (Figure 1a). In a separate group of novice VR users, we encouraged recruitment of binocular disparity and motion parallax by incorporating explicit task feedback. After users adjusted paddle position, we provided visual and auditory feedback indicating whether the adjustment resulted in a 'hit' or a 'miss'. With explicit feedback, users learned to incorporate the available cues. Appropriate recruitment of binocular disparity signals was evident in reduced adjustment errors compared to the no-feedback group, and recruitment of motion parallax signals was evident in an increased number of ball intercepts when motion parallax was available compared to when it was not (Figure 1b). These results show that, at first, novice users do not appropriately incorporate the sensory signals available in VR environments. However, they can learn which sensory cues are informative through explicit feedback. Interestingly, this signal recruitment appears to be implicit: users frequently reported being unaware that they had moved their heads during the experiment, even though the resulting motion parallax improved their performance. These results suggest that novice users should be introduced to VR by way of dynamic interaction rather than passive viewing.

Improving navigation in virtual reality

The second part of the talk discusses our efforts to develop VR methods for immersive neuroscience education.
A major problem in traditional anatomy education is conveying 3D volumetric structures and the multidirectional pathways that connect them (Preece et al., 2013; Drapkin et al., 2015). Anatomy instruction typically uses 2D drawings and 3D cadaver dissections. However, 2D drawings cannot be viewed from different perspectives, and cadaver dissections are limited by their high cost, low accessibility, and low reusability (Preece et al., 2013). Immersive VR addresses these limitations by allowing students to manipulate 3D volumetric forms under enriched stereoscopic viewing conditions (Luursema et al., 2008) and by supporting interactivity (e.g., displaying the functional implications of lesioning a brain region). Our approach is motivated by the simple notion that humans have difficulty learning lists of facts and definitions, but they are incredibly adept at learning their way around new environments (e.g., buildings, campuses, and cities; Kelley & Gibson, 2007). By enabling students to explore the brain in a virtual environment, we can shift learning demands from rote memorization to the systems involved in navigation and mental map formation. While VR-based instruction affords clear benefits, novice users often find it difficult to navigate and remain oriented in virtual environments. To address this challenge, we are investigating the use of color cues to create virtual environments that are easy to navigate without getting lost (Figure 2). We draw on a rich literature on color in data visualization. Previous research has mostly focused on 2D graphs and maps (e.g., Lin et al., 2013; Gramazio et al., 2014; Schloss et al., 2015), and we are translating these principles to the interpretation of direction in space within 3D environments. By understanding the intuitions people have about the directional cues colors provide in 3D environments, we can make immersive environments easier to navigate.
We are testing navigation in a virtual game we have developed, called "BrainWalk". Participants are asked to navigate through the brain by visiting a sequence of waypoints (orbs) indicated on a separate 2D map. We are currently evaluating how the coloration of brain elements influences the time it takes participants to successfully navigate the brain.

Future Directions

Although VR is currently expensive and requires dedicated hardware, we ultimately aim to scale the resulting educational tools to more accessible and affordable VR technologies (e.g., cell phone-based solutions such as Google Daydream and Samsung Gear VR). Our results will facilitate VR-based education well beyond neuroscience, extending broadly from astronomy to microbiology.
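The navigation measure described above for "BrainWalk" (time to reach each waypoint orb) could be logged with a minimal sketch like the following. The class name, waypoint labels, and timing calls are assumptions for illustration, not the game's actual code.

```python
import time

# Hypothetical sketch of a per-waypoint navigation timer for a game like
# "BrainWalk": records how long the participant takes to reach each orb,
# so total navigation time can be compared across coloration conditions.

class WaypointTimer:
    def __init__(self, waypoints):
        self.waypoints = list(waypoints)   # ordered waypoint names (assumed labels)
        self.times = {}                    # waypoint -> seconds taken to reach it
        self._last = None                  # timestamp of the previous event

    def start(self, now=None):
        """Begin timing; `now` overrides the clock for testing."""
        self._last = time.monotonic() if now is None else now

    def reached(self, waypoint, now=None):
        """Call when the participant enters a waypoint orb."""
        now = time.monotonic() if now is None else now
        self.times[waypoint] = now - self._last
        self._last = now

    def total_time(self):
        """Total navigation time across all waypoints reached so far."""
        return sum(self.times.values())

# Example session with injected timestamps (seconds):
timer = WaypointTimer(["thalamus", "hippocampus", "amygdala"])
timer.start(now=0.0)
timer.reached("thalamus", now=12.5)     # 12.5 s to the first orb
timer.reached("hippocampus", now=30.0)  # 17.5 s more to the second orb
```

A monotonic clock is used rather than wall-clock time so that per-waypoint intervals are unaffected by system clock adjustments during a session.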

Figure 1
Figure 2

Keywords: virtual reality, Motion Perception, 3D vision, Neuroanatomy, Perceptual Learning, Feedback

Conference: 2nd International Conference on Educational Neuroscience, Abu Dhabi, United Arab Emirates, 5 Mar - 6 Mar, 2017.

Presentation Type: Talks (for invited speakers only)

Topic: Educational Neuroscience

Citation: Rokers B, Miller N, Fulvio J, Smith S, Tredinnick R, Racey C and Schloss K (2017). Virtual Brains: Using immersive virtual reality to enhance neuroscience research and instruction. Conference Abstract: 2nd International Conference on Educational Neuroscience. doi: 10.3389/conf.fnhum.2017.222.00006


Received: 06 Feb 2017; Published Online: 11 Dec 2017.

* Correspondence: Dr. Bas Rokers, University of Wisconsin - Madison, Department of Psychology, Madison, WI, 53706, United States, rokers@wisc.edu