Event Abstract

MazeSuite 3: A design, presentation and analysis platform for spatial navigation, cognitive neuroscience and neuroengineering applications

Adrian Curtin 1,2* and Hasan Ayaz 1,3,4
  • 1 School of Biomedical Engineering, Science and Health Systems, Drexel University, United States
  • 2 School of Biomedical Engineering, Shanghai Jiao Tong University, China
  • 3 University of Pennsylvania, Department of Family and Community Health, United States
  • 4 Children's Hospital of Philadelphia, Division of General Pediatrics, United States

MazeSuite is a comprehensive software suite for spatial navigation experiments that allows researchers to design, build, and analyze experiments in 3D environments and to extend both protocols and functionality through an accessible API. MazeSuite was built to support the rapid development of experimental protocols that can be easily synchronized with physiological and neuroimaging measurements, and it is freely available online at http://mazesuite.com. The platform consists of three principal applications: MazeMaker, a tool to build and edit 3D environments; MazeWalker, a presentation tool with which the participant navigates and interacts with the environment; and MazeAnalyzer, a tool to display, track, and analyze behavioral performance recorded during the experiment (Figure 1). Although the primary focus of the package is the development of spatial navigation experiments, MazeSuite has been applied in many fields of investigation, including neuromarketing, brain-computer interface (BCI) development, cooperative tasks, and neurostimulation. Here, we review some of the principal uses of the software package and present the new feature sets that have recently been introduced.

MazeMaker allows the rapid development of 3D environments through a straightforward 2D drawing-style interface. The latest version introduces the ability to generate 3D circular and rectangular mazes of arbitrary size and complexity, allowing novel environments to be created within seconds, alongside other new features such as improved tablet support and visual themes. Users may also import custom models, images, and audio to be presented to the participant, and may define the properties of these objects so that their behavior can be experimentally controlled. In addition, automatic interactions between elements can be defined so that specific events occur as needed. Generated environments are stored in a readable XML-based format so that mazes and their attributes can be modified outside of the MazeMaker program as necessary. Mazes can also be combined into scripted MazeLists, allowing the sequential presentation of specific mazes as well as text or image displays. A selection of user-built environments is available on the MazeSuite website (http://mazesuite.com/gallery/), and additional submissions are always welcome.

The presentation application, MazeWalker, displays the 3D environments and MazeList sequences while recording all participant interactions with high temporal precision. MazeWalker can be synchronized with other acquisition software over multiple communication interfaces (network (TCP), serial (COM), and parallel port (LPT)), allowing the temporal alignment of participant activities with data acquired from physiological measures such as heart rate, or from neuroimaging equipment such as electroencephalography (EEG), functional near-infrared spectroscopy (fNIRS), and functional magnetic resonance imaging (fMRI). Version 3 introduces multiple new capabilities, including third-person perspectives with a user-defined model as the avatar and new interactive properties that can be defined in MazeMaker. In addition, MazeWalker now allows more fine-grained control of environmental elements and of user interactions/controls over the local network through an expanded API. This functionality can be used to record, monitor, and manipulate objects in the environment as needed, or to build more elaborate control schemes driven by individual biosignals or other custom control signals.
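
As a concrete illustration of this kind of synchronization and remote control, the minimal sketch below sends timestamped text commands to a MazeWalker instance over a local TCP connection. The port number and command strings are illustrative placeholders only; the actual host, port, and command vocabulary of the MazeWalker API should be taken from the MazeSuite documentation.

```python
import socket
import time

# Hypothetical connection details: substitute the host, port, and command
# syntax documented for the MazeWalker API.
MAZEWALKER_HOST = "127.0.0.1"
MAZEWALKER_PORT = 9999  # placeholder port

def send_command(sock, command):
    """Send a newline-terminated text command and return the local send time."""
    timestamp = time.time()
    sock.sendall((command + "\n").encode("utf-8"))
    return timestamp

with socket.create_connection((MAZEWALKER_HOST, MAZEWALKER_PORT)) as sock:
    # Mark the start of a trial so that concurrently recorded EEG/fNIRS data
    # can later be aligned to this moment (command names are illustrative).
    t_start = send_command(sock, "EVENT trial_start")

    # A biosignal-driven control scheme could translate a decoded intention
    # into navigation or object commands in the same way.
    send_command(sock, "MOVE forward")
    send_command(sock, "OBJECT door_1 open")
```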

The new MazeAnalyzer allows a greatly expanded assessment of spatial navigation experiments. Participant behavior during navigation in virtual environments can be overlaid on top of the mazes for visualization and comparison. Additionally, multiple regions of interest can be defined such that participant behavior can be quantified throughout the experimental session in terms of distance traveled, time spent, and the number of times each region has been entered. All of these properties are now saved in MazeAnalyzer project files so that entire experiments and experimental conditions can be recorded in a unified way.
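
For illustration, the following minimal sketch (not part of MazeAnalyzer itself) computes these region-of-interest metrics from a hypothetical list of timestamped (x, y) position samples; MazeWalker's actual log format, and the way MazeAnalyzer computes these quantities internally, may differ.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Sample:
    t: float  # time (s)
    x: float  # position in maze units
    y: float

def roi_metrics(path: List[Sample],
                roi: Tuple[float, float, float, float]) -> Tuple[float, float, int]:
    """Distance traveled within, time spent in, and entries into one
    rectangular region of interest given as (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = roi

    def inside(s: Sample) -> bool:
        return x_min <= s.x <= x_max and y_min <= s.y <= y_max

    distance, time_in_roi, entries = 0.0, 0.0, 0
    was_inside = False
    for prev, cur in zip(path, path[1:]):
        if inside(cur):
            # Attribute a segment to the region when it ends inside it.
            distance += math.hypot(cur.x - prev.x, cur.y - prev.y)
            time_in_roi += cur.t - prev.t
            if not was_inside:
                entries += 1
        was_inside = inside(cur)
    return distance, time_in_roi, entries

# Hypothetical usage with a short synthetic trajectory:
trajectory = [Sample(0.0, 0.0, 0.0), Sample(0.5, 1.0, 0.0),
              Sample(1.0, 2.0, 0.0), Sample(1.5, 3.0, 0.0)]
print(roi_metrics(trajectory, (1.5, -1.0, 3.5, 1.0)))  # -> (2.0, 1.0, 1)
```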

Since its first introduction (Ayaz, Allen, Platek, & Onaral, 2008), MazeSuite has been used frequently for behavioral navigation experiments, including studies of spatial skill transfer (Akinlofa, Holt, & Elyan, 2014), the effect of spatial cues on learning an environment (Buckley, Haselgrove, & Smith, 2015; Buckley, Smith, & Haselgrove, 2014), spatial memory and extinction (Luna & Martínez, 2015) as well as long-term memory (Luna, Manzanares-Silva, Rodríguez-González, & López-Cruz, 2018), and correlates of personality with navigation performance (Walkowiak, Foulsham, & Eardley, 2015). Early in its development, MazeSuite also targeted integration with physiological and neurophysiological measures to extend the perspectives of cognitive psychology and to support neuroimaging applications (Ayaz, Shewokis, Curtin, et al., 2011). These studies have contributed valuable neuroimaging and physiological perspectives on spatial navigation learning (Shewokis, Ayaz, Curtin, Izzetoglu, & Onaral, 2013), error detection in navigation (Holper, Jäger, Scholkmann, & Wolf, 2013), functional connectivity in landmark-based navigation (Raiesdana, 2018), and cooperative navigation (Krill & Platek, 2012). The flexible API of the MazeWalker program has also allowed the software to be used as a prototyping tool for BCI applications, including pioneering work on an fNIRS BCI for environmental control (Ayaz, Shewokis, Bunce, & Onaral, 2011), EEG-based navigational control (Curtin, Ayaz, Liu, Shewokis, & Onaral, 2012), a hybrid fNIRS/EEG BCI (Liu, Ayaz, Curtin, Onaral, & Shewokis, 2013), and fNIRS-based BCI control of humanoid robots (Batula, Kim, & Ayaz, 2017). Together, these applications demonstrate the platform's flexibility both for behavioral experimentation with adjunctive peripheral measures and for control applications driven by brain-derived signals.

In conclusion, the new generation of MazeSuite offers new capabilities and connectivity that researchers may use to implement experimental protocols. The software is freely available for non-commercial purposes online (http://mazesuite.com). Future development of the platform will expand its analytic capabilities, environmental functionality, and presentation options.

Figure 1: MazeSuite is a complete software solution for navigational and spatial neuroscience research. The software allows researchers to prepare (MazeMaker), present (MazeWalker), and analyze (MazeAnalyzer) participants' behavioral performance across trials inside customized 3D environments.

References

Akinlofa, O. R., Holt, P. O. B., & Elyan, E. (2014). The cognitive benefits of dynamic representations in the acquisition of spatial navigation skills. Computers in Human Behavior, 30, 238–248. https://doi.org/10.1016/j.chb.2013.09.009
Ayaz, H., Allen, S. L., Platek, S. M., & Onaral, B. (2008). Maze Suite 1.0: A complete set of tools to prepare, present, and analyze navigational and spatial cognitive neuroscience experiments. Behavior Research Methods, 40(1), 353–359. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/18411560
Ayaz, H., Shewokis, P. A., Bunce, S., & Onaral, B. (2011). An optical brain computer interface for environmental control. In 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (pp. 6327–6330). IEEE. https://doi.org/10.1109/IEMBS.2011.6091561
Ayaz, H., Shewokis, P. A., Curtin, A., Izzetoglu, M., Izzetoglu, K., & Onaral, B. (2011). Using MazeSuite and functional near infrared spectroscopy to study learning in spatial navigation. Journal of Visualized Experiments : JoVE, (56). https://doi.org/10.3791/3443
Batula, A. M., Kim, Y. E., & Ayaz, H. (2017). Virtual and Actual Humanoid Robot Control with Four-Class Motor-Imagery-Based Optical Brain-Computer Interface. BioMed Research International, 2017. https://doi.org/10.1155/2017/1463512
Buckley, M. G., Haselgrove, M., & Smith, A. D. (2015). The developmental trajectory of intramaze and extramaze landmark biases in spatial navigation: An unexpected journey. Developmental Psychology, 51(6), 771–791. https://doi.org/10.1037/a0039054
Buckley, M. G., Smith, A. D., & Haselgrove, M. (2014). Shape shifting: Local landmarks interfere with navigation by, and recognition of, global shape. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(2), 492–510. https://doi.org/10.1037/a0034901
Curtin, A., Ayaz, H., Liu, Y., Shewokis, P. A., & Onaral, B. (2012). A P300-based EEG-BCI for spatial navigation control. In 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (Vol. 2012, pp. 3841–3844). IEEE. https://doi.org/10.1109/EMBC.2012.6346805
Holper, L., Jäger, N., Scholkmann, F., & Wolf, M. (2013). Error detection and error memory in spatial navigation as reflected by electrodermal activity. Cognitive Processing, 14(4), 377–389. https://doi.org/10.1007/s10339-013-0567-z
Krill, A. L., & Platek, S. M. (2012). Working together may be better: Activation of reward centers during a cooperative maze task. PLoS ONE, 7(2), 1–7. https://doi.org/10.1371/journal.pone.0030613
Liu, Y., Ayaz, H., Curtin, A., Onaral, B., & Shewokis, P. A. (2013). Towards a hybrid P300-based BCI using simultaneous fNIR and EEG. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8027 LNAI). https://doi.org/10.1007/978-3-642-39454-6_35
Luna, D., Manzanares-Silva, M., Rodríguez-González, K., & López-Cruz, H. (2018). Long-term spatial memory in humans trained in a virtual maze. Acta Colombiana de Psicología, 21(1), 70–94. https://doi.org/10.14718/ACP.2018.21.1.4
Luna, D., & Martínez, H. (2015). Spontaneous recovery of human spatial memory in a virtual water maze. Psicológica, 36, 283–308. Retrieved from https://www.uv.es/psicologica/articulos2.15/5LUNA.pdf
Raiesdana, S. (2018). Modeling the interaction of navigational systems in a reward-based virtual navigation task. Journal of Integrative Neuroscience, 17(1), 45–67. https://doi.org/10.3233/JIN-170036
Shewokis, P. A., Ayaz, H., Curtin, A., Izzetoglu, K., & Onaral, B. (2013). Brain in the loop learning using functional near infrared spectroscopy. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8027 LNAI). https://doi.org/10.1007/978-3-642-39454-6_40
Walkowiak, S., Foulsham, T., & Eardley, A. F. (2015). Individual differences and personality correlates of navigational performance in the virtual route learning task. Computers in Human Behavior, 45, 402–410. https://doi.org/10.1016/j.chb.2014.12.041

Keywords: spatial navigation, software package, behavioral neuroscience, virtual environments, spatial memory

Conference: 2nd International Neuroergonomics Conference, Philadelphia, PA, United States, 27 Jun - 29 Jun, 2018.

Presentation Type: Poster Presentation

Topic: Neuroergonomics

Citation: Curtin A and Ayaz H (2019). MazeSuite 3: A design, presentation and analysis platform for spatial navigation, cognitive neuroscience and neuroengineering applications. Conference Abstract: 2nd International Neuroergonomics Conference. doi: 10.3389/conf.fnhum.2018.227.00069

Received: 09 Apr 2018; Published Online: 27 Sep 2019.

* Correspondence: Mr. Adrian Curtin, School of Biomedical Engineering, Science and Health Systems, Drexel University, Philadelphia, United States, abc48@drexel.edu