Event Abstract

Interactive Light Stimulus Generation with High Performance Real-time Image Processing and Simple Scripting

  • 1 Ecole Polytechnique Federale de Lausanne, Laboratory of Nanostructures and Novel Electronic Materials, Switzerland
  • 2 Budapest University of Technology and Economics, Computer Graphics, Hungary
  • 3 NMI at the University of Tuebingen, Neurochip Research, Germany

Motivation

Light stimulation with precise spatial, temporal, and spectral characteristics is demanded by a range of research fields, including optogenetics, retina physiology, ophthalmology, and psychophysics. The growing demand is met by several free and commercial stimulus-generating software packages, most notably Psychtoolbox, PsychoPy, VisionEgg, OpenSesame, E-Prime, and Presentation, each with different benefits and disadvantages [1, 2]. Nevertheless, there remains a need for a flexible, universal, and easy-to-use solution with high computational power.

Material and Methods

We developed a new software system, GEARS (GPU-based Eye And Retina Stimulation software, www.gears.vision), that fulfills the demands of the main application areas, offers access to GPU computing power for tasks like en-masse random number generation or real-time image processing, and does not require deep programming skills for stimulus development. Our solution is based on a new computational workflow model that enables the straightforward implementation of a wide range of stimuli while hiding the details of the underlying software and hardware components. To customize the workflow, a visually aided scripting interface has been developed. The experiment-definition scripts are designed to look as simple as static configuration files, yet they give access to the full power of Python and GPU computing. The software has been applied for recordings of rodent retina interfaced to several MEA types; recorded data will be presented.

Results

The software is a C++/Python hybrid using OpenGL graphics (see top figure for the software stack) that generates GLSL shaders dynamically. Similar to computer game engines, elements of the rendering software underlying the stimuli can be assembled using a component-based system, which consists of a large number of elementary building blocks. Within the Python layer, the components are applied to assemble the workflow via calls to the C++ layer.
Afterwards, stimulus rendering runs natively, with direct operating system calls. GEARS does not require intermediary libraries for window management or Python-OpenGL interfacing (e.g. Qt, PyOpenGL, PyGame). This allows us to sidestep the overhead and limitations of existing APIs, e.g. when combining video rendering with OpenGL.

Discussion

Executable spatio-temporal light patterns, referred to as experiments, consist of elementary stimuli. The visually aided scripting interface allows easy and fast assembly of experiments, as well as parametrization of the stimuli. New stimuli can be defined using, among others, components of the following types (see bottom figure for the computation workflow):

  • PRNG - implements the strategy for generating random numbers
  • Shape - defines spatial masks, including circles, rectangles, or the full field, as well as editable free-form shapes; also manages images, videos, 3D rendered content, patterns, and gratings
  • Motion - determines how the shapes move, e.g. linear sweep or shaking
  • Modulation - modulates the intensity in time, e.g. along linear, harmonic, or square wave functions
  • Warp - performs geometric distortion or image tiling
  • Spatial - sets the strategy of spatial filtering, which can be performed in either the spatial or the frequency domain
  • Temporal - regulates convolution of the image stream with a temporal filter kernel, or channels it through a linear data processing system
  • Gamma - implements gamma correction
  • Audio - enables playing sound files of various formats
  • Signal - determines how TTL signals are sent via USB/RS232 ports, for controlling measurement electronics and marking significant stimulus timings for measurement analysis

Conclusion

We have implemented a universal, flexible, and user-friendly software package, with the ability to perform computationally intensive operations like real-time spatial and temporal filtering. Parameters of the stimuli can be varied interactively during the experiments.
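To illustrate the component-based stimulus definition described above, the following is a minimal Python sketch in the spirit of the GEARS workflow. All class and parameter names here (CircleShape, LinearMotion, HarmonicModulation, Stimulus) are illustrative assumptions, not the actual GEARS API; the real system evaluates such components as dynamically generated GLSL shaders on the GPU rather than in Python.

```python
# Hypothetical sketch of a component-based stimulus, assembled from
# Shape, Motion, and Modulation components as in the list above.
# Names and signatures are illustrative, not the actual GEARS API.
import math
from dataclasses import dataclass


@dataclass
class CircleShape:
    """Spatial mask: 1.0 inside a circle of the given radius, else 0.0."""
    radius: float

    def mask(self, x: float, y: float) -> float:
        return 1.0 if math.hypot(x, y) <= self.radius else 0.0


@dataclass
class LinearMotion:
    """Linear sweep: moves the shape center at constant velocity."""
    vx: float
    vy: float

    def offset(self, t: float) -> tuple:
        return (self.vx * t, self.vy * t)


@dataclass
class HarmonicModulation:
    """Sinusoidal intensity modulation in time, scaled to the 0..1 range."""
    frequency_hz: float

    def intensity(self, t: float) -> float:
        return 0.5 * (1.0 + math.sin(2.0 * math.pi * self.frequency_hz * t))


@dataclass
class Stimulus:
    """Assembles shape, motion, and modulation into one elementary stimulus."""
    shape: CircleShape
    motion: LinearMotion
    modulation: HarmonicModulation

    def sample(self, x: float, y: float, t: float) -> float:
        """Light intensity at screen point (x, y) at time t seconds."""
        dx, dy = self.motion.offset(t)
        return self.shape.mask(x - dx, y - dy) * self.modulation.intensity(t)


# The experiment definition then reads almost like a static configuration file:
sweep = Stimulus(
    shape=CircleShape(radius=1.0),
    motion=LinearMotion(vx=2.0, vy=0.0),
    modulation=HarmonicModulation(frequency_hz=0.25),
)

# At t = 1 s the circle center has moved to x = 2, so the point (2, 0) lies
# inside the mask while the harmonic modulation is at its peak.
print(sweep.sample(2.0, 0.0, 1.0))
```

The point of this structure is the same as in the abstract: each concern (mask geometry, trajectory, temporal modulation) lives in its own interchangeable component, so a new stimulus is declared by combining components rather than by writing rendering code.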
The software meets the widest spectrum of requirements identified in existing applications.

References

[1] A. Yoonessi and A. Yoonessi. A Glance at Psychophysics Software Programs. Basic and Clinical Neuroscience, 2(3):73–75, 2011.
[2] Software for visual psychophysics: an overview. http://visionscience.com/documents/strasburger/strasburger.html. Accessed: 2016-01-15.

Figure 1

Acknowledgements

This work has been supported by OTKA PD-104710

Conference: MEA Meeting 2016 | 10th International Meeting on Substrate-Integrated Electrode Arrays, Reutlingen, Germany, 28 Jun - 1 Jul, 2016.

Presentation Type: Poster Presentation

Topic: MEA Meeting 2016

Citation: Hantz P, Kacsó Á, Zeck G and Szécsi L (2016). Interactive Light Stimulus Generation with High Performance Real-time Image Processing and Simple Scripting. Front. Neurosci. Conference Abstract: MEA Meeting 2016 | 10th International Meeting on Substrate-Integrated Electrode Arrays. doi: 10.3389/conf.fnins.2016.93.00041

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 22 Jun 2016; Published Online: 24 Jun 2016.

* Correspondence: Dr. Péter Hantz, Ecole Polytechnique Federale de Lausanne, Laboratory of Nanostructures and Novel Electronic Materials, Lausanne, Switzerland, hantz.retina@gmail.com