Event Abstract

Reconstruction of head-movements during optokinetic reflex behavior from a video camera stream

  • 1 Carl-von-Ossietzky-University of Oldenburg, AG Neurobiologie, Germany

Behavioral tests provide important information for neuroscience. To evaluate neural codes, the behavioral performance of an animal can serve as a criterion: a hypothesized decoding principle must perform at least as well as the animal does behaviorally.

Most animals compensate for a global movement of their visual environment by moving their eyes and head to keep the retinal image of the world steady. Such compensatory movements can be triggered by presenting moving regular stripe patterns. The resulting involuntary head and eye movements have been observed in many species and are known as the optokinetic reflex (OKR). In this study, we describe an experimental setup for presenting OKR-inducing 360° stimuli while simultaneously recording the animal's behavior with a camera under infra-red (IR) illumination. An IR-reflective stripe attached to the animal's head serves as a marker, which is detected in an offline image processing procedure applied to the video stream after recording. The image processing then determines the angular displacement of the animal's head over time. By this means it is possible to determine whether the angular velocity components of the animal's head movements are similar to those present in the moving stripe-pattern stimulus. This information can later be used to evaluate different stimulus coding principles of retinal ganglion cells.
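The abstract does not specify the implementation of the offline marker-tracking step. The following is a minimal, hypothetical sketch (not the authors' code) of how such a procedure could look, assuming the IR-reflective stripe appears as the brightest elongated blob in each video frame and that OpenCV and NumPy are available; the function names, threshold, and area parameters are illustrative assumptions.

import cv2
import numpy as np

def head_angles(video_path, threshold=200, min_area=50.0):
    """Estimate per-frame head orientation (degrees) from the orientation
    of the brightest elongated blob, assumed to be the IR marker stripe."""
    cap = cv2.VideoCapture(video_path)
    angles = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Under IR illumination the reflective stripe is much brighter
        # than the background, so a fixed threshold isolates it.
        _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        contours = [c for c in contours if cv2.contourArea(c) >= min_area]
        if not contours:
            angles.append(np.nan)  # marker not visible in this frame
            continue
        marker = max(contours, key=cv2.contourArea)
        # Orientation of the stripe from a fitted rotated rectangle.
        (_, _), (_, _), angle = cv2.minAreaRect(marker)
        angles.append(angle)
    cap.release()
    return np.asarray(angles)

def angular_velocity(angles_deg, fps):
    """Differentiate the unwrapped angle trace to obtain deg/s, so the
    result can be compared with the stimulus pattern's angular velocity."""
    unwrapped = np.degrees(np.unwrap(np.radians(angles_deg)))
    return np.gradient(unwrapped) * fps

Note that cv2.minAreaRect reports angles only within a 90° range, so a real pipeline would have to resolve this ambiguity (for example from marker asymmetry or temporal continuity) before comparing head and stimulus velocities.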

We used this approach to characterize OKR behavior in the bird Erithacus rubecula, the mouse Mus musculus, and the turtle Trachemys scripta elegans.

Conference: Neuroinformatics 2009, Pilsen, Czechia, 6 Sep - 8 Sep, 2009.

Presentation Type: Poster Presentation

Topic: General neuroinformatics

Citation: Ahlers M, Kretschmer F, Meinhart D, Landgraf I, Kretzberg J and Ammermüller J (2019). Reconstruction of head-movements during optokinetic reflex behavior from a video camera stream. Front. Neuroinform. Conference Abstract: Neuroinformatics 2009. doi: 10.3389/conf.neuro.11.2009.08.083

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 22 May 2009; Published Online: 09 May 2019.

* Correspondence: Malte Ahlers, Carl-von-Ossietzky-University of Oldenburg, AG Neurobiologie, Oldenburg, Germany, m.ahlers@uni-oldenburg.de