Event Abstract

Decoding complex sounds from auditory cortex responses without knowledge of precise stimulus timing

  • 1 Max Planck Institute for Biological Cybernetics, Dept. Logothetis, Germany
  • 2 Italian Institute of Technology, RBCS Department, Italy
  • 3 University of Manchester, Division of Imaging Science and Biomedical Engineering, United Kingdom

Sensory systems can recognize complex stimuli even when these appear at unexpected times. While much work quantifies the stimulus information carried by neural responses, most studies measure spike times with respect to a stimulus-related time frame, and thus implicitly assume a precise ‘clock’ registering the timing of sensory and neural events (Panzeri TINS 10). However, sensory systems may not have such a clock, and likely rely on intrinsic mechanisms to measure the timing of sensory and neural events.

This raises the questions of how well different sensory stimuli can be discriminated in the absence of a perfect clock, and what neural codes could mediate ‘clock-free’ sensory representations. To address these questions, we used the primate auditory cortex as a model system and investigated the information carried by different putative neural codes that do not rely on precise knowledge of stimulus timing.

Our approach builds on the observation that theta-band network activity in auditory cortex is entrained by complex sounds (Kayser Neuron 09). As such, the phase of slow rhythms can serve as an intrinsic temporal frame of reference. We show that a spike pattern code based on inter-spike interval (ISI) distributions can discriminate different complex sounds, and does so significantly better when referenced to the local theta rhythm. More specifically, we used 200 ms sliding windows to mimic uncertainty about stimulus onset, and within each window discriminated different naturalistic sounds using spike counts, ISI distributions, or ISI distributions combined with theta-phase. The combined ISI-phase code provided about a 50% increase in stimulus-related information compared with spike counts. These results suggest that combining different intrinsic time scales, such as ISIs and slow rhythms, allows the construction of neural codes that carry considerable information about sensory stimuli without making reference to the timing of external events.
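The following minimal sketch is an illustration of the kind of analysis described above, not the authors' code: within a 200 ms window of unknown onset, it builds an ISI histogram tagged with theta phase and decodes stimulus identity with a simple nearest-centroid rule. All names, bin choices, the 6 Hz phase signal, the decoder, and the synthetic spike trains are assumptions made purely for illustration.

```python
# Illustrative sketch (assumptions only): ISI / ISI-phase coding in a 200 ms
# window whose onset is not locked to the stimulus.
import numpy as np

WIN = 0.2                                    # 200 ms analysis window
ISI_BINS = np.linspace(0.0, 0.05, 11)        # ISI bins, 0-50 ms (assumed)
PHASE_BINS = np.linspace(-np.pi, np.pi, 5)   # 4 theta-phase quadrants (assumed)


def isi_phase_feature(spikes, phase_fn, t0):
    """Joint ISI x theta-phase histogram for spikes in [t0, t0 + WIN).
    Each ISI is tagged with the theta phase at the second spike of the pair."""
    s = spikes[(spikes >= t0) & (spikes < t0 + WIN)]
    isi = np.diff(s)
    ph = phase_fn(s[1:])
    h, _, _ = np.histogram2d(isi, ph, bins=(ISI_BINS, PHASE_BINS))
    h = h.ravel()
    return h / max(h.sum(), 1)


def decode(train_feats, train_labels, test_feat):
    """Nearest-centroid decoder: pick the stimulus whose mean feature vector
    lies closest (Euclidean distance) to the test feature."""
    labels = np.unique(train_labels)
    centroids = np.array([train_feats[train_labels == l].mean(axis=0)
                          for l in labels])
    return labels[np.argmin(np.linalg.norm(centroids - test_feat, axis=1))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def fake_trial(rate):
        # Synthetic Poisson-like spike train (stand-in for recorded responses).
        return np.cumsum(rng.exponential(1.0 / rate, size=200))

    # Assumed 6 Hz "theta" phase signal serving as the intrinsic reference.
    theta_phase = lambda t: np.angle(np.exp(1j * 2 * np.pi * 6 * t))

    # Two synthetic "stimuli" that differ only in firing statistics.
    trials = [(fake_trial(r), lab) for lab in (0, 1)
              for r in ([40] * 20 if lab == 0 else [80] * 20)]
    t0 = rng.uniform(0, 0.5)   # unknown window onset: no stimulus-locked clock

    feats = np.array([isi_phase_feature(sp, theta_phase, t0) for sp, _ in trials])
    labels = np.array([lab for _, lab in trials])

    # Leave-one-out decoding of the first trial.
    pred = decode(feats[1:], labels[1:], feats[0])
    print("decoded stimulus:", pred, "true:", labels[0])
```

Dropping the phase tagging (using a plain ISI histogram) or replacing the feature with a spike count gives the comparison codes mentioned in the abstract; the information-theoretic quantification used in the actual study is not reproduced here.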

Keywords: computational neuroscience

Conference: Bernstein Conference on Computational Neuroscience, Berlin, Germany, 27 Sep - 1 Oct, 2010.

Presentation Type: Presentation

Topic: Bernstein Conference on Computational Neuroscience

Citation: Kayser C, Panzeri S and Logothetis NK (2010). Decoding complex sounds from auditory cortex responses without knowledge of precise stimulus timing. Front. Comput. Neurosci. Conference Abstract: Bernstein Conference on Computational Neuroscience. doi: 10.3389/conf.fncom.2010.51.00008

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 09 Sep 2010; Published Online: 22 Sep 2010.

* Correspondence: Dr. Christoph Kayser, Max Planck Institute for Biological Cybernetics, Dept. Logothetis, Tübingen, Germany, christoph.kayser@uni-bielefeld.de