Event Abstract

Audiovisual integration in speech perception

  • 1 Department of Psychophysiology, Research Institute for Psychology, Hungary
  • 2 Biophysics Department, KFKI Research Institute for Particle and Nuclear Physics, Hungary
  • 3 Department of Cognitive Psychology, Institute of Psychology, Eötvös Loránd University, Hungary
  • 4 Department of Information Systems, Eötvös Loránd University, Hungary

Speech comprehension is significantly improved by visual information obtained from watching the speaker's mouth movements. The audiovisual integration underlying this phenomenon is often studied in EEG experiments by comparing the event-related potential (ERP) evoked by a bimodal stimulus with the sum of the ERPs evoked by the auditory and visual stimuli alone (the AV versus A+V comparison). However, because some ERP components are common to all three stimulus types, this analysis can only be applied within the first 200 ms of poststimulus latency, and later components (e.g. the auditory P2 component) cannot be studied with this method. Moreover, common early anticipatory potentials may also produce spurious results within the analyzed 200 ms range. To circumvent the methodological problems of the AV versus A+V method, we suggest a different approach to investigating audiovisual integration: a bimodal syllable with a slight temporal asynchrony between the auditory and visual streams was presented to the subjects, leading to either a fused or an unfused percept, and the ERPs corresponding to the two percepts were compared. We found that the components corresponding to both the auditory N1 and P2 waves were smaller for the fused percept; the N1 effect showed a clear right-hemisphere dominance, while the effect around the P2 peak was most pronounced at central electrodes. These results are in accordance with previous studies suggesting suppression of N1 generator activities during speech-related audiovisual interaction, and suggest that a similar phenomenon extends to subsequent processing stages.
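For illustration, the AV versus A+V comparison can be expressed as a simple additive-model test on trial-averaged ERPs: any deviation of ERP(AV) from ERP(A) + ERP(V) is taken as evidence of nonlinear audiovisual interaction. The sketch below runs this test on simulated single-trial data in Python/NumPy; all signal shapes, amplitudes, trial counts, and the 0-200 ms analysis window are illustrative assumptions, not the authors' actual recording or analysis pipeline.

    import numpy as np

    # Illustrative sketch of the additive-model (AV vs A+V) test on simulated
    # ERP data. All shapes and values are hypothetical: epochs of shape
    # (n_trials, n_samples) at an assumed 1000 Hz, stimulus onset at 0 ms.
    fs = 1000                      # sampling rate in Hz (assumed)
    t = np.arange(-100, 500) / fs  # -100 ms to +499 ms around stimulus onset

    rng = np.random.default_rng(0)

    def simulate_epochs(amplitude, n_trials=100):
        # Fake single-trial ERPs: a Gaussian deflection peaking near 150 ms
        # (roughly N1 latency) plus Gaussian noise.
        erp = amplitude * np.exp(-((t - 0.15) ** 2) / 0.002)
        return erp + rng.normal(scale=2.0, size=(n_trials, t.size))

    epochs_a = simulate_epochs(5.0)   # auditory-only trials
    epochs_v = simulate_epochs(3.0)   # visual-only trials
    epochs_av = simulate_epochs(6.5)  # bimodal trials (sub-additive response)

    # Trial-averaged ERP per condition
    erp_a, erp_v, erp_av = (e.mean(axis=0)
                            for e in (epochs_a, epochs_v, epochs_av))

    # Additive-model comparison: AV minus (A + V). A nonzero difference
    # indicates a nonlinear (integrative) interaction, but, as the abstract
    # notes, the comparison is only interpretable in roughly the first
    # 200 ms, because components common to all three conditions (e.g.
    # anticipatory potentials) are counted twice in the A+V sum.
    interaction = erp_av - (erp_a + erp_v)
    window = (t >= 0) & (t <= 0.2)  # restrict to 0-200 ms poststimulus
    print("mean AV - (A+V) difference in 0-200 ms window:",
          interaction[window].mean())

The fused-versus-unfused approach proposed in the abstract sidesteps this double-counting entirely, since both conditions use the identical bimodal stimulus and the comparison reduces to averaging the same trials grouped by the subject's reported percept.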

Conference: 12th Meeting of the Hungarian Neuroscience Society, Budapest, Hungary, 22 Jan - 24 Jan, 2009.

Presentation Type: Poster Presentation

Topic: Behavioural neuroscience

Citation: Huhn Z, Szirtes G, Lorincz A and Csepe V (2009). Audiovisual integration in speech perception. Front. Syst. Neurosci. Conference Abstract: 12th Meeting of the Hungarian Neuroscience Society. doi: 10.3389/conf.neuro.01.2009.04.090

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 03 Mar 2009; Published Online: 03 Mar 2009.

* Correspondence: Zsofia Huhn, Department of Psychophysiology, Research Institute for Psychology, Budapest, Hungary, huhn.zsofia@gmail.com