Event Abstract

Integration of auditory and visual information in fast recognition of realistic objects

  • 1 La Salpetrière Hospital, France
  • 2 REVES/INRIA, France
  • 3 IRCAM, France

Recent studies have used complex and meaningful stimuli to investigate the neural mechanisms involved in the processing of sensory information necessary for object perception. However, relatively few studies have focused on behavioural measures of multisensory object processing. Realistic objects are of particular interest in the study of multisensory integration, since a given object can generally be identified through any of several single sensory modalities. The fact that the same semantic knowledge can be accessed through different modalities allows us to explore the different processing levels that underlie retrieval of supramodal object concepts. Here, we studied the influence of semantic congruence on auditory-visual object recognition in a go/no-go task. Participants were asked to react as fast as possible to a target object presented in the visual and/or the auditory modality, and to inhibit their response to a distractor object. The experiment was run in an immersive and realistic virtual environment including 3D images and free-field audio. Reaction times were significantly shorter for semantically congruent bimodal stimuli than predicted by independent processing of the information about the objects presented unimodally. Interestingly, this effect was twice as large as that found in previous studies using information-rich stimuli. The processing of bimodal objects was also influenced by their semantic congruence: reaction times were significantly shorter for semantically congruent bimodal stimuli (i.e., visual and auditory stimuli from the same target object) than for semantically incongruent bimodal stimuli (i.e., the target presented in one sensory modality and the distractor presented in the other). Importantly, an interference effect (i.e., longer reaction times to semantically incongruent stimuli than to the corresponding unimodal stimulus) was observed only when the distractor was auditory. When the distractor was visual, semantic incongruence did not impair recognition. Our results show that immersive displays can produce large multimodal integration effects, and reveal a possible asymmetry in the attentional filtering of irrelevant auditory and visual information.

Conference: 10th International Conference on Cognitive Neuroscience, Bodrum, Türkiye, 1 Sep - 5 Sep, 2008.

Presentation Type: Poster Presentation

Topic: Abstracts

Citation: Suied C, Bonneel N and Viaud-Delmon I (2008). Integration of auditory and visual information in fast recognition of realistic objects. Conference Abstract: 10th International Conference on Cognitive Neuroscience. doi: 10.3389/conf.neuro.09.2009.01.281


Received: 09 Dec 2008; Published Online: 09 Dec 2008.

* Correspondence: Clara Suied, La Salpetrière Hospital, Paris, France, clara.suied@ircam.fr