Event Abstract

Modelling of the McGurk effect

  • 1 University of Surrey, Department of Computing, United Kingdom

The McGurk effect [1] is a perceptual illusion that occurs when a visual input of lip movements, such as /ga/, is combined with a conflicting auditory input, such as /ba/. Given this incongruent input, listeners often perceive a sound that differs from both the visual and the auditory input, such as /da/.
Studies on speech perception show that lip reading activates the primary auditory cortex [2], suggesting the importance of visual speech articulation in speech perception. Other studies show that auditory-visual speech perception is influenced by the listener's phonological repertoire [3].
The current study investigates the McGurk effect by modelling it with a neural network. A previous model of the McGurk effect used a symmetric feed-forward neural network [4], with two identical sub-systems, corresponding to the auditory and visual processing systems, that receive the same information: the end-product of phonological interpretation.
In our approach, the simulation takes into account the influence of noise, single-channel speech perception, and language on auditory-visual speech perception. Several neural network architectures have been used. The input and output patterns of the network represent phonemes and visemes, the basic units of speech in the auditory and visual domains, respectively. The results of the simulation show that noise does influence the appearance of the McGurk effect.
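The abstract does not specify the network architecture, so the following is only a hypothetical sketch of the general idea: a minimal feed-forward (softmax) network trained on congruent, noise-corrupted audio-visual pairs, then probed with an incongruent pair. The phoneme set, noise level, and training scheme are all illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
PHONEMES = ["ba", "da", "ga"]  # illustrative categories, one viseme per phoneme

def one_hot(i, n=3):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def make_input(audio_idx, visual_idx, noise=0.0):
    """Concatenate auditory and visual codes; optionally corrupt the audio channel."""
    audio = one_hot(audio_idx) + noise * rng.normal(size=3)
    return np.concatenate([audio, one_hot(visual_idx)])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Train a one-layer softmax classifier on congruent pairs with noisy audio,
# so the network learns to rely on both channels.
W = np.zeros((3, 6))
for _ in range(2000):
    i = rng.integers(3)
    x = make_input(i, i, noise=0.3)
    p = softmax(W @ x)
    W += 0.1 * np.outer(one_hot(i) - p, x)  # SGD step on cross-entropy loss

def perceive(audio, visual, noise=0.0):
    """Return the network's phoneme probabilities for an audio-visual pair."""
    x = make_input(PHONEMES.index(audio), PHONEMES.index(visual), noise)
    return softmax(W @ x)

# Congruent input is recovered reliably; for incongruent input (audio /ba/
# with visual /ga/), the response distribution shifts as audio noise grows.
print(perceive("ba", "ba"))
print(perceive("ba", "ga", noise=0.5))
```

In this toy setting, increasing the audio noise weakens the auditory evidence relative to the visual evidence, which is one simple way a network's response to incongruent input can become noise-dependent, in the spirit of the result reported above.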


1. McGurk and MacDonald 1976. Nature 264, 746-748.

2. Pekkola et al. 2005. NeuroReport 16, 125-128.

3. Sekiyama and Tohkura 1991. The Journal of the Acoustical Society of America 90, 1797-1805.

4. Skouras 2006. Proceedings of the 7th International Conference on Cognitive Modeling, 286-291.

Conference: 41st European Brain and Behaviour Society Meeting, Rhodes Island, Greece, 13 Sep - 18 Sep, 2009.

Presentation Type: Poster Presentation

Topic: Poster presentations

Citation: Sporea I and Gruning A (2009). Modelling of the McGurk effect. Conference Abstract: 41st European Brain and Behaviour Society Meeting. doi: 10.3389/conf.neuro.08.2009.09.307

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 15 Jun 2009; Published Online: 15 Jun 2009.

* Correspondence: Ioana Sporea, University of Surrey, Department of Computing, Guildford, United Kingdom, i.nica@surrey.ac.uk