MINI REVIEW article
Auditory and Visual Motion Processing and Integration in the Primate Cerebral Cortex
- 1Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, VIC, Australia
- 2Australian Research Council (ARC) Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, VIC, Australia
The ability of animals to detect motion is critical for survival, and errors or even delays in motion perception may prove costly. In the natural world, moving objects in the visual field often produce concurrent sounds. Thus, it can be highly advantageous to detect motion from sensory signals of either modality, and to integrate them to produce more reliable motion perception. A great deal of progress has been made in understanding how visual motion perception is governed by the activity of single neurons in the primate cerebral cortex, but far less progress has been made in understanding both auditory motion and audiovisual motion integration. Here, we review the key cortical regions for motion processing, focusing on translational motion. We compare the representations of space and motion in the visual and auditory systems, and examine how single neurons in these two sensory systems encode the direction of motion. We also discuss the way in which humans integrate audio and visual motion cues, and the regions of the cortex that may mediate this process.
The natural world abounds with motion, making this a highly salient cue to guide animals in interacting with the environment. It is therefore not surprising that most, if not all, brains have dedicated neural circuits for the perception of motion. In primates, the cerebral cortex contains a network of regions that are specialized for motion processing, but the systems for processing the motion of visual features and sounds are mediated by different brain regions, and underpinned by different physiological mechanisms. In this mini-review article, we will discuss the encoding of direction of motion in the visual and auditory systems, with emphasis on the cortical systems that are involved in translational motion, especially in azimuth (leftwards and rightwards motion), as this is the most common type of motion used in audiovisual integration studies.
Encoding of Direction of Motion in the Activity of Cortical Neurons
Spatial features are represented in fundamentally different ways in the visual and auditory systems. In the visual system, most neurons have spatially defined receptive fields, which are ultimately defined by inputs from specific regions of the retina. Therefore, the responses of neurons in the visual system are inherently capable of coding the spatial location of visual stimuli, and in theory, could encode direction of motion by the sequential activation of populations of neurons with different receptive field locations. However, the visual system goes one step further, with direction of motion being explicitly represented at the level of the single cell. Specifically, the spiking (action potential) responses of neurons are tuned to the direction of moving stimuli, meaning that they are more active in response to a specific direction of motion compared to other directions (Dubner and Zeki, 1971; Baker et al., 1981; Maunsell and Van Essen, 1983a; Albright, 1984; Desimone and Ungerleider, 1986; Saito et al., 1986; Tanaka and Saito, 1989; Chaplin et al., 2017). Thus, direction selective neurons in the visual system can encode the direction of motion within their receptive fields. For example, Figure 1A shows the response of a direction tuned neuron: the neuron shows strong responses to motion towards the upper left quadrant, and progressively weaker responses for directions further away.
Figure 1. Encoding of direction of motion in the visual and auditory systems. (A) A typical visual direction tuning curve from a neuron in the marmoset visual cortex (area MT) in response to a moving dot stimulus (data from Chaplin et al., 2017). The vertical line indicates the preferred direction of motion, and the inset shows the mean spiking responses (with the spontaneous rate subtracted) in polar plot form, showing clear direction selectivity. (B) The temporal spiking response of a neuron in the macaque auditory cortex (A1) in response to a moving auditory stimulus. Here, the difference in firing rate between two directions of motion is quite modest, and is most obvious in the later part of the response. Redrawn with permission from the authors of Ahissar et al. (1992). (C) Inflated model of the macaque cerebral cortex showing some of the motion processing areas in the primate cerebral cortex (Van Essen, 2002; Van Essen and Dierker, 2007). Light blue: visual areas where a subpopulation of neurons shows direction selectivity; dark blue: visual motion processing areas MT, MSTd and MSTl; orange: A1; red: areas of the caudal auditory belt (CM, CL) which have been implicated in auditory motion processing; purple: areas that show auditory and visual motion responses and may be involved in integrating the two modalities.
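Direction tuning curves of the kind shown in Figure 1A are often summarized with a circular-Gaussian (von Mises) function of motion direction. The sketch below is purely illustrative; the preferred direction, baseline, amplitude and tuning-width parameters are hypothetical values, not fits to the data discussed here.

```python
import numpy as np

def direction_tuning(theta_deg, pref_deg=135.0, baseline=5.0,
                     amplitude=40.0, kappa=2.0):
    """Illustrative von Mises tuning curve: firing rate (spikes/s) peaks
    at the preferred direction and falls off smoothly for directions
    further away, as in the example neuron of Figure 1A."""
    delta = np.deg2rad(theta_deg - pref_deg)
    return baseline + amplitude * np.exp(kappa * (np.cos(delta) - 1.0))

directions = np.arange(0, 360, 45)    # tested motion directions (deg)
rates = direction_tuning(directions)  # model firing rates
```

With these hypothetical parameters the model neuron responds most strongly to motion towards the upper left (135 degrees) and most weakly to the opposite direction, mirroring the qualitative shape of a typical MT tuning curve.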
In contrast, most neurons in the auditory system respond to specific ranges of acoustic frequencies, since they ultimately receive inputs from defined regions of the cochlea. Thus, the auditory system needs to exploit other auditory cues to extract spatial information from the stimulus. The principal cues for locating sounds in the azimuth are binaural—interaural time differences (ITDs) and interaural level differences (ILDs; Middlebrooks and Green, 1991). Several brain regions are involved in the perception of sound location, and neurons in these regions can be tuned for ITDs or ILDs (Masterton et al., 1967; Rajan et al., 1990a,b; Semple and Kitzes, 1993a,b; Irvine et al., 1996; Tian et al., 2001; Woods et al., 2006; Miller and Recanzone, 2009; Grothe et al., 2010; Slee and Young, 2010; Kusmierek and Rauschecker, 2014; Keating and King, 2015; Lui et al., 2015; Mokri et al., 2015).
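To illustrate how one binaural cue relates to source position, the ITD for a distant source can be approximated with Woodworth's classic spherical-head formula. This is a textbook approximation, not a model from any study reviewed here, and the head radius and speed of sound below are nominal values.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, air at ~20 degrees C
HEAD_RADIUS = 0.0875    # m, approximate adult human head radius

def itd_woodworth(azimuth_deg, r=HEAD_RADIUS, c=SPEED_OF_SOUND):
    """Interaural time difference (s) for a distant source at a given
    azimuth, using Woodworth's spherical-head approximation:
    ITD = (r / c) * (theta + sin(theta))."""
    theta = np.deg2rad(azimuth_deg)
    return (r / c) * (theta + np.sin(theta))

# The ITD grows monotonically from 0 at the midline to roughly
# 650 microseconds for a source directly opposite one ear (90 degrees).
```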
The encoding of the direction of auditory motion by the activity of single cortical neurons has not been studied extensively in primates; to our knowledge, there is only one published study (Ahissar et al., 1992), which recorded spiking activity in the primary auditory cortex (A1) of monkeys. It found that while many cells (62%) in A1 code for the spatial location of stationary sounds, some cells (32%) also showed a preference for leftwards or rightwards direction of motion. However, the differences in responses were far less marked than those observed in direction selective cells in the visual system. There were only modest differences in firing rates, which were evident in the late part of the responses (Figure 1B). These results suggest that the encoding of the direction of motion of auditory stimuli is likely to be a much more distributed representation across neuronal populations, compared to direction of motion encoding in the visual system (Cohen and Newsome, 2009), or that explicit encoding of auditory motion relies on other areas beyond A1.
Visual Motion Processing Areas
The neural circuits for visual motion processing are among the best understood aspects of the structure and function of the primate cerebral cortex (Figure 1C, blue areas). The primary visual cortex (V1) is the first stage of cortical visual processing in which direction selectivity appears, but only a small proportion of V1 neurons are direction selective (~15%, Yu et al., 2010; Yu and Rosa, 2014; Davies et al., 2016). Direction selective neurons have been observed in several other visual areas (Orban et al., 1986; Desimone and Schein, 1987; Felleman and Van Essen, 1987; Lui et al., 2005, 2006; Orban, 2008; Fattori et al., 2009; Li et al., 2013), but it is the middle temporal (MT) and medial superior temporal (MST) areas that appear to be most specialized for motion processing. The vast majority of cells in these regions are direction selective (MT ~85%: Allman and Kaas, 1971; Dubner and Zeki, 1971; Maunsell and Van Essen, 1983b; Albright, 1984; MST ~90%: Desimone and Ungerleider, 1986; Saito et al., 1986; Tanaka and Saito, 1989; Celebrini and Newsome, 1994; Elston and Rosa, 1997). Furthermore, it is known that damage to MT and MST results in motion perception impairments (Newsome and Paré, 1988; Pasternak and Merigan, 1994; Orban et al., 1995; Schenk and Zihl, 1997; Rudolph and Pasternak, 1999), and electrical stimulation of these regions can influence the perception of motion (Celebrini and Newsome, 1994, 1995; Salzman and Newsome, 1994; Britten and Van Wezel, 2002; Nichols and Newsome, 2002; Fetsch et al., 2014). Thus, a causal relationship has been established between neural activity in MT and MST and the perception of visual motion.
MST can be divided into two subregions: a lateral part (MSTl) involved in the perception of moving objects and smooth pursuit eye movements (Komatsu and Wurtz, 1988a,b; Eifuku and Wurtz, 1998), and a dorsal part (MSTd), which is associated with the perception of complex motion patterns (Graziano et al., 1994; Mineault et al., 2012), especially self-motion (Saito et al., 1986; Komatsu and Wurtz, 1988a; Duffy and Wurtz, 1991; Duffy, 1998), and has a well described role in the integration of visual and vestibular motion cues (Gu et al., 2007, 2008). Differences between MT and MST have been well studied in monkeys, but in human studies these areas are typically grouped into a single region called the human MT complex (hMT+, Zeki et al., 1991; Huk et al., 2002), due to the spatial resolution limits of fMRI.
Auditory Motion Processing Areas
In comparison to the visual system, the regions and circuitry of the cortex involved in auditory motion processing are not as well characterized (Figure 1C). While there is some evidence for motion sensitivity and direction selectivity in A1 (Ahissar et al., 1992; Griffiths et al., 2000; Lewis et al., 2000), many human imaging studies have identified the planum temporale, a region of auditory cortex caudal to primary auditory cortex, as the key site for auditory motion processing (Baumgart et al., 1999; Pavani et al., 2002; Warren et al., 2002; Alink et al., 2012b). In agreement with these findings, a recent imaging study in macaques also found that the caudal regions of auditory cortex are differentially activated by auditory motion compared to stationary stimuli (Poirier et al., 2017). Furthermore, studies of humans with lesions to caudal auditory cortex have found deficits in auditory motion processing (Ducommun et al., 2004; Lewald et al., 2009; Thaler et al., 2016).
It remains controversial whether auditory motion perception relies on specialized motion detectors, similar to direction selective cells in the visual cortex (Perrott and Musicant, 1977), or utilizes “snapshots” of the current sound source location (Ahissar et al., 1992; Poirier et al., 2017), as several human imaging studies have reported there is no difference in cortical activation between stationary and moving stimuli (Smith et al., 2004, 2007; Krumbholz et al., 2005, 2007). Since neurons in the auditory system show sensitivity to localization cues (e.g., ITDs and ILDs), the perception of motion could be mediated by the sequential activation of neurons that code for adjacent spatial locations (Ahissar et al., 1992). In general, in the auditory system the integration of binaural cues for sound localization occurs at early subcortical stages of processing, such as the superior olivary complex, the nuclei of the lateral lemniscus and the inferior colliculus (Moore, 1991). In monkeys, the caudal part of auditory cortex encompasses the caudomedial (CM) and caudolateral (CL) areas of the auditory belt (Hackett et al., 1998; Kaas et al., 1999), and these are known to play a role in the localization of auditory stimuli (Recanzone et al., 2000; Tian et al., 2001; Woods et al., 2006; Miller and Recanzone, 2009; Kusmierek and Rauschecker, 2014). Therefore, the sensitivity of neurons in these areas to the location of static stimuli is a potential confound in auditory motion studies, as it can be difficult to distinguish true motion sensitivity from sensitivity to spatial location. For example, it has been suggested that apparent sensitivity to motion in the inferior colliculus could be explained by adaptation to stationary stimuli, which would result in reduced spiking activity for stationary stimuli compared to moving stimuli (Ahissar et al., 1992; Wilson and O’Neill, 1998; McAlpine et al., 2000; Ingham et al., 2001; Poirier et al., 2017).
However, the recent imaging study by Poirier et al. (2017) did take steps to control for this effect in their choice of stimuli and regression analyses, and still found that the caudal auditory cortex was differentially activated by moving compared to static auditory stimuli. Further electrophysiological studies in monkeys will be required to address the question of how auditory motion is encoded by the spiking activity of neurons in these regions.
The neural representation of auditory motion does not necessarily have to be located in purely auditory regions. Direct reciprocal connections between MT/MST and the auditory cortex have been identified in primates (Palmer and Rosa, 2006), and two recent electrophysiological studies (Chaplin et al., 2018; Kafaligonul et al., 2018) have reported evoked potentials in areas MT/MST in response to stationary auditory clicks. Two human imaging studies have reported that the hMT+ complex responds to auditory motion (Poirier et al., 2005; Strnad et al., 2013), but it has also been argued that observed auditory responses in hMT+ could be explained by localization errors (Jiang et al., 2014), and no study has found any evidence for spiking activity in response to auditory stimuli (moving or stationary) in the monkey MT complex. Furthermore, a case study involving lesions of hMT+ did not find any impairment in the perception of auditory motion (Zihl et al., 1983). Thus, current evidence suggests that MT and MST are not involved in auditory motion processing.
Integration of Auditory and Visual Motion Cues
Given the differences in the neural representation of motion in the auditory and visual systems, it is interesting to consider how the information from the two modalities could be combined to improve motion perception. Psychophysical studies have investigated audiovisual motion integration in humans using motion detection tasks, and have provided valuable insights into how auditory and visual motion can be integrated in the brain. Some of these studies have reported that humans perform better in audiovisual motion tasks compared to unimodal tasks, but there is disagreement as to whether this increase in performance is “statistically optimal” or the result of “probability summation.” When probability summation occurs, observers perform better on bimodal trials because they essentially have two chances to answer correctly—using either the visual or the auditory cue (Wuerger et al., 2003; Alais and Burr, 2004). When statistically optimal integration occurs, observers combine the information obtained by the different senses by weighting according to their reliability, to make optimal use of the information available (Meyer and Wuerger, 2001). Therefore, statistically optimal integration exceeds the performance of probability summation. Multisensory integration has been shown to be statistically optimal in other contexts (Ernst and Banks, 2002; Angelaki et al., 2009; Fetsch et al., 2009; Drugowitsch et al., 2014; Rohde et al., 2016).
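The two benchmarks can be made concrete with a minimal sketch (the numbers in the examples are illustrative, not data from any study cited here): probability summation predicts bimodal percent correct from the two unimodal values, whereas statistically optimal (maximum-likelihood) integration predicts the variability of the combined estimate from the reliabilities of the unimodal cues.

```python
import numpy as np

def probability_summation(p_auditory, p_visual):
    """Two independent chances to respond correctly: the observer is
    correct if either the auditory or the visual cue alone would have
    sufficed."""
    return 1.0 - (1.0 - p_auditory) * (1.0 - p_visual)

def optimal_sigma(sigma_a, sigma_v):
    """Reliability-weighted (maximum-likelihood) cue combination: the
    combined estimate has variance
    sigma_av^2 = sigma_a^2 * sigma_v^2 / (sigma_a^2 + sigma_v^2),
    which is always smaller than the variance of either cue alone."""
    return np.sqrt((sigma_a**2 * sigma_v**2) / (sigma_a**2 + sigma_v**2))
```

For example, unimodal hit rates of 0.6 and 0.7 yield a probability-summation prediction of 0.88, while an auditory cue with twice the noise of the visual cue still sharpens the optimal combined estimate below the visual-only level; finding bimodal performance above the summation prediction is the usual signature of genuine integration.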
It has been argued that statistically optimal integration of multisensory cues relies on neural computations occurring in early sensory cortex (e.g., MT/MST), rather than in higher-level areas (Ma et al., 2006; Beck et al., 2008; Bizley et al., 2016). In contrast, when multisensory integration is the result of probability summation, it may rely on higher-order areas (e.g., prefrontal or posterior parietal cortex, Alais and Burr, 2004; Bizley et al., 2016).
Audiovisual Motion Integration in the Primate Cerebral Cortex
Human imaging studies and monkey electrophysiological/anatomical studies have suggested several candidate cortical regions for the integration of audiovisual motion. The human superior temporal sulcus is typically activated by moving audiovisual stimuli (Lewis et al., 2000; Baumann and Greenlee, 2007; von Saldern and Noppeney, 2013). This region likely corresponds to the superior temporal polysensory (STP) area of macaques (Bruce et al., 1981), and the presence of multisensory neurons in STP is well known (Bruce et al., 1981; Hikosaka et al., 1988; Watanabe and Iwai, 1991). STP is typically associated with processing more complex visual and auditory signals, such as faces and speech (Beauchamp, 2005) and biological motion (Oram and Perrett, 1994; Barraclough et al., 2005), especially in complex tasks (Meyer et al., 2011; Wuerger et al., 2012), but there is evidence of subregional specializations (Padberg et al., 2003).
The posterior parietal cortex may also be important for audiovisual motion integration, as areas in this region have been found to be active during audiovisual stimulation in humans (Baumann and Greenlee, 2007; Wuerger et al., 2012), and it is thought to play a key role in coordinating multisensory integration (Brang et al., 2013). Cells in the ventral intraparietal area (VIP) are known to respond to both visual motion (Cook and Maunsell, 2002; Kaminiarz et al., 2014) and auditory stimuli (Bremmer et al., 2001; Schlack et al., 2005). The lateral intraparietal area (LIP) has been demonstrated to be involved in the integration of visual motion signals over time to form perceptual decisions (Roitman and Shadlen, 2002), and also responds to auditory stimulation (Grunewald et al., 1999; Linden et al., 1999). Therefore, it is possible that LIP could integrate information from both senses by performing similar computations.
Integration could also occur at the level of the prefrontal cortex (PFC), as regions in the dorsolateral PFC (areas 8a, 45 and 46) are known to receive inputs from MT and MST (Lewis and Van Essen, 2000; Reser et al., 2013) as well as caudal auditory cortex (Romanski et al., 1999a,b). Furthermore, direction selective responses to visual motion have been demonstrated in this region (Zaksas and Pasternak, 2006), and like LIP, PFC neurons show activity that is consistent with accumulating sensory evidence to form perceptual decisions (Kim and Shadlen, 1999). Cells in the ventrolateral subdivision of the PFC, such as area 12, have been shown to integrate audiovisual cues, but like STP, are generally associated with higher level sensory processing, responding to individual faces and calls (Romanski, 2007, 2012). However, human imaging studies of audiovisual motion have generally not reported comparable activation in the PFC (Lewis et al., 2000; Baumann and Greenlee, 2007; von Saldern and Noppeney, 2013), although audiovisual biological motion can modulate activity in premotor areas (areas 6R and 44) when there is a mismatch between the auditory and visual cues (Wuerger et al., 2012).
A number of imaging studies have also found that audiovisual stimulation produces distinct activation (compared to visual only stimulation) in hMT+ (Alink et al., 2008; Lewis and Noppeney, 2010; Strnad et al., 2013; von Saldern and Noppeney, 2013), suggesting that auditory stimuli can modulate visually evoked responses (although this is not always the case, e.g., Wuerger et al., 2012). These regions receive sparse inputs from auditory cortex (Palmer and Rosa, 2006), and show evoked potentials in response to auditory stimuli (Chaplin et al., 2018; Kafaligonul et al., 2018). Additionally, auditory motion has been shown to affect various aspects of visual perception, such as improving visual motion detection (Kim et al., 2012), improving learning in visual motion tasks (Seitz et al., 2006), and inducing visual illusions (Sekuler et al., 1997; Meyer and Wuerger, 2001; Kitagawa and Ichihara, 2002; Beer and Röder, 2004; Soto-Faraco et al., 2005; Freeman and Driver, 2008; Alink et al., 2012a; Kafaligonul and Stoner, 2012; Kafaligonul and Oluk, 2015). Altogether, these studies suggest that auditory stimuli, especially when moving, could modulate responses to visual stimuli in MT/MST.
To specifically test this hypothesis, we have investigated whether auditory motion cues are integrated with visual motion cues in MT/MST, by recording spiking activity and characterizing the ability of neurons to encode the direction of motion, using ideal observer analysis (Chaplin et al., 2018). We presented random dot patterns that moved either leftwards or rightwards, and manipulated the strength of the visual motion signal by reducing the coherence of the dots (i.e., making some proportion of the dots move in random directions). Reducing motion coherence reduces both the psychophysical performance of observers (i.e., makes it more difficult to discriminate the directions of motion) and the neurometric performance of single neurons (i.e., reduces the neuronal information; Newsome et al., 1989). We hypothesized that the addition of an auditory stimulus that moved in the same direction as the visual stimulus would increase the information carried by single neurons and therefore increase neurometric performance, just as it can increase psychophysical performance in humans (Meyer and Wuerger, 2001; Kim et al., 2012). In particular, we predicted that auditory cues would most likely be integrated at low motion coherence levels, in line with Bayesian models of multisensory integration (Ernst and Banks, 2002; Ma et al., 2006; Gu et al., 2008). However, we found no evidence of spike rate modulations (Figure 2A) or improvements in neurometric performance (Figure 2B) due to the auditory stimulus, in MT or MST. It may be the case that the audiovisual responses observed in hMT+ are the result of task related signals (Alink et al., 2012b; Bizley et al., 2016; Kayser et al., 2017), such as the binding of the two modalities to form a unified percept (Nahorna et al., 2012, 2015; Bizley and Cohen, 2013), attentional effects (Beer and Röder, 2004, 2005; Lakatos et al., 2008), or choice-related signals from the decision making process (Cumming and Nienborg, 2016).
Figure 2. (A) Responses of a marmoset MT neuron to visual, auditory and audiovisual stimuli. The raster plots (black dots) and spike rate functions (colored lines) show a clear response to visual but not auditory stimuli (blue vs. green lines). The combination of auditory and visual stimuli (red line) was not significantly different to the visual only response (blue vs. red lines). (B) Neurometric performance (measured as the area under the receiver operating characteristic (ROC) curve, Britten et al., 1992, which corresponds to the performance of an ideal observer discriminating the direction of motion using the spiking activity of the neuron) of a marmoset MT neuron when discriminating leftwards and rightwards motion under visual (blue) and audiovisual (red) conditions at different levels of motion coherence (strength of motion signal). The addition of the auditory stimulus did not shift the neurometric curve to the left as would be expected if the neuron was integrating the auditory motion cue (adapted from Chaplin et al., 2018).
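The ROC-based neurometric measure (Britten et al., 1992) can be sketched in a few lines. The spike counts below are made-up illustrative values; the pairwise-comparison form used here is the standard equivalence between the area under the ROC curve and the percent correct of an ideal observer discriminating the two directions from single-trial spike counts.

```python
import numpy as np

def roc_area(pref_counts, null_counts):
    """Area under the ROC curve for discriminating two directions of
    motion from spike counts: the probability that a randomly drawn
    count from preferred-direction trials exceeds one from
    null-direction trials (ties count 0.5)."""
    pref = np.asarray(pref_counts, dtype=float)[:, None]
    null = np.asarray(null_counts, dtype=float)[None, :]
    greater = np.mean(pref > null)   # fraction of trial pairs won
    ties = np.mean(pref == null)     # tied pairs split 50/50
    return greater + 0.5 * ties

# Illustrative spike counts on preferred- vs. null-direction trials
pref_trials = [12, 15, 9, 14, 11]
null_trials = [4, 7, 6, 5, 8]
```

An area of 0.5 corresponds to chance discrimination (completely overlapping count distributions) and 1.0 to perfect discrimination; plotting this value against motion coherence yields a neurometric curve like those in Figure 2B.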
Only one other study has investigated the effects of auditory stimuli on the responses of MT neurons (Kafaligonul et al., 2018). This study aimed to test if the activity of MT neurons mediated the temporal ventriloquist illusion, in which stationary auditory clicks influence the perception of visual speed. The authors hypothesized that the auditory clicks would alter the speed tuning and response duration of MT neurons in response to apparent visual motion. However, the auditory stimuli did not alter speed tuning or response duration in a way that would support the perception of the illusion, even though there was a possible modulation of the temporal spiking response. Therefore, electrophysiological studies in monkeys so far suggest that auditory stimuli do not influence visual motion perception through changes in the activity of MT/MST neurons. However, since the projections from auditory to visual cortex are known to arrive at the peripheral representation of the visual field (Palmer and Rosa, 2006; Majka et al., 2018), it is possible that the role of auditory inputs is to facilitate the detection and localization of visual features, especially for orienting (Perrott et al., 1993; Wang et al., 2008).
In conclusion, the processing of auditory and visual motion in the primate cerebral cortex utilizes different brain areas and physiological mechanisms. While good progress has been made in identifying the cortical regions involved in processing auditory and audiovisual motion, the mechanisms of audiovisual integration remain unclear. The current evidence from single neuron studies suggests that the integration of auditory and visual motion cues is not mediated by the early visual areas MT and MST, and therefore such integration likely occurs in higher level cortical areas. Another possibility is that the integration of audiovisual motion signals is not mediated by a single brain region, but instead by synchronized network activity (Lewis and Noppeney, 2010).
TC wrote the first draft of the manuscript. MR and LL wrote sections of the manuscript. All authors contributed to manuscript revision, read and approved the submitted version.
This project was funded by the Australian Research Council (DE130100493 to LL; CE140100007 to MR) and by the National Health and Medical Research Council of Australia (APP1066232 to LL, APP1083152 to MR and APP1159764 to TC). TC was funded by an Australian Postgraduate Award and a Monash University Faculty of Medicine Bridging Postdoctoral Fellowship.
Conflict of Interest Statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
We thank Ramesh Rajan for advice on the manuscript and Merav Ahissar for giving permission for Figure 1B.
Ahissar, M., Ahissar, E., Bergman, H., and Vaadia, E. (1992). Encoding of sound-source location and movement: activity of single neurons and interactions between adjacent neurons in the monkey auditory cortex. J. Neurophysiol. 67, 203–215. doi: 10.1152/jn.1992.67.1.203
Alink, A., Euler, F., Kriegeskorte, N., Singer, W., and Kohler, A. (2012b). Auditory motion direction encoding in auditory cortex and high-level visual cortex. Hum. Brain Mapp. 33, 969–978. doi: 10.1002/hbm.21263
Alink, A., Singer, W., and Muckli, L. (2008). Capture of auditory motion by vision is represented by an activation shift from auditory to visual motion cortex. J. Neurosci. 28, 2690–2697. doi: 10.1523/jneurosci.2980-07.2008
Allman, J. M., and Kaas, J. H. (1971). A representation of the visual field in the caudal third of the middle temporal gyrus of the owl monkey (Aotus trivirgatus). Brain Res. 31, 85–105. doi: 10.1016/0006-8993(71)90635-4
Baker, J. F., Petersen, S. E., Newsome, W. T., and Allman, J. M. (1981). Visual response properties of neurons in four extrastriate visual areas of the owl monkey (Aotus trivirgatus): a quantitative comparison of medial, dorsomedial, dorsolateral and middle temporal areas. J. Neurophysiol. 45, 397–416.
Barraclough, N. E., Xiao, D., Baker, C. I., Oram, M. W., and Perrett, D. I. (2005). Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions. J. Cogn. Neurosci. 17, 377–391. doi: 10.1162/0898929053279586
Beck, J. M., Ma, W. J., Kiani, R., Hanks, T., Churchland, A. K., Roitman, J. D., et al. (2008). Probabilistic population codes for Bayesian decision making. Neuron 60, 1142–1152. doi: 10.1016/j.neuron.2008.09.021
Beer, A. L., and Röder, B. (2005). Attending to visual or auditory motion affects perception within and across modalities: an event-related potential study. Eur. J. Neurosci. 21, 1116–1130. doi: 10.1111/j.1460-9568.2005.03927.x
Brang, D., Taich, Z. J., Hillyard, S. A., Grabowecky, M., and Ramachandran, V. S. (2013). Parietal connectivity mediates multisensory facilitation. Neuroimage 78, 396–401. doi: 10.1016/j.neuroimage.2013.04.047
Bremmer, F., Schlack, A., Shah, N. J., Zafiris, O., Kubischik, M., Hoffmann, K., et al. (2001). Polymodal motion processing in posterior parietal and premotor cortex: a human fMRI study strongly implies equivalencies between humans and monkeys. Neuron 29, 287–296. doi: 10.1016/s0896-6273(01)00198-2
Britten, K. H., Shadlen, M. N., Newsome, W. T., and Movshon, J. A. (1992). The analysis of visual motion: a comparison of neuronal and psychophysical performance. J. Neurosci. 12, 4745–4765. doi: 10.1523/jneurosci.12-12-04745.1992
Bruce, C., Desimone, R., and Gross, C. G. (1981). Visual properties of neurons in a polysensory area in superior temporal sulcus of the macaque. J. Neurophysiol. 46, 369–384.
Celebrini, S., and Newsome, W. T. (1994). Neuronal and psychophysical sensitivity to motion signals in extrastriate area MST of the macaque monkey. J. Neurosci. 14, 4109–4124. doi: 10.1523/jneurosci.14-07-04109.1994
Celebrini, S., and Newsome, W. T. (1995). Microstimulation of extrastriate area MST influences performance on a direction discrimination task. J. Neurophysiol. 73, 437–448.
Chaplin, T. A., Allitt, B. J., Hagan, M. A., Price, N. S. C., Rajan, R., Rosa, M. G. P., et al. (2017). Sensitivity of neurons in the middle temporal area of marmoset monkeys to random dot motion. J. Neurophysiol. 118, 1567–1580. doi: 10.1101/104117
Chaplin, T. A., Allitt, B. J., Hagan, M. A., Rosa, M. G. P., Rajan, R., and Lui, L. L. (2018). Auditory motion does not modulate spiking activity in the middle temporal and medial superior temporal visual areas. Eur. J. Neurosci. 48, 2013–2029. doi: 10.1111/ejn.14071
Cohen, M. R., and Newsome, W. T. (2009). Estimates of the contribution of single neurons to perception depend on timescale and noise correlation. J. Neurosci. 29, 6635–6648. doi: 10.1523/jneurosci.5179-08.2009
Dubner, R., and Zeki, S. M. (1971). Response properties and receptive fields of cells in an anatomically defined region of the superior temporal sulcus in the monkey. Brain Res. 35, 528–532. doi: 10.1016/0006-8993(71)90494-x
Duffy, C. J., and Wurtz, R. H. (1991). Sensitivity of MST neurons to optic flow stimuli. II. Mechanisms of response selectivity revealed by small-field stimuli. J. Neurophysiol. 65, 1346–1359.
Elston, G. N., and Rosa, M. G. (1997). The occipitoparietal pathway of the macaque monkey: comparison of pyramidal cell morphology in layer III of functionally related cortical visual areas. Cereb. Cortex 7, 432–452. doi: 10.1093/cercor/7.5.432
Fetsch, C. R., Turner, A. H., DeAngelis, G. C., and Angelaki, D. E. (2009). Dynamic reweighting of visual and vestibular cues during self-motion perception. J. Neurosci. 29, 15601–15612. doi: 10.1523/JNEUROSCI.2574-09.2009
Griffiths, T. D., Green, G. G., Rees, A., and Rees, G. (2000). Human brain areas involved in the analysis of auditory movement. Hum. Brain Mapp. 9, 72–80. doi: 10.1002/(sici)1097-0193(200002)9:2<72::aid-hbm2>3.0.co;2-9
Grunewald, A., Linden, J. F., and Andersen, R. A. (1999). Responses to auditory stimuli in macaque lateral intraparietal area I. Effects of training. J. Neurophysiol. 82, 330–342.
Hackett, T. A., Stepniewska, I., and Kaas, J. H. (1998). Subdivisions of auditory cortex and ipsilateral cortical connections of the parabelt auditory cortex in macaque monkeys. J. Comp. Neurol. 394, 475–495. doi: 10.1002/(sici)1096-9861(19980518)394:4<475::aid-cne6>3.0.co;2-z
Hikosaka, K., Iwai, E., Saito, H., and Tanaka, K. (1988). Polysensory properties of neurons in the anterior bank of the caudal superior temporal sulcus of the macaque monkey. J. Neurophysiol. 60, 1615–1637.
Ingham, N. J., Hart, H. C., and McAlpine, D. (2001). Spatial receptive fields of inferior colliculus neurons to auditory apparent motion in free field. J. Neurophysiol. 85, 23–33. doi: 10.1152/jn.2001.85.1.23
Irvine, D. R., Rajan, R., and Aitkin, L. M. (1996). Sensitivity to interaural intensity differences of neurons in primary auditory cortex of the cat. I. Types of sensitivity and effects of variations in sound pressure level. J. Neurophysiol. 75, 75–96.
Kafaligonul, H., Albright, T. D., and Stoner, G. R. (2018). Auditory modulation of spiking activity and local field potentials in area MT does not appear to underlie an audiovisual temporal illusion. J. Neurophysiol. 120, 1340–1355. doi: 10.1152/jn.00835.2017
Kaminiarz, A., Schlack, A., Hoffmann, K.-P., Lappe, M., and Bremmer, F. (2014). Visual selectivity for heading in the macaque ventral intraparietal area. J. Neurophysiol. 112, 2470–2480. doi: 10.1152/jn.00410.2014
Kayser, S. J., Philiastides, M. G., and Kayser, C. (2017). Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations. Neuroimage 148, 31–41. doi: 10.1016/j.neuroimage.2017.01.010
Komatsu, H., and Wurtz, R. H. (1988a). Relation of cortical areas MT and MST to pursuit eye movements. I. Localization and visual properties of neurons. J. Neurophysiol. 60, 580–603.
Komatsu, H., and Wurtz, R. H. (1988b). Relation of cortical areas MT and MST to pursuit eye movements. III. Interaction with full-field visual stimulation. J. Neurophysiol. 60, 621–644.
Krumbholz, K., Hewson-Stoate, N., and Schönwiesner, M. (2007). Cortical response to auditory motion suggests an asymmetry in the reliance on inter-hemispheric connections between the left and right auditory cortices. J. Neurophysiol. 97, 1649–1655. doi: 10.1152/jn.00560.2006
Krumbholz, K., Schönwiesner, M., Rübsamen, R., Zilles, K., Fink, G. R., and Von Cramon, D. Y. (2005). Hierarchical processing of sound location and motion in the human brainstem and planum temporale. Eur. J. Neurosci. 21, 230–238. doi: 10.1111/j.1460-9568.2004.03836.x
Kusmierek, P., and Rauschecker, J. P. (2014). Selectivity for space and time in early areas of the auditory dorsal stream in the rhesus monkey. J. Neurophysiol. 111, 1671–1685. doi: 10.1152/jn.00436.2013
Lakatos, P., Karmos, G., Mehta, A. D., Ulbert, I., and Schroeder, C. E. (2008). Entrainment of neuronal oscillations as a mechanism of attentional selection. Science 320, 110–113. doi: 10.1126/science.1154735
Lewald, J., Peters, S., Corballis, M. C., and Hausmann, M. (2009). Perception of stationary and moving sound following unilateral cortectomy. Neuropsychologia 47, 962–971. doi: 10.1016/j.neuropsychologia.2008.10.016
Lewis, R., and Noppeney, U. (2010). Audiovisual synchrony improves motion discrimination via enhanced connectivity between early visual and auditory areas. J. Neurosci. 30, 12329–12339. doi: 10.1523/JNEUROSCI.5745-09.2010
Lewis, J. W., and Van Essen, D. C. (2000). Corticocortical connections of visual, sensorimotor and multimodal processing areas in the parietal lobe of the macaque monkey. J. Comp. Neurol. 428, 112–137. doi: 10.1002/1096-9861(20001204)428:1<112::aid-cne8>3.0.co;2-9
Linden, J. F., Grunewald, A., and Andersen, R. A. (1999). Responses to auditory stimuli in macaque lateral intraparietal area. II. Behavioral modulation. J. Neurophysiol. 82, 343–358.
Lui, L. L., Bourne, J. A., and Rosa, M. G. P. (2005). Single-unit responses to kinetic stimuli in New World monkey area V2: physiological characteristics of cue-invariant neurones. Exp. Brain Res. 162, 100–108. doi: 10.1007/s00221-004-2113-9
Lui, L. L., Bourne, J. A., and Rosa, M. G. P. (2006). Functional response properties of neurons in the dorsomedial visual area of New World monkeys (Callithrix jacchus). Cereb. Cortex 16, 162–177. doi: 10.1093/cercor/bhi094
Lui, L. L., Mokri, Y., Reser, D. H., Rosa, M. G. P., and Rajan, R. (2015). Responses of neurons in the marmoset primary auditory cortex to interaural level differences: comparison of pure tones and vocalizations. Front. Neurosci. 9:132. doi: 10.3389/fnins.2015.00132
Majka, P., Rosa, M. G. P., Bai, S., Chan, J. M., Huo, B.-X., Jermakow, N., et al. (2018). Unidirectional monosynaptic connections from auditory areas to the primary visual cortex in the marmoset monkey. Brain Struct. Funct. doi: 10.1007/s00429-018-1764-4 [Epub ahead of print].
Maunsell, J. H. R., and Van Essen, D. C. (1983a). Functional properties of neurons in middle temporal visual area of the macaque monkey. I. Selectivity for stimulus direction, speed, and orientation. J. Neurophysiol. 49, 1127–1147.
Maunsell, J. H. R., and Van Essen, D. C. (1983b). Functional properties of neurons in middle temporal visual area of the macaque monkey. II. Binocular interactions and sensitivity to binocular disparity. J. Neurophysiol. 49, 1148–1167.
McAlpine, D., Jiang, D., Shackleton, T. M., and Palmer, A. R. (2000). Responses of neurons in the inferior colliculus to dynamic interaural phase cues: evidence for a mechanism of binaural adaptation. J. Neurophysiol. 83, 1356–1365. doi: 10.1152/jn.2000.83.3.1356
Meyer, G. F., Greenlee, M., and Wuerger, S. M. (2011). Interactions between auditory and visual semantic stimulus classes: evidence for common processing networks for speech and body actions. J. Cogn. Neurosci. 23, 2291–2308. doi: 10.1162/jocn.2010.21593
Miller, L. M., and Recanzone, G. H. (2009). Populations of auditory cortical neurons can accurately encode acoustic space across stimulus intensity. Proc. Natl. Acad. Sci. U S A 106, 5931–5935. doi: 10.1073/pnas.0901023106
Mineault, P. J., Khawaja, F. A., Butts, D. A., and Pack, C. C. (2012). Hierarchical processing of complex motion along the primate dorsal visual pathway. Proc. Natl. Acad. Sci. U S A 109, E972–E980. doi: 10.1073/pnas.1115685109
Mokri, Y., Worland, K., Ford, M., and Rajan, R. (2015). Effect of background noise on neuronal coding of interaural level difference cues in rat inferior colliculus. Eur. J. Neurosci. 42, 1685–1704. doi: 10.1111/ejn.12914
Nahorna, O., Berthommier, F., and Schwartz, J.-L. (2015). Audio-visual speech scene analysis: characterization of the dynamics of unbinding and rebinding the McGurk effect. J. Acoust. Soc. Am. 137, 362–377. doi: 10.1121/1.4904536
Newsome, W. T., and Paré, E. B. (1988). A selective impairment of motion perception following lesions of the middle temporal visual area (MT). J. Neurosci. 8, 2201–2211. doi: 10.1523/jneurosci.08-06-02201.1988
Nichols, M. J., and Newsome, W. T. (2002). Middle temporal visual area microstimulation influences veridical judgments of motion direction. J. Neurosci. 22, 9530–9540. doi: 10.1523/jneurosci.22-21-09530.2002
Orban, G. A., Kennedy, H., and Bullier, J. (1986). Velocity sensitivity and direction selectivity of neurons in areas V1 and V2 of the monkey: influence of eccentricity. J. Neurophysiol. 56, 462–480.
Orban, G. A., Saunders, R. C., and Vandenbussche, E. (1995). Lesions of the superior temporal cortical motion areas impair speed discrimination in the macaque monkey. Eur. J. Neurosci. 7, 2261–2276. doi: 10.1111/j.1460-9568.1995.tb00647.x
Padberg, J., Seltzer, B., and Cusick, C. G. (2003). Architectonics and cortical connections of the upper bank of the superior temporal sulcus in the rhesus monkey: an analysis in the tangential plane. J. Comp. Neurol. 467, 418–434. doi: 10.1002/cne.10932
Palmer, S. M., and Rosa, M. G. P. (2006). A distinct anatomical network of cortical areas for analysis of motion in far peripheral vision. Eur. J. Neurosci. 24, 2389–2405. doi: 10.1111/j.1460-9568.2006.05113.x
Pavani, F., Macaluso, E., Warren, J., Driver, J., and Griffiths, T. (2002). A common cortical substrate activated by horizontal and vertical sound movement in the human brain. Curr. Biol. 12, 1584–1590. doi: 10.1016/s0960-9822(02)01143-0
Poirier, C., Baumann, S., Dheerendra, P., Joly, O., Hunter, D., Balezeau, F., et al. (2017). Auditory motion-specific mechanisms in the primate brain. PLoS Biol. 15:e2001379. doi: 10.1371/journal.pbio.2001379
Poirier, C., Collignon, O., DeVolder, A. G., Renier, L., Vanlierde, A., Tranduy, D., et al. (2005). Specific activation of the V5 brain area by auditory motion processing: an fMRI study. Cogn. Brain Res. 25, 650–658. doi: 10.1016/j.cogbrainres.2005.08.015
Rajan, R., Aitkin, L. M., and Irvine, D. R. (1990a). Azimuthal sensitivity of neurons in primary auditory cortex of cats. II. Organization along frequency-band strips. J. Neurophysiol. 64, 888–902.
Rajan, R., Aitkin, L. M., Irvine, D. R., and McKay, J. (1990b). Azimuthal sensitivity of neurons in primary auditory cortex of cats. I. Types of sensitivity and the effects of variations in stimulus parameters. J. Neurophysiol. 64, 872–887.
Recanzone, G. H., Guard, D. C., Phan, M. L., and Su, T. K. (2000). Correlation between the activity of single auditory cortical neurons and sound-localization behavior in the macaque monkey. J. Neurophysiol. 83, 2723–2739. doi: 10.1152/jn.2000.83.5.2723
Reser, D. H., Burman, K. J., Yu, H.-H., Chaplin, T. A., Richardson, K. E., Worthy, K. H., et al. (2013). Contrasting patterns of cortical input to architectural subdivisions of the area 8 complex: a retrograde tracing study in marmoset monkeys. Cereb. Cortex 23, 1901–1922. doi: 10.1093/cercor/bhs177
Roitman, J. D., and Shadlen, M. N. (2002). Response of neurons in the lateral intraparietal area during a combined visual discrimination reaction time task. J. Neurosci. 22, 9475–9489. doi: 10.1523/jneurosci.22-21-09475.2002
Romanski, L. M. (2012). Integration of faces and vocalizations in ventral prefrontal cortex: implications for the evolution of audiovisual speech. Proc. Natl. Acad. Sci. U S A 109, 10717–10724. doi: 10.1073/pnas.1204335109
Romanski, L. M., Bates, J. F., and Goldman-Rakic, P. S. (1999a). Auditory belt and parabelt projections to the prefrontal cortex in the rhesus monkey. J. Comp. Neurol. 403, 141–157. doi: 10.1002/(sici)1096-9861(19990111)403:2<141::aid-cne1>3.0.co;2-v
Romanski, L. M., Tian, B., Fritz, J., Mishkin, M., Goldman-Rakic, P. S., and Rauschecker, J. P. (1999b). Dual streams of auditory afferents target multiple domains in the primate prefrontal cortex. Nat. Neurosci. 2, 1131–1136. doi: 10.1038/16056
Rudolph, K., and Pasternak, T. (1999). Transient and permanent deficits in motion perception after lesions of cortical areas MT and MST in the macaque monkey. Cereb. Cortex 9, 90–100. doi: 10.1093/cercor/9.1.90
Saito, H. A., Yukie, M., Tanaka, K., Hikosaka, K., Fukada, Y., and Iwai, E. (1986). Integration of direction signals of image motion in the superior temporal sulcus of the macaque monkey. J. Neurosci. 6, 145–157. doi: 10.1523/jneurosci.06-01-00145.1986
Schlack, A., Sterbing-D’Angelo, S. J., Hartung, K., Hoffmann, K.-P., and Bremmer, F. (2005). Multisensory space representations in the macaque ventral intraparietal area. J. Neurosci. 25, 4616–4625. doi: 10.1523/JNEUROSCI.0455-05.2005
Semple, M. N., and Kitzes, L. M. (1993a). Binaural processing of sound pressure level in cat primary auditory cortex: evidence for a representation based on absolute levels rather than interaural level differences. J. Neurophysiol. 69, 449–461.
Semple, M. N., and Kitzes, L. M. (1993b). Focal selectivity for binaural sound pressure level in cat primary auditory cortex: two-way intensity network tuning. J. Neurophysiol. 69, 462–473.
Smith, K. R., Saberi, K., and Hickok, G. (2007). An event-related fMRI study of auditory motion perception: no evidence for a specialized cortical system. Brain Res. 1150, 94–99. doi: 10.1016/j.brainres.2007.03.003
Strnad, L., Peelen, M. V., Bedny, M., and Caramazza, A. (2013). Multivoxel pattern analysis reveals auditory motion information in MT+ of both congenitally blind and sighted individuals. PLoS One 8:e63198. doi: 10.1371/journal.pone.0063198
Tanaka, K., and Saito, H. A. (1989). Analysis of motion of the visual field by direction, expansion/contraction, and rotation cells clustered in the dorsal part of the medial superior temporal area of the macaque monkey. J. Neurophysiol. 62, 626–641.
Thaler, L., Paciocco, J., Daley, M., Lesniak, G. D., Purcell, D. W., Fraser, J. A., et al. (2016). A selective impairment of perception of sound motion direction in peripheral space: a case study. Neuropsychologia 80, 79–89. doi: 10.1016/j.neuropsychologia.2015.11.008
von Saldern, S., and Noppeney, U. (2013). Sensory and striatal areas integrate auditory and visual signals into behavioral benefits during motion discrimination. J. Neurosci. 33, 8841–8849. doi: 10.1523/jneurosci.3020-12.2013
Wang, Y., Celebrini, S., Trotter, Y., and Barone, P. (2008). Visuo-auditory interactions in the primary visual cortex of the behaving monkey: electrophysiological evidence. BMC Neurosci. 9:79. doi: 10.1186/1471-2202-9-79
Watanabe, J., and Iwai, E. (1991). Neuronal activity in visual, auditory and polysensory areas in the monkey temporal cortex during visual fixation task. Brain Res. Bull. 26, 583–592. doi: 10.1016/0361-9230(91)90099-6
Wilson, W. W., and O’Neill, W. E. (1998). Auditory motion induces directionally dependent receptive field shifts in inferior colliculus neurons. J. Neurophysiol. 79, 2040–2062.
Woods, T. M., Lopez, S. E., Long, J. H., Rahman, J. E., and Recanzone, G. H. (2006). Effects of stimulus azimuth and intensity on the single-neuron activity in the auditory cortex of the alert macaque monkey. J. Neurophysiol. 96, 3323–3337. doi: 10.1152/jn.00392.2006
Wuerger, S. M., Parkes, L., Lewis, P. A., Crocker-Buque, A., Rutschmann, R., and Meyer, G. F. (2012). Premotor cortex is sensitive to auditory-visual congruence for biological motion. J. Cogn. Neurosci. 24, 575–587. doi: 10.1162/jocn_a_00173
Yu, H.-H., and Rosa, M. G. P. (2014). Uniformity and diversity of response properties of neurons in the primary visual cortex: selectivity for orientation, direction of motion and stimulus size from center to far periphery. Vis. Neurosci. 31, 85–98. doi: 10.1017/s0952523813000448
Yu, H.-H., Verma, R., Yang, Y., Tibballs, H. A., Lui, L. L., Reser, D. H., et al. (2010). Spatial and temporal frequency tuning in striate cortex: functional uniformity and specializations related to receptive field eccentricity. Eur. J. Neurosci. 31, 1043–1062. doi: 10.1111/j.1460-9568.2010.07118.x
Zaksas, D., and Pasternak, T. (2006). Directional signals in the prefrontal cortex and in area MT during a working memory for visual motion task. J. Neurosci. 26, 11726–11742. doi: 10.1523/jneurosci.3420-06.2006
Zeki, S. M., Watson, J. D., Lueck, C. J., Friston, K. J., Kennard, C., and Frackowiak, R. S. (1991). A direct demonstration of functional specialization in human visual cortex. J. Neurosci. 11, 641–649. doi: 10.1523/jneurosci.11-03-00641.1991
Keywords: visual motion, auditory motion, audiovisual integration, primates, cerebral cortex
Citation: Chaplin TA, Rosa MGP and Lui LL (2018) Auditory and Visual Motion Processing and Integration in the Primate Cerebral Cortex. Front. Neural Circuits 12:93. doi: 10.3389/fncir.2018.00093
Received: 03 August 2018; Accepted: 08 October 2018;
Published: 26 October 2018.
Edited by: Greg Stuart, Australian National University, Australia
Reviewed by: Sophie Wuerger, University of Liverpool, United Kingdom
Hulusi Kafaligonul, Bilkent University, Turkey
Copyright © 2018 Chaplin, Rosa and Lui. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.