
REVIEW article

Front. Integr. Neurosci., 30 September 2009
This article is part of the Research Topic Sensory processing disorder (SPD).

Postnatal experiences influence how the brain integrates information from different senses

Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, NC, USA
Sensory processing disorder (SPD) is characterized by anomalous reactions to, and integration of, sensory cues. Although the underlying etiology of SPD is unknown, one brain region likely to reflect these sensory and behavioral anomalies is the superior colliculus (SC), a structure involved in the synthesis of information from multiple sensory modalities and the control of overt orientation responses. In the present review we describe normal functional properties of this structure, the manner in which its individual neurons integrate cues from different senses, and the overt SC-mediated behaviors that are believed to manifest this “multisensory integration.” Of particular interest here is how SC neurons develop their capacity to engage in multisensory integration during early postnatal life as a consequence of early sensory experience, and the intimate communication between cortex and the midbrain that makes this developmental process possible.
One of the primary manifestations of sensory processing disorder (SPD) is an abnormal response to sensory cues, often expressed as hypo-responsiveness or hyper-responsiveness. Especially problematic is the inability of children with SPD to properly deal with cues from a variety of senses in order to initiate and control appropriate behaviors. Jean Ayres, a pioneer in this field, first referred to this disorder as Sensory Integration Disorder. She believed that it was due to a disruption in the brain’s use of sensory information to organize appropriate responses during its development and maturation (see Bundy et al., 2002; Schaaf and Miller, 2005). However, it is unclear what particular level or levels of the neuraxis are affected by this disorder, requiring investigators to account for all possibilities. The many other contributions to this Special Issue will deal more extensively with what is known about these issues. It is, however, important to note that sensory integration problems similar to those found in SPD have also been noted in other developmental disorders, such as autism, wherein individuals often show abnormalities in multisensory integration, focusing extensively on visual cues while neglecting auditory signals (Lovaas et al., 1974; Berkell et al., 1996), and also in dyslexia and attention deficit disorder (Hairston et al., 2005; Tucha et al., 2006; Blau et al., 2009).
Regardless of the etiology of SPD or the particular form it takes in a given individual, what appears to be fundamentally altered in the brain is its sensory processing and sensorimotor transduction. The identification of specific genetic and/or neurological processes that underlie SPD has thus far eluded clinicians and scientists; however, one midbrain circuit involving the superior colliculus (SC) may provide a working model for better understanding the manifestations of its underlying dysfunction. This sensorimotor structure is involved in integrating sensory information and in initiating and controlling the motor responses that underlie attentive, localization, and orientation behaviors (see Stein and Meredith, 1993). Its functional integrity is very much dependent on a balance of inputs from different levels of the neuraxis, but is especially sensitive to inputs from association cortex (Stein, 2005). The maturation of its sensory properties takes place postnatally as a result of interactions with the environment (Wallace and Stein, 2007), and its function is to determine the immediate behavioral responses to the onset or change of sensory cues (see Stein and Meredith, 1993).
Of particular concern here is the postnatal process through which SC neurons develop their properties and refine their ability to use multiple senses synergistically when transforming incoming sensory information into outgoing motor commands. Understanding the postnatal elaboration of sensory and/or motor circuits may provide insights into how disruptions that take place at different maturational time points can have different consequences and may result in different neural strategies for dealing with complexes of sensory stimuli. This approach may help explain some of the striking variations in the dysfunction profiles of patients with early disorders in the processing of sensory information (see Miller et al., 2001). Using an animal model, the discussion below will focus on the developmental antecedents necessary for populations of SC neurons to develop their capabilities to integrate cues from different senses in order to initiate adaptive behaviors.

Multiple Sensory Systems

The brain has evolved to adapt information acquisition and response systems to produce rapid and accurate reactions to immediate environmental challenges. In order to facilitate these behaviors it has also created storage and retrieval systems that modify response strategies based on experience. Whether any particular prototype of the evolving brain would become a platform for further modification, or would simply be discarded along with its owner, was determined by how quickly and efficiently it could use this information. Nature is ruthless in its evaluation of innovation.
One result of these evolutionary pressures is that all brains utilize multiple sensory systems to extract information from different forms of environmental energy. By being tuned to different energy sources, each of these systems reflects a slightly different perspective of initiating environmental events. This is of particular benefit because those events or objects that cannot readily be identified along a single sensory dimension (e.g., how they look) may be identified along another (how they sound or feel) or through a combination of sensory dimensions. Combining sensory inputs is an extremely efficient technique for maximizing information utilization, and the brain employs a simple logic in using this multisensory information synergistically (Stein and Meredith, 1993 ). Indeed, the brain appears to synthesize this information in an optimal fashion (e.g., Ernst and Banks, 2002 ; see also Calvert et al., 2004 for general discussions). This process is called “multisensory integration,” one that enhances the probability of detecting the source of a signal, correctly identifying it, and, most importantly in the present context, properly reacting to it. Using its sensory machinery in this way, the brain achieves performance levels of effectiveness and efficiency that would be impossible if it were only able to use the senses independently.
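The sense in which such cross-modal synthesis can be “optimal” is made explicit in the maximum-likelihood framework used by Ernst and Banks (2002) for visual–haptic judgments; the equations below are a standard restatement of that framework (with V and H denoting the visual and haptic estimates), not formulas taken from the present review:

\[
\hat{s}_{VH} = w_V\,\hat{s}_V + w_H\,\hat{s}_H, \qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \quad w_H = 1 - w_V, \qquad
\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2} \le \min\left(\sigma_V^2, \sigma_H^2\right)
\]

Each cue is weighted by its relative reliability, and the variance of the combined estimate is never larger than that of the more reliable cue alone; this is the formal counterpart of the gains in detection, identification, and reaction described here.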
But the brain is not automatically endowed with the full capability to use its multiple sensory systems in concert. Indeed, it uses them independently during early neonatal stages of development. Although it might be capable of matching information from one to another very early in life, it cannot synthesize their information to produce a new “integrated” multisensory response (see the discussion below). How it acquires this remarkable ability during postnatal life, and the physiological and behavioral consequences that can occur when this developmental process is disrupted, will be the primary subjects of this discussion. But first it is necessary to examine the normal operational features of multisensory integration, the underlying neural circuitry that is essential for its functional integrity, and the behavioral impact of this neural process.

Multisensory Integration

Multisensory integration works by significantly modulating the physiological salience of information in the brain (Stein et al., 2004 ), and its impact on perception and behavior is well-known (Talsma et al., 2006a, b , 2007 ; King and Palmer, 1985 ; Stein et al., 1989 ; Hughes et al., 1994 ; Frens and Van Opstal, 1995 ; Corneil and Munoz, 1996 ; Wallace et al., 1996 ; Liotti et al., 1998 ; Recanzone, 1998 ; Zangaladze et al., 1999 ; Grant et al., 2000 ; Sathian, 2000 , 2005 ; Marks, 2004 ; Newell, 2004 ; Sathian and Prather, 2004 ; Schroeder and Foxe, 2004 ; Shams et al., 2004 ; Woldorff et al., 2004 ; Woods and Recanzone, 2004 ; Busse et al., 2005 ; Weisser et al., 2005 ; Ghazanfar and Schroeder, 2006 ; Lakatos et al., 2007 ). It has been shown to aid in disambiguating events, including those involving human speech and animal communication (Sumby and Pollack, 1954 ; Bernstein et al., 2004 ; Massaro, 2004 ; Partan, 2004 ; Ghazanfar et al., 2005 ; Sugihara et al., 2006 ), and to enhance and speed the detection, identification and reaction to external events (Stein et al., 1989 ; Stein and Meredith, 1993 ; Hughes et al., 1994 ; Frens and Van Opstal, 1995 ; Corneil and Munoz, 1996 ; Marks, 2004 ; Newell, 2004 ; Sathian and Prather, 2004 ; Shams et al., 2004 ; Woods and Recanzone, 2004 ). The survival benefits of such a process are obvious.
The general “rule of thumb” of how multisensory integration operates is as follows: signals from the different individual senses that are likely to be relevant to the same event are normally concordant in space and time. As such, these spatiotemporally aligned cross-modal stimuli produce markedly enhanced signals in the adult brain. These enhanced signals lead to a higher probability of detecting and reacting to the event. Cues from the different senses that are neither in temporal nor spatial register are likely to be unrelated and will fail to produce such enhancement. Indeed, under some circumstances, such cues might be treated as competitors, leading to mutual signal degradation and decreased likelihood of detecting and reacting to either cue.
Multisensory integration enriches our sensory experiences and increases the accuracy of our judgments of environmental events. It is interesting to note that the potency of combining cues from different senses increases as their individual effectiveness decreases, referred to as “inverse effectiveness” (see Stein and Meredith, 1993 ). Thus, multisensory integration is of maximal utility when task difficulty increases, either because stimuli are weak or background noise is strong. This issue is of particular relevance to those suffering with SPD.
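The magnitude of these multisensory effects is conventionally expressed as a percentage change relative to the best unisensory response (e.g., Meredith and Stein, 1986b). The short Python sketch below is purely illustrative; the response counts are invented, not data from the studies reviewed here. It shows how the same absolute gain yields a far larger proportional enhancement when the unisensory responses are weak, which is the essence of inverse effectiveness:

def enhancement_index(multisensory, best_unisensory):
    """Percent multisensory enhancement relative to the best (dominant)
    unisensory response: 100 * (CM - SMmax) / SMmax."""
    return 100.0 * (multisensory - best_unisensory) / best_unisensory

# Hypothetical mean responses (impulses per trial), for illustration only.
weak_v, weak_a, weak_va = 1.0, 1.5, 6.0           # weak, near-threshold stimuli
strong_v, strong_a, strong_va = 10.0, 12.0, 16.5  # strong, salient stimuli

print(enhancement_index(weak_va, max(weak_v, weak_a)))        # 300.0 (6.0 also exceeds the 2.5 sum: superadditive)
print(enhancement_index(strong_va, max(strong_v, strong_a)))  # 37.5 (16.5 falls below the 22.0 sum: subadditive)

In both cases the cross-modal response exceeds the best unisensory response by the same 4.5 impulses, but the proportional benefit is roughly eight times larger for the weak stimulus pair.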
Although all brains engage this strategy of sensory processing and do so at multiple levels of the neuraxis (Stein and Stanford, 2008), surprisingly little attention has been directed toward understanding how it develops in the young brain and/or the inherent plasticity in this system. Newer studies have shown that its operational principles are very sensitive to experience and that there is considerable postnatal development in these processes. Experiments in animals strongly suggest that multisensory integration is acquired postnatally and only after considerable sensory experience (Wallace et al., 1993, 2006; Wallace and Stein, 1997, 2007). As noted earlier, this capacity refers to the ability to synthesize information from different senses so that there is a new product. Very recent studies in human subjects seem to agree with this developmental profile by showing a gradual postnatal elaboration of multisensory integration capabilities (see Bremmer et al., 2008a,b; Neil et al., 2006; Putzar et al., 2007; Gori et al., 2008). This should not be taken to mean that the maturation of every brain structure with multisensory neurons has already been explored, or that no communication across the senses is possible at birth. Some multisensory processes, like cross-modal matching and the recognition of amodal stimulus properties (e.g., size, intensity, frequency) have been demonstrated in very young children and infants (e.g., see Streri and Gentaz, 2003), and some infer that they are already present to some degree at birth (e.g., for discussions see Bremmer et al., 2008a,b; Flom and Bahrick, 2007; Gibson, 1966, 1969, 1979; Werner, 1973; Bower, 1974; Marks, 1978; Meltzoff and Borton, 1979; Meltzoff, 1990; Morrongiello et al., 2003; Streri and Gentaz, 2003; Lewkowicz and Kraebel, 2004; Lickliter and Bahrick, 2004; but see also Maurer et al., 1999). However, the technical difficulty in evaluating these capabilities before the neonate is able to obtain substantial experience with cross-modal cues is obvious, and this issue also deserves continued exploration.

The Superior Colliculus: A Model of Multisensory Integration

As noted earlier, the SC is a midbrain structure that plays a significant role in attentive, localization and orientation behavior. A sudden event will engage its circuitry so that the event is detected, located in space and responded to with a shift of the eyes and head, and sometimes the entire body. The result is that the individual faces the source of that input and is in the best position to further evaluate and react to it. The SC is a layered structure and its deeper layers contain neurons responsive to visual, auditory or somatosensory (tactile) stimuli, and oftentimes to more than one sensory modality. This structure is an excellent model system for exploring the maturation of multisensory integration because of the volume of information about the maturation of its unisensory properties (Stein, 1984), because it is a primary site of sensory convergence (Stein and Meredith, 1993; Wallace et al., 1993), because of the extensive information already available about its multisensory processes in the adult (see Stein and Stanford, 2008 for a recent review), and because much of what is already known about the development and organization of multisensory integration comes from this model (Stein et al., 1973a, 1976; Stein and Arigbede, 1972; Stein and Clamann, 1981; Stein, 1984; Peck, 1987; Stein and Meredith, 1993; Stein et al., 1993; Wallace, 2004; but see also Jay and Sparks, 1987a,b; Groh and Sparks, 1996a,b; Zwiers et al., 2003; Barth and Brett-Green, 2004; Calvert and Lewis, 2004; Gutfreund and Knudsen, 2004; King et al., 2004; Sathian and Prather, 2004; Woods and Recanzone, 2004; Lakatos et al., 2007; Senkowski et al., 2007). In addition, the fact that it has a well-defined role in overt behavior makes it possible to examine the behavioral implications of its physiological properties.
Multisensory neurons have been studied most extensively in the cat, and it is often used as a source of information about multisensory integration at the level of the single neuron. It will thus be the primary model of reference below; however, relevant information from the monkey and other species will be incorporated to illustrate the generality of the observations. The functional circuits underlying SC multisensory integration and its maturation are complex, and the simple convergence of afferents onto a given neuron does not guarantee it will develop the ability to integrate cross-modal information, nor does it specify how the information is to be integrated (Jiang et al., 2001, 2006; Stein and Meredith, 1993; Wallace and Stein, 1997). A number of critical requirements must be met during postnatal life, and these will be discussed below. But first it is necessary to describe multisensory integration in its mature form, and outline the underlying circuits and computations used in this process.

Observations from Adult Animals

The visual, somatosensory, and auditory inputs to the deep SC of the cat are each represented in map-like fashion, and each map is in spatial register with the others (Stein et al., 1975 , 1976 , 1993 ; Stein and Gallagher, 1981 ; Middlebrooks and Knudsen, 1984 ; Meredith and Stein, 1990 ; Meredith et al., 1991 ). The maps are formed by a clustering of the neurons responsive to stimuli from the same region of space defined by each neuron’s receptive field. Thus in the SC, neurons responsive to visual and/or auditory stimuli that have their receptive fields in frontal space are located in the rostral aspect of the structure. They are located near somatosensory neurons whose receptive fields are on the face. Visually responsive neurons with receptive fields in eccentric space are located toward the caudal aspect of the structure and abut auditory-responsive neurons whose receptive fields are also in eccentric spatial locations as well as somatosensory-responsive neurons whose receptive fields are toward the rear of the body. These overlapping sensory maps are also in spatial register with a premotor map that connects to the brainstem and spinal cord and represents a motor plan with which to respond (orient) to a stimulus. Therefore, regardless of which sense is providing information, an object or event in a given region of space activates neurons in the SC location that will drive orientation responses toward it – a highly efficient way to match incoming information from various sensory receptor organs with outgoing “motor” signals to the effectors controlling the orientation of those organs (Wurtz and Goldberg, 1971 ; Harris, 1980 ; Wurtz and Albano, 1980 ; Stein and Clamann, 1981 ; Grantyn and Grantyn, 1982 ; Jay and Sparks, 1984 , 1987a ,b ; Sparks, 1986 ; Peck, 1987 ; Sparks and Nelson, 1987 ; Guitton and Munoz, 1991 ; Munoz and Wurtz, 1993a ,b ; Groh and Sparks, 1996a ,b ).
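As a purely schematic illustration of why this map alignment is efficient (the mapping function, dimensions, and numbers below are invented for the example and are not drawn from the article), one can treat the rostrocaudal axis of the SC as a common coordinate shared by every sensory map and by the premotor map:

def sc_site(azimuth_deg, rc_extent_mm=6.0, max_azimuth_deg=90.0):
    """Toy one-dimensional map from contralateral stimulus azimuth
    (0 deg = straight ahead) to a rostrocaudal SC position. The real
    map is two-dimensional, nonlinear, and magnifies frontal space."""
    clipped = min(max(azimuth_deg, 0.0), max_azimuth_deg)
    return rc_extent_mm * clipped / max_azimuth_deg

def orienting_command(site_mm, rc_extent_mm=6.0, max_azimuth_deg=90.0):
    """Because the premotor map shares the same coordinate, the activated
    site directly specifies the gaze shift needed to face the source."""
    return max_azimuth_deg * site_mm / rc_extent_mm

# A sight, a sound, or a touch arising roughly 30 deg to the side all
# activate the same SC site and therefore call up the same ~30 deg
# orienting command, with no modality-specific transformation downstream.
for modality in ("visual", "auditory", "somatosensory"):
    site = sc_site(30.0)
    print(modality, round(site, 2), "mm ->", round(orienting_command(site), 1), "deg")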
The spatial register among SC sensory maps is formed, in large part, by multisensory neurons that have multiple receptive fields, one for each sensory modality to which they respond. These receptive fields are in spatial register with one another (Meredith et al., 1991 , 1992 ; Meredith and Stein, 1990 ; King et al., 1996 ; see also Gaither and Stein, 1979 ; Newman and Hartline, 1981 ; Knudsen and Brainard, 1991 ; Gutfreund and Knudsen, 2004 ; Zahar et al., 2009 ), such that a visual–auditory neuron has its visual and auditory receptive fields overlapping one another in space (see Figure 1 ). This cross-modal spatial register is one of the basic determinants of how multisensory information is synthesized (Stein and Meredith, 1993 ), and the resultant modulation in SC signals is of primary interest here. Integrating signals from different senses can result in substantial increases or decreases in response magnitude (number of impulses). However, multisensory integration depends on a variety of factors, one of the most important of which is the relative location of the stimuli. When stimuli from different senses originate from roughly the same location at the same time (e.g., from the same event), they will fall within the overlapping receptive fields of a given multisensory neuron and can enhance its responses above those to either stimulus alone, and often above the sum of the two responses (again see Figure 1 ).
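The spatial principle just described can be caricatured in a few lines of code. This is a deliberately simplified sketch (one-dimensional receptive fields with invented boundaries and a purely qualitative outcome), not a model of SC physiology:

def predicted_interaction(v_loc, a_loc, v_rf, a_rf):
    """Qualitative prediction from the spatial principle of multisensory
    integration: cross-modal stimuli falling within a neuron's overlapping
    receptive fields tend to enhance its response, whereas a stimulus
    falling outside its receptive field tends to depress (or leave
    unchanged) the response to the other. Receptive fields are simplified
    to one-dimensional azimuth intervals in degrees."""
    v_in = v_rf[0] <= v_loc <= v_rf[1]
    a_in = a_rf[0] <= a_loc <= a_rf[1]
    if v_in and a_in:
        return "enhancement likely"
    if v_in or a_in:
        return "depression or no interaction"
    return "no interaction"

# Hypothetical overlapping receptive fields of a single visual-auditory neuron.
v_rf, a_rf = (10.0, 40.0), (5.0, 45.0)
print(predicted_interaction(25.0, 27.0, v_rf, a_rf))  # spatially concordant pair
print(predicted_interaction(25.0, 70.0, v_rf, a_rf))  # spatially disparate pair

Under the disparity-rearing conditions described later in this review, the receptive fields themselves become misaligned, and the same rule then favors spatially disparate stimulus pairs.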
Figure 1. A representative visual–auditory multisensory neuron. Top: visual space and visual receptive fields are shown on a hemisphere in which each concentric circle represents 10° of visual angle. The intersection of the horizontal and vertical meridians is the point directly in front of the animal. For auditory space, the full hemisphere represents frontal auditory space while the half-hemisphere represents caudal right space (for purposes of two-dimensional rendering, it has been split and folded forward). The visual stimulus was a small bar of light moved through the receptive field as indicated by the arrow, and the auditory stimulus was a broadband noise burst delivered from a speaker placed within the auditory receptive field (speaker location is depicted by the icon). Cross-modal stimuli consisted of the visual and auditory stimuli presented together. Visual and auditory receptive fields are shaded in blue and green. Bottom: Neuronal responses for visual (blue), auditory (green), and cross-modal (yellow) conditions are demonstrated with rasters and peristimulus time histograms. To the right: Bar graphs summarizing the mean response to all three conditions with the proportional enhancement observed in the cross-modal condition represented in orange.

The Importance of Cortical Influences in the Multisensory SC Circuit

It may seem surprising that the ability of a multisensory cat SC neuron to integrate its multiple sensory inputs, even when they are derived from the same place at the same time, is not a “given.” It depends on the presence of influences from cortex (see Figure 2). This is not because those cortical influences render the SC neuron multisensory; that is very rarely the case, because SC neurons receive their different sensory inputs from multiple sources. Rather, it is because those influences are specifically required for the SC neuron to engage in multisensory integration. This conclusion was based on the results of a number of studies in cat showing that functional deactivation of the inputs from areas of association cortex disrupted multisensory integration in SC neurons but not their ability to respond to cues from different senses (Jiang et al., 2001; Jiang and Stein, 2003; Stein et al., 2002; see also Burnett et al., 2007). The relevant cortical areas in the cat are the anterior ectosylvian sulcus (AES) and the adjacent area, the rostral aspect of the lateral suprasylvian sulcus (rLS). Their homologues in the primate brain have not yet been identified.
Figure 2. The multisensory enhancement of many SC neurons depends exclusively on influences from AES. Top: the visual and auditory receptive fields of this neuron are depicted by the shaded areas on the illustrations of visual and auditory space with the same stimulus conventions as in Figure 1 . Bottom: (A) Shown are the neuron’s responses to the individual modality-specific stimuli and to their co-presentation before cortical deactivation (“control”). The movement of the visual stimulus is represented by the ramp labeled “V” and the auditory stimulus by the square wave labeled “A”. Below each stimulus trace are rasters and peristimulus time histograms that illustrate the neuron’s responses. Although the neuron showed a slightly better response to the auditory than to the visual stimulus, its best response was obtained when the two stimuli were presented in combination (“VA”). As shown in the summary bar graph at the far right, the multisensory response exceeded the dominant unisensory response by 147% and also exceeded the sum of its unisensory component responses (dashed line). (B) Multisensory response enhancement was eliminated by deactivating AES, but unisensory responses were not significantly affected. (C) Multisensory response enhancement was re-established 10 min after terminating cortical deactivation and initiating rewarming. (D,E) Deactivating and reactivating the rLS had no significant effect on the responses of this neuron. Significantly enhanced multisensory responses as compared to the dominant unisensory (auditory) response, t-test: *p < 0.05; **p < 0.01. Adapted from Jiang et al. (2001 )
These association cortical areas are important in this context, as other cortical inputs to the cat SC do not appear to be specifically important for multisensory integration (Wilkinson et al., 1996 ; Jiang et al., 2007 ). Of the two, the most important are the inputs from AES. These inputs are unisensory and match the modality-convergence profile of the SC neuron they target (Wallace et al., 1992 ). In other words, a visual–auditory SC neuron that can integrate its inputs will receive converging inputs from the visual and auditory subdivisions of AES that will match visual and auditory inputs the SC neuron receives from non-AES sources (Figure 2 ).

The Development of SC Multisensory Integration is a Protracted Postnatal Process

At birth, neither the cat nor the monkey SC contains neurons capable of integrating cross-modal stimuli (Wallace and Stein, 1997, 2001). Because the SC develops earlier than does the cerebral cortex (see Stein et al., 1973a,b), it is a good bet that higher order areas of the brain are also incapable of synthesizing sensory information from different modalities at this stage, a supposition borne out by recent studies in cortex (Wallace et al., 2006; Carriere et al., 2007). Recent perceptual studies in humans also support the idea that the integrative capacity of the brain develops over time (Neil et al., 2006; Putzar et al., 2007).
Because the cat is born at a comparatively early stage of maturation, it was a good model within which to examine this process. At birth, its SC neurons appear highly immature and all of them, even those in its multisensory layers, are unisensory (Stein et al., 1973; Wallace and Stein, 1997 ). The only sensory-responsive SC neurons at this stage of life are somatosensory (Stein et al., 1973), and these neurons first appear prenatally, presumably in part, to prepare the neonate with the perioral responsiveness necessary to facilitate its quest for the nipple (Larson and Stein, 1984 ). The ear canals and the eyes are closed at birth, and auditory neurons do not begin appearing until about five postnatal days (see Stein et al., 1973), and visual neurons do not begin appearing until about three postnatal weeks (Kao et al., 1994 ). Neonatal SC neurons have very large receptive fields; they rapidly habituate to stimuli, and do not yet show the high selectivity for the stimulus parameters (e.g., velocity, direction of movement) that are characteristic in the adult. Their receptive fields shrink and their stimulus selectivity matures over the next few weeks of life. However, their multisensory development is far more protracted.
An inherent distinction between multisensory and unisensory processes should be noted here, as it is relevant to what one can expect of their development. Multisensory processes are unique in that they are integrating independent channels of information (Alvarado et al., 2007a ,b ; Ernst and Banks, 2002 ; Gingras et al., 2006 ; Rowland et al., 2007 ). Thus, while what we know about unisensory development helps define the likely period of sensory maturation, we cannot extrapolate from that the principles governing multisensory maturation. A case in point in the maturation of visual cortex neurons is their development of specialization for a “preferred” orientation. When exposed to all line orientations or complex forms, different subpopulations of visual cortical neurons ultimately “prefer” a given line orientation, with all orientations represented among the specialized neuronal subgroups. In contrast, the end-point in the maturation of multisensory integration in both SC and cortical neurons is a more generalized product. Multisensory neurons are likely to be designed to extract the statistical regularities in space and time between the component stimuli in cross-modal events so that they can establish general principles to guide responses to these events.
The first multisensory neuronal type to develop in the SC is the somatosensory–auditory neuron. These neurons do not begin appearing until approximately postnatal day 10, very soon after the deep SC is capable of responding to auditory cues. Visual–nonvisual neurons begin appearing at three postnatal weeks, as soon as the deep SC begins responding to visual cues (the purely visual overlying layers develop considerably earlier, see Stein et al., 1973; Kao et al., 1994 ). All of these new multisensory neurons lack the ability to engage in multisensory integration. They can respond to the different sensory inputs individually, but when presented with spatiotemporally concordant cross-modal stimuli, cannot integrate across them to generate enhanced responses. This capacity requires a far longer developmental time course (2 to 3+ months), during which there is considerable maturation in cortico-collicular projections (Stein and Gallagher, 1981 ; Wallace and Stein, 1997 , 2000 ; Stein et al., 2002 ).

Why is it so Protracted?

The long postnatal time course of multisensory maturation suggests that this capacity does not depend on a passive process that simply unfolds over time. Notably, this protracted developmental period is also one in which the brain is acquiring a great deal of sensory experience. Could it be that the maturation of this capacity is prolonged specifically because such experiences encourage and direct it? We believe so. By extracting the statistical regularities of early life experiences with cross-modal stimuli, the brain can adapt multisensory integration to the demands of the specific environment in which it will be used.
This, of course, requires that multisensory integration be plastic, something generally associated more with cortical than midbrain processes, and suggests that cross-modal experiences may be instantiated in the SC via the cortico-collicular pathway. Given the well-known sensitivity of cortex to sensory experience (Buonomano and Merzenich, 1998) and the comparative insensitivity of the SC to such experiences (Wickelgren and Sterling, 1969), as well as the fact that cortical influences are critical for SC multisensory integration, this seemed to be a highly likely possibility. Therefore, a multi-pronged approach was designed to explore these and other related issues by examining: (1) the effects of rearing animals without the possibility of experience with certain cross-modal cues (e.g., visual–nonvisual); (2) the effects of rearing animals while perturbing the spatial concordance of the cross-modal cues normally associated with the same event; and (3) the maturation of SC multisensory integration either in the absence of its critical cortical inputs or by deactivating (reversibly) this portion of the multisensory SC circuit so that it would not have access to the sensory experience normally acquired during early life.
Dark Rearing eliminates the possibility of obtaining early visual–nonvisual experience. It therefore affords the opportunity to examine how the absence of such experience affects the development of visual–nonvisual integration and SC function (Wallace et al., 2004b ). After rearing cats from birth to 6 months in that way, the SC was found to have developed all of its characteristic unisensory neurons (visual, auditory, somatosensory) and populations of each possible multisensory neuronal subtype. Furthermore, the relative size of each of these populations was not very different from that found in the normal SC. But, these neurons had very large receptive fields that were more like the neonatal state than the adult state (Figure 3 ). The neurons were also incapable of multisensory response enhancement. So, with respect to both their size and lack of multisensory integration capability, they seemed neonatal, as if their development had been arrested. This was consistent with several possibilities (e.g., light exposure might be essential for multisensory integration), the most compelling of which appeared to be that experience with a given combination of senses is essential for the maturation of multisensory integration capabilities. As noted earlier, this might be because the system evolved in a way that requires experience with cross-modal stimuli to adapt multisensory integration to fit the presumptive sensory environment in which it will be used. In the absence of this experience, the development does not proceed. If true, this would also mean that very different cross-modal experiences will produce very different multisensory products.
Figure 3. (A) Multisensory integration, which is present in SC neurons in animals raised in a normal illuminated environment, is significantly diminished in animals raised in complete darkness. Rasters and peristimulus time histograms show the responses of two neurons, one from a normal control and one from a dark-reared animal, to both modality-specific and cross-modal stimuli. Summary bar graphs illustrate the mean responses for each of the conditions and the magnitude of the multisensory interaction. The “sum” represents the response predicted based on the addition of the two unisensory component responses. *p < 0.05. “A,” Auditory; “V,” visual; “VA,” visual–auditory. (B) Receptive fields in dark-reared animals were significantly larger than those in control animals. Receptive fields (shading) are shown for three representative multisensory neurons in control (left) and dark-reared (right) animals. Conventions are the same as in Figure 2 . The central bar graph plots the relative sizes for the visual, auditory, and somatosensory receptive fields of multisensory neurons in the SC of normal (gray bars) and dark-reared (black bars) animals. Note that these mean measures are standardized to receptive field sizes in normal adults (100%). Values in parentheses represent the number of neurons in each group. *p < 0.05; **p < 0.01. Adapted from Wallace et al. (2004)
Spatial-Disparity Rearing is one way to test whether the basic properties of multisensory integration will adapt to the specifics of early experience. Animals were raised from birth to 6 months of age in a dark room in which visual–auditory stimuli were presented periodically. The stimuli were synchronous, as if derived from the same event, but they were spatially disparate: the speakers and lights from which they were delivered were fixed to different spots on the wall. Most SC neurons of these animals appeared to be insensitive to this rearing condition. They had large receptive fields and were unable to engage in multisensory integration (Figure 4), and looked just like those of dark-reared or neonatal animals (Wallace and Stein, 2007). However, many showed the impact of this experience by developing poorly aligned visual–auditory receptive fields, some of which were completely out of spatial register with one another as they had no areas in common (see Figure 4). This has not been seen in the SC of cats raised in a normal environment or those raised in the dark. As a result of this “anomalous” development, only those visual–auditory stimuli that were highly disparate in space could fall simultaneously in the respective receptive fields of these multisensory neurons, and when they did, they enhanced the magnitude of the neuron’s response. This represented a reversal of the normal “spatial principle” of multisensory integration in which only spatially congruent cross-modal stimuli yield response enhancement, and spatially disparate stimuli either yield response depression or have no effect on one another (see Meredith and Stein, 1996; Kadunce et al., 1997).
Figure 4. Animals reared with spatially disparate visual and auditory stimuli develop atypical receptive fields and multisensory integration. (A) Receptive field overlap in neurons from normal and spatial-disparity-reared animals. Top: A sampling of all SC quadrants (left) yielded visual (red) and auditory (green) receptive fields whose center distributions are shown (right). Bottom: The percentage of receptive field overlap in control animals (yellow) was often 91–100%, far exceeding that seen in animals reared with spatially disparate visual and auditory stimuli (red, often <10%). (B) An example of the non-overlapping receptive fields of an animal reared with visual–auditory spatial disparity. The visual receptive field is in blue and the auditory in green. Note the atypical multisensory integration: when visual and auditory stimuli were placed within their respective receptive fields they were spatially disparate, but produced significant (p < 0.05, t-test) enhancement. This is a striking reversal of the normal condition wherein these receptive fields overlap one another and spatially coincident stimuli are required for multisensory integration. However, this result is consistent with the animal’s abnormal multisensory experience. Adapted from Wallace and Stein (2007).
This result strongly supported the contention that experience with cross-modal cues early in life crafts the neural circuits underlying multisensory integration. That association cortex is essential for guiding this process is suggested by two observations: first, its presence during development is necessary for SC multisensory integration to appear; second, the maturation of its influences on SC neurons parallels their development of multisensory integration capabilities (Wallace and Stein, 2000 ).
Developing an SC without input from association cortex is a tactic used by Jiang et al. (2006) to assess the normal impact of these descending influences on guiding the maturation of SC multisensory integration. As it turns out, neonatal removal of this cortex precluded the maturation of SC multisensory integration. Its influence appeared to be critical, as other cortical areas did not compensate for its loss. Interestingly, however, both AES and rLS had to be removed for this to happen. If only one of the areas was removed, compensatory plasticity was induced in the other, and most SC neurons developed their characteristic multisensory capabilities (Jiang et al., 2006 ). These experiments provided another important observation. In the absence of these cortical inputs, multisensory SC neurons develop poor alignment among their different receptive fields (Figure 5 ). Apparently, association cortex plays an important role in guiding SC receptive field alignment and, thereby, constraining how stimuli will be processed during multisensory integration, even when those receptive fields involve inputs derived from other afferent sources.
Figure 5. Three characteristic examples of the loss of visual–auditory receptive field overlap following combined lesions of the anterior ectosylvian sulcus (AES) and the rostral lateral suprasylvian sulcus (rLS). The bar graphs represent the percentage of visual–auditory receptive field register in SC neurons. There is a high degree of receptive field overlap in normal controls; however, that overlap was significantly degraded in animals with combined AES and rLS lesions, but not in those with lesions restricted to AES or rLS (vertical bars, SE; *p < 0.05). Adapted from Jiang et al. (2006).
Presumably, the influences of association cortex ensure that SC multisensory processes properly reflect the physical configurations of the cross-modal events that have been experienced, regardless of whether these are typical (e.g., “normal”) or atypical. In the absence of this cortex there is no way to translate experience with cross-modal cues into SC multisensory integration, and the default condition approximates the neonatal state, in which there is no multisensory integration.

Chronic Cortical Deactivation during Development

Perhaps the most powerful evidence for this integral role of association cortex in the maturation of SC multisensory integration comes from experiments that are still ongoing (Stein and Rowland, 2007; Stein et al., 2008). These experiments involve rendering the neonatal association cortex unable to gather early cross-modal experiences (see Rowland et al., 2005; Stein and Rowland, 2007; Stein et al., 2008). This is accomplished by implantation of muscimol-impregnated Elvax pledgets that block intrinsic cortical activity for many weeks. Once the muscimol stores are depleted, or the pledgets are removed, cortex becomes active again. Preliminary observations indicate that such a disruption of cortical activity during the period in which multisensory integration normally develops interferes with the process. SC neurons still become multisensory (i.e., they respond to more than one sensory modality), and develop their characteristic unisensory properties, but are unable to integrate cross-modal information to enhance SC-mediated behaviors, even after cortex is reactivated. These observations strongly point to cortex as the portal through which cross-modal experience has access to the brain’s multisensory circuitry.

Development/Maintenance of Multisensory Integration in the Adult

In the chronic cortical deactivation experiments, the cortex was deactivated unilaterally and the deficits were, as expected, confined to ipsilateral SC neurons and to multisensory responses to stimuli in contralateral sensory space. However, in retests after 4 years of normal experience, there was a startling result. The previously observed deficits had disappeared, suggesting that appropriate sensory experience in adulthood could compensate for cross-modal anomalies developed early in life. This is of more than passing interest in the current context, and to those with early visual and/or auditory deficits that can be corrected later in life via drug therapy or prosthetic devices. The possibility of adult plasticity in this context also raises interesting questions about the requirements for maintaining normal multisensory integration in adulthood. We have already found that normal adult cats placed in the dark for up to 6 months retain their normal capability for multisensory integration (unpublished observations), indicating that it can be retained without constant updating and in the absence of conflicting information. It would, however, be of considerable interest to determine whether altering the rules governing, for example, visual–auditory events, as in disparity rearing, would produce a change in the principles of integration. This would suggest an enduring plasticity in this function, albeit one that might change with age and experience.

Some Implications

Nevertheless, for those individuals with anomalies of sensory integration and/or sensorimotor transduction, the plastic nature of these processes throughout life offers promise for successful therapeutic intervention. Because of the general role of the SC in integrating information from multiple senses to detect external events and initiate adaptive responses to them, it is highly likely that it plays a role in the manifestations of whatever lies at the neurological core of SPD. However, any anomalies in these overt behaviors will impact the feedback generated by caregivers and others, as well as the individual’s internal representations of external events and peripersonal space. These could contribute to the established difficulties in dealing appropriately with external stimuli, especially when multiple sensory cues are being generated in complex environments that require a rapid choice among competing alternatives. The potentially destructive cycle is obvious. At present, many of the ameliorative techniques that deal with the symptoms of SPD, which are discussed in other contributions to this special issue of Frontiers in Integrative Neuroscience, are likely dealing in large part with the circuit underlying their behavioral expression. This would include the cortical–collicular axis. In this vein it would be interesting to consider how much of the practical success that has already been achieved in dealing with this condition, via physical and occupational therapy, reflects alterations in this circuit. Unfortunately, our current understanding of the neurological basis of SPD is rudimentary compared to that of many other developmental disorders. Only by addressing the many issues raised in this special issue of the Journal will we begin to understand its origin and expression and be able to develop the most effective strategies for minimizing its disruptive consequences.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

The research described was supported in part by NIH grants EY016716 and NS036916, and a grant from the Wallace Research Foundation.

References

Alvarado, J. C., Stanford, T. R., Vaughan, J. W., and Stein, B. E. (2007a). Cortex mediates multisensory but not unisensory integration in superior colliculus. J. Neurosci. 27, 12775–12786.
Alvarado, J. C., Vaughan, J. W., Stanford, T. R., and Stein, B. E. (2007b). Multisensory versus unisensory integration: contrasting modes in the superior colliculus. J. Neurophysiol. 97, 3193–3205.
Barth, D. S., and Brett-Green, B. (2004). Multisensory-evoked potentials in rat cortex. In The Handbook of Multisensory Processing, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 357–370.
Berkell, D. E., Malgeri, S. E., and Streit, M. K. (1996). Auditory integration training for individuals with autism. Educ Train Ment Retard Dev Disabil 31, 66–70.
Bernstein, L. E., Auer, E. T., and Moore, J. K. (2004). Audiovisual speech binding: convergence or association. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 203–224.
Blau, V., van Atteveldt, N., Ekkebus, M., Goebel, R., and Blomert, L. (2009). Reduced neural integration of letters and speech sounds links phonological and reading deficits in adult dyslexia. Curr. Biol. 19, 503–508.
Bower, T. G. R. (1974). The evolution of sensory systems. In Perception: Essays in Honor of James J. Gibson, R. B. Macleod and H. L. Pick Jr, eds (Ithaca, NY, Cornell University Press), pp. 141–165.
Bremmer, A. J., Mareschal, D., Lloyd-Fox, S., and Spence, C. (2008a). Spatial localization of touch in the first year of life: early influences of a visual spatial code and the development of remapping across changes in limb position. J. Exp. Psychol. Gen. 137, 149–162.
Bremmer, A. J., Holmes, N. P., and Spence, C. (2008b). Infants lost in (peripersonal) space? Trends Cogn. Sci. 12, 298–305.
Bundy, A. C., Lane, S. J., Fisher, A. G., and Murray, E. A. (2002). Sensory Integration: Theory and Practice. Philadelphia, PA, F.A. Davis.
Buonomano, D. V., and Merzenich, M. M. (1998). Cortical plasticity: from synapses to maps. Annu. Rev. Neurosci. 21, 149–186.
Burnett, L. R., Stein, B. E., Perrault T. J. Jr., and Wallace, M. T. (2007). Excitotoxic lesions of the superior colliculus preferentially impact multisensory neurons and multisensory integration. Exp. Brain Res. 179, 325–338.
Busse, L., Roberts, K. C., Crist, R. E., Weissman, D. H., and Woldorff, M. G. (2005). The spread of attention across modalities and space in a multisensory object. Proc. Natl. Acad. Sci. USA 102, 18751–18756.
Calvert, G. A., and Lewis, J. W. (2004). Hemodynamic studies of audiovisual interactions. In The Handbook of Multisensory Processes , G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 483–502.
Calvert, G. A., Spence, C., and Stein, B. E. (eds) (2004). The Handbook of Multisensory Processes. Cambridge, MA, The MIT Press.
Carriere, B., Royal, D. W., Perrault, T. J. Jr., Morrison, S. P., Vaughan, J. W., Stein, B. E., and Wallace, M. T. (2007). Visual deprivation alters the development of cortical multisensory integration. J. Neurophysiol. 98, 2858–2867.
Corneil, B. D., and Munoz, D. P. (1996). The influence of auditory and visual distractors on human orienting gaze shifts. J. Neurosci. 16, 8193–8207.
de Gelder, B., Vroomen, J., and Pourtois, G. (2004). Multisensory perception of emotion, its time course and its neural basis. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 581–596.
Ernst, M. O., and Banks, M. S. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415, 429–433.
Flom, R., and Bahrick, L. E. (2007). The development of infant discrimination of affect in multimodal and unimodal stimulation: the role of intersensory redundancy. Dev. Psychol. 43, 238–252.
Fort, A., and Giard, M. H. (2004). Multiple electrophysiological mechanisms of audiovisual integration in human perception. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 503–514.
Frens, M. A., and Van Opstal, A. J. (1995). A quantitative study of auditory-evoked saccadic eye movements in two dimensions. Exp. Brain Res. 107, 103–117.
Gaither, N. S., and Stein, B. E. (1979). Reptiles and mammals use similar sensory organizations in the midbrain. Science 205, 595–597.
Ghazanfar, A. A., Maier, J. X., Hoffman, K. L., and Logothetis, N. K. (2005). Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. J. Neurosci. 25, 5004–5012.
Ghazanfar, A. A., and Schroeder, C. E. (2006). Is neocortex essentially multisensory? Trends Cogn. Sci. 10, 278–285.
Gibson, J. J. (1966). The Senses Considered as Perceptual Systems. Boston, MA, Houghton Mifflin.
Gibson, J. J. (1969). Principles of Perceptual Learning and Development. Englewood Cliffs, NJ, Prentice Hall.
Gibson, J. J. (1979). An Ecological Approach to Perception. Boston, MA, Houghton Mifflin.
Gingras, G., Rowland, B. E., and Stein, B. E. (2006). Unisensory Versus Multisensory Integration: Computational Distinctions in Behavior. Program No. 639.9. 2006 Neuroscience Meeting Planner. Atlanta, GA, Society for Neuroscience (Online).
Gori, M., Del Viva, M., Sandini, G., and Burr, D. C. (2008). Young children do not integrate visual and haptic form information. Curr. Biol. 18, 694–698.
Grant, A. C., Thiagarajah, M. C., and Sathian, K. (2000). Tactile perception in blind Braille readers: a psychophysical study of acuity and hyperacuity using gratings and dot patterns. Percept. Psychophys. 62, 301–312.
Grantyn, A., and Grantyn, R. (1982). Axonal patterns and sites of termination of cat superior colliculus neurons projecting in the tecto-bulbo-spinal tract. Exp. Brain Res. 46, 243–256.
Groh, J. M., and Sparks, D. L. (1996a). Saccades to somatosensory targets. II. Motor convergence in primate superior colliculus. J. Neurophysiol. 75, 428–438.
Groh, J. M., and Sparks, D. L. (1996b). Saccades to somatosensory targets. III. Eye-position-dependent somatosensory activity in primate superior colliculus. J. Neurophysiol. 75, 439–453.
Guitton, D., and Munoz, D. P. (1991). Control of orienting gaze shifts by the tectoreticulospinal system in the head-free cat. I. Identification, localization, and effects of behavior on sensory responses. J. Neurophysiol. 66, 1605–1623.
Gutfreund, Y., and Knudsen, E. I. (2004). Visual instruction of the auditory space map in the midbrain. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 613–624.
Hairston, W. (2001). Effects of Interacting Visual and Auditory Cues on Human Perception. Winston-Salem, NC, Wake Forest University, pp. 1–54.
Hairston, W. D., Burdette, J. H., Flowers, D. L., Wood, F. B., and Wallace, M. T. (2005). Altered temporal profile of visual–auditory multisensory interactions in dyslexia. Exp. Brain Res. 166, 474–480.
Harris, L. R. (1980). The superior colliculus and movements of the head and eyes in cats. J. Physiol. 300, 367–391.
Hughes, H. C., Reuter-Lorenz, P. A., Nozawa, G., and Fendrich, R. (1994). Visual–auditory interactions in sensorimotor processing: saccades versus manual responses. J. Exp. Psychol. Hum. Percept. Perform. 20, 131–153.
Jay, M. F., and Sparks, D. L. (1984). Auditory receptive fields in primate superior colliculus shift with changes in eye position. Nature 309, 345–47.
Jay, M. F., and Sparks, D. L. (1987a). Sensorimotor integration in the primate superior colliculus. I. Motor convergence. J. Neurophysiol. 57, 22–34.
Jay, M. F., and Sparks, D. L. (1987b). Sensorimotor integration in the primate superior colliculus. II. Coordinates of auditory signals. J. Neurophysiol. 57, 35–55.
Jiang, W., Jiang, H., Rowland, B. A., and Stein, B. E. (2007). Multisensory orientation behavior is disrupted by neonatal cortical ablation. J. Neurophysiol. 97, 557–562.
Jiang, W., Jiang, H., and Stein, B. E. (2006). Neonatal cortical ablation disrupts multisensory development in superior colliculus. J. Neurophysiol. 95, 1380–1396.
Jiang, W., and Stein, B. E. (2003). Cortex controls multisensory depression in superior colliculus. J. Neurophysiol. 90, 2123–2135.
Jiang, W., Wallace, M. T., Jiang, H., Vaughan, J. W., and Stein, B. E. (2001). Two cortical areas mediate multisensory integration in superior colliculus neurons. J. Neurophysiol. 85, 506–522.
Kadunce, D. C., Vaughan, J. W., Wallace, M. T., Benedek, G., and Stein, B. E. (1997). Mechanisms of within- and cross-modality suppression in the superior colliculus. J. Neurophysiol. 78, 2834–2847.
Kao, C. Q., McHaffie, J. G., Meredith, M. A., and Stein, B. E. (1994). Functional development of a central visual map in cat. J. Neurophysiol. 72, 266–272.
King, A. J., Doubell, T. P., and Skaliora, I. (2004). Epigenetic factors that align visual and auditory maps in the ferret midbrain. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 613–624.
King, A. J., and Palmer, A. R. (1985). Integration of visual and auditory information in bimodal neurones in the guinea-pig superior colliculus. Exp. Brain Res. 60, 492–500.
King, A. J., Schnupp, J. W., Carlile, S., Smith, A. L., and Thompson, I. D. (1996). The development of topographically-aligned maps of visual and auditory space in the superior colliculus. Prog. Brain Res. 112, 335–350.
Knudsen, E. I., and Brainard, M. S. (1991). Visual instruction of the neural map of auditory space in the developing optic tectum. Science 253, 85–87.
Lakatos, P., Chen, C. M., O’Connell, M. N., Mills, A., and Schroeder, C. E. (2007). Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron 53, 279–292.
Larson, M. A., and Stein, B. E. (1984). The use of tactile and olfactory cues in neonatal orientation and localization of the nipple. Dev. Psychobiol. 17, 423–436.
Laurienti, P. J., Burdette, J. H., Wallace, M. T., Yen, Y. F., Field, A. S., and Stein, B. E. (2002). Deactivation of sensory-specific cortex by cross-modal stimuli. J. Cogn. Neurosci. 14, 420–429.
Lewkowicz, D. J., and Kraebel, K. S. (2004). The value of multisensory redundancy in the development of intersensory perception. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 655–678.
Lickliter, R., and Bahrick, L. E. (2004). Perceptual development and the origins of multisensory responsiveness. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 643–654.
Liotti, M., Ryder, K., and Woldorff, M. G. (1998). Auditory attention in the congenitally blind: where, when and what gets reorganized? Neuroreport 9, 1007–1012.
Lovaas, O. I., Schreibman, L., and Koegel, R. L. (1974). A behavior modification approach to the treatment of autistic children. J. Autism Child. Schizophr. 4, 111–129.
Lovelace, C. T., Stein, B. E., and Wallace, M. T. (2003). An irrelevant light enhances auditory detection in humans: a psychophysical analysis of multisensory integration in stimulus detection. Brain Res. Cogn. Brain Res. 17, 447–453.
Macaluso, E., and Driver, J. (2004). Neuroimaging studies of cross-modal integration for emotion. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp 529–548.
Marks, L. E. (1978). The Unity of the Senses: Interrelations Among the Modalities. New York, NY, Academic Press.
Marks, L. E. (2004). Cross-modal interactions in speeded classification. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 85–106.
Maurer, D., Stager, C. L., and Mondloch, C. J. (1999). Cross-modal transfer of shape is difficult to demonstrate in one-month-olds. Child Dev. 70, 1047–1057.
Massaro, D. W. (2004). From multisensory integration to talking heads and language learning. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 153–176.
Meltzoff, A. N. (1990). Towards a developmental cognitive science: the implications of cross-modal matching and imitation for the development of representation and memory in infancy. Ann. NY Acad. Sci. 608, 1–31.
Meltzoff, A. N., and Borton, R. W. (1979). Intermodal matching by human neonates. Nature 282, 403–404.
Meredith, M. A., Clemo, H. R., and Stein, B. E. (1991). Somatotopic component of the multisensory map in the deep laminae of the cat superior colliculus. J. Comp. Neurol. 312, 353–370.
Meredith, M. A., and Stein, B. E. (1983). Interactions among converging sensory inputs in the superior colliculus. Science 221, 389–391.
Meredith, M. A., and Stein, B. E. (1986a). Spatial factors determine the activity of multisensory neurons in cat superior colliculus. Brain Res. 365, 350–354.
Meredith, M. A., and Stein, B. E. (1986b). Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. J. Neurophysiol. 56, 640–662.
Meredith, M. A., and Stein, B. E. (1990). The visuotopic component of the multisensory map in the deep laminae of the cat superior colliculus. J. Neurosci. 10, 3727–3742.
Meredith, M. A., and Stein, B. E. (1996). Spatial determinants of multisensory integration in cat superior colliculus neurons. J. Neurophysiol. 75, 1843–1857.
Meredith, M. A., Wallace, M. T., and Stein, B. E. (1992). Visual, auditory and somatosensory convergence in output neurons of the cat superior colliculus: multisensory properties of the tecto-reticulo-spinal projection. Exp. Brain Res. 88, 181–186.
Middlebrooks, J. C., and Knudsen, E. I. (1984). A neural code for auditory space in the cat’s superior colliculus. J. Neurosci. 4, 2621–2634.
Miller, L. J., Reisman, J. E., McIntosh, D. N., and Simon, J. (2001). An ecological model of sensory modulation: performance of children with fragile X syndrome, autism, attention-deficit/hyperactivity disorder, and sensory modulation disorders. In Understanding the Nature of Sensory Integration With Diverse Populations, S. S. Roley, E. I. Blanche and R. C. Schaaf, eds (San Antonio, TX, Therapy Skill Builders), pp. 57–88.
Morrongiello, B. A., Lasenby, J., and Lee, N. (2003). Infants’ learning, memory, and generalization of learning for bimodal events. J. Exp. Child Psychol. 84, 1–19.
Munoz, D. P., and Wurtz, R. H. (1993a). Fixation cells in monkey superior colliculus. I. Characteristics of cell discharge. J. Neurophysiol. 70, 559–575.
Munoz, D. P., and Wurtz, R. H. (1993b). Fixation cells in monkey superior colliculus. II. Reversible activation and deactivation. J. Neurophysiol. 70, 576–589.
Neil, P. A., Chee-Ruiter, C., Scheier, C., Lewkowicz, D. J., and Shimojo, S. (2006). Development of multisensory spatial integration and perception in humans. Dev. Sci. 9, 454–464.
Newell, F. N. (2004). Cross-modal object recognition. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 123–140.
Newman, E. A., and Hartline, P. H. (1981). Integration of visual and infrared information in bimodal neurons in the rattlesnake optic tectum. Science 213, 789–791.
Partan, S. R. (2004). Multisensory animal communication. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 225–242.
Peck, C. K. (1987). Visual–auditory interactions in cat superior colliculus: their role in the control of gaze. Brain Res. 420, 162–166.
Putzar, L., Goerendt, I., Lange, K., Rösler, F., and Röder, B. (2007). Early visual deprivation impairs multisensory interactions in humans. Nat. Neurosci. 10, 1243–1245.
Recanzone, G. H. (1998). Rapidly induced auditory plasticity: the ventriloquism aftereffect. Proc. Natl. Acad. Sci. USA 95, 869–875.
Rowland, B. A., Jiang, W., and Stein, B. E. (2005). Long-term plasticity in multisensory integration. Soc. Neurosci. Abstr. 31, 505.8.
Rowland, B. A., Quessy, S., Stanford, T. R., and Stein, B. E. (2007). Multisensory integration shortens physiological response latencies. J. Neurosci. 27, 5879–5884.
Sathian, K. (2000). Practice makes perfect: sharper tactile perception in the blind. Neurology 54, 2203–2204.
Sathian, K. (2005). Visual cortical activity during tactile perception in the sighted and the visually deprived. Dev. Psychobiol. 46, 279–286.
Sathian, K., Prather, S. C., and Zhang, M. (2004). Visual cortical involvement in normal tactile perception. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 703–710.
Schaaf, R. C., and Miller, L. J. (2005). Occupational therapy using a sensory integrative approach for children with developmental disabilities. Ment. Retard. Dev. Disabil. Res. Rev. 11, 143–148.
Schroeder, C. E., and Foxe, J. J. (2002). The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex. Brain Res. Cogn. Brain Res. 14, 187–198.
Schroeder, C. E., and Foxe, J. J. (2004). Multisensory convergence in early cortical processing. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 295–310.
Schroeder, C. E., Lindsley, R. W., Specht, C., Marcovici, A., Smiley, J. F., and Javitt, D. C. (2001). Somatosensory input to auditory association cortex in the macaque monkey. J. Neurophysiol. 85, 1322–1327.
Senkowski, D., Talsma, D., Grigutsch, M., Herrmann, C. S., and Woldorff, M. G. (2007). Good times for multisensory integration: effects of the precision of temporal synchrony as revealed by gamma-band oscillations. Neuropsychologia 45, 561–571.
Shams, L., Kamitani, Y., and Shimojo, S. (2004). Modulations of visual perception by sound. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 27–34.
Sparks, D. L. (1986). Translation of sensory signals into commands for control of saccadic eye movements: role of primate superior colliculus. Physiol. Rev. 66, 118–171.
Sparks, D. L., and Nelson, J. S. (1987). Sensory and motor maps in the mammalian superior colliculus. Trends Neurosci. 10, 312–317.
Stein, B. E. (1984). Development of the superior colliculus. Annu. Rev. Neurosci. 7, 95–125.
Stein, B. E. (2005). The development of a dialogue between cortex and midbrain to integrate multisensory information. Exp. Brain Res. 166, 305–315.
Stein, B. E., and Arigbede, M. O. (1972). Unimodal and multimodal response properties of neurons in the cat’s superior colliculus. Exp. Neurol. 36, 179–196.
Stein, B. E., and Clamann, H. P. (1981). Control of pinna movements and sensorimotor register in cat superior colliculus. Brain Behav. Evol. 19, 180–192.
Stein, B. E., and Dixon, J. P. (1979). Properties of superior colliculus neurons in the golden hamster. J. Comp. Neurol. 183, 269–284.
Stein, B. E., and Gallagher, H. L. (1981). Maturation of cortical control over superior colliculus cells in cat. Brain Res. 223, 429–435.
Stein, B. E., Jiang, W., and Stanford, T. R. (2004). Multisensory integration in single neurons in midbrain and cortex. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 243–264.
Stein, B. E., Labos, E., and Kruger, L. (1973a). Sequence of changes in properties of neurons of superior colliculus of the kitten during maturation. J. Neurophysiol. 36, 667–679.
Stein, B. E., Labos, E., and Kruger, L. (1973b). Determinants of response latency in neurons of superior colliculus in kittens. J. Neurophysiol. 36, 680–689.
Stein, B. E., London, N., Wilkinson, L. K., and Price, D. D. (1996). Enhancement of perceived visual intensity by auditory stimuli: a psychophysical analysis. J. Cogn. Neurosci. 8, 497–506.
Stein, B. E., Magalhaes-Castro, B., and Kruger, L. (1975). Superior colliculus: visuotopic-somatotopic overlap. Science 189, 224–226.
Stein, B. E., Magalhaes-Castro, B., and Kruger, L. (1976). Relationship between visual and tactile representations in cat superior colliculus. J. Neurophysiol. 39, 401–419.
Stein, B. E., and Meredith, M. A. (1993). The Merging of the Senses. Cambridge, MA, The MIT Press.
Stein, B. E., Meredith, M. A., Huneycutt, W. S., and McDade, L. (1989). Behavioral indices of multisensory integration: orientation to visual cues is affected by auditory stimuli. J. Cogn. Neurosci. 1, 12–24.
Stein, B. E., Meredith, M. A., and Wallace, M. T. (1993). The visually responsive neuron and beyond: multisensory integration in cat and monkey. Prog. Brain Res. 95, 79–90.
Stein, B. E., Perrault, T. J. Jr., Vaughan, J. W., and Rowland, B. A. (2008). Long term plasticity of multisensory neurons in the superior colliculus. Soc. Neurosci. Abstr. 34, 457.14.
Stein, B. E., and Rowland, B. A. (2007). The critical role of cortico-collicular interactions in the development of multisensory integration. Soc. Neurosci. Abstr. 33, 614.7.
Stein, B. E., and Stanford, T. R. (2008). Multisensory integration: current issues from the perspective of the single neuron. Nat. Rev. Neurosci. 9, 255–266.
Stein, B. E., Wallace, M. T., Stanford, T. R., and Jiang, W. (2002). Cortex governs multisensory integration in the midbrain. Neuroscientist 8, 306–314.
Streri, A., and Gentaz, E. (2003). Cross-modal recognition of shape from hand to eyes in human newborns. Somatosens. Mot. Res. 20, 13–18.
Sugihara, T., Diltz, M. D., Averbeck, B. B., and Romanski, L. M. (2006). Integration of auditory and visual communication information in the primate ventrolateral prefrontal cortex. J. Neurosci. 26, 11138–11147.
Sumby, W. H., and Pollack, I. (1954). Visual contribution to speech intelligibility in noise. J. Acoust. Soc. Am. 26, 212–215.
Talsma, D., Doty, T. J., Strowd, R., and Woldorff, M. G. (2006a). Attentional capacity for processing concurrent stimuli is larger across sensory modalities than within a modality. Psychophysiology 43, 541–549.
Talsma, D., Kok, A., and Ridderinkhof, K. R. (2006b). Selective attention to spatial and non-spatial visual stimuli is affected differentially by age: effects on event-related brain potentials and performance data. Int. J. Psychophysiol. 62, 249–261.
Talsma, D., Doty, T. J., and Woldorff, M. G. (2007). Selective attention and audiovisual integration: is attending to both modalities a prerequisite for early integration? Cereb. Cortex 17, 679–690.
Tucha, O., Walitza, S., Mecklinger, L., Sontag, T. A., Kubber, S., Linder, M., and Lange, K. W. (2006). Attentional functioning in children with ADHD – predominantly hyperactive-impulsive type and children with ADHD – combined type. J. Neural Transm. 113, 1943–1953.
Wallace, M. T. (2004). The development of multisensory integration. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 625–642.
Wallace, M. T., Carriere, B. N., Perrault, T. J. Jr., Vaughan, J. W., and Stein, B. E. (2006). The development of cortical multisensory integration. J. Neurosci. 26, 11844–11849.
Wallace, M. T., Meredith, M. A., and Stein, B. E. (1992). Integration of multiple sensory modalities in cat cortex. Exp. Brain Res. 91, 484–488.
Wallace, M. T., Meredith, M. A., and Stein, B. E. (1993). Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus. J. Neurophysiol. 69, 1797–1809.
Wallace, M. T., and Stein, B. E. (1994). Cross-modal synthesis in the midbrain depends on input from cortex. J. Neurophysiol. 71, 429–432.
Wallace, M. T., and Stein, B. E. (1997). Development of multisensory neurons and multisensory integration in cat superior colliculus. J. Neurosci. 17, 2429–2444.
Wallace, M. T., and Stein, B. E. (2000). Onset of cross-modal synthesis in the neonatal superior colliculus is gated by the development of cortical influences. J. Neurophysiol. 83, 3578–3582.
Wallace, M. T., and Stein, B. E. (2001). Sensory and multisensory responses in the newborn monkey superior colliculus. J. Neurosci. 21, 8886–8894.
Wallace, M. T., and Stein, B. E. (2007). Early experience determines how the senses will interact. J. Neurophysiol. 97, 921–926.
Wallace, M. T., Perrault, T. J. Jr., Hairston, W. D., and Stein, B. E. (2004a). Visual experience is necessary for the development of multisensory integration. J. Neurosci. 24, 9580–9584.
Wallace, M. T., Ramachandran, R., and Stein, B. E. (2004b). A revised view of sensory cortical parcellation. Proc. Natl. Acad. Sci. USA 101, 2167–2172.
Wallace, M. T., Wilkinson, L. K., and Stein, B. E. (1996). Representation and integration of multiple sensory inputs in primate superior colliculus. J. Neurophysiol. 76, 1246–1266.
Weisser, V., Stilla, R., Peltier, S., Hu, X., and Sathian, K. (2005). Short-term visual deprivation alters neural processing of tactile form. Exp. Brain Res. 166, 572–582.
Werner, H. (1973). Comparative Psychology of Mental Development. New York, NY, International Universities Press.
Wickelgren, B. G., and Sterling, P. (1969). Influence of visual cortex on receptive fields in the superior colliculus of the cat. J. Neurophysiol. 32, 1–15.
Wilkinson, L. K., Meredith, M. A., and Stein, B. E. (1996). The role of anterior ectosylvian cortex in cross-modality orientation and approach behavior. Exp. Brain Res. 112, 1–10.
Woldorff, M. G., Hazlett, C. J., Fichtenholtz, H. M., Weissman, D. H., Dale, A. M., and Song, A. W. (2004). Functional parcellation of attentional control regions of the brain. J. Cogn. Neurosci. 16, 149–165.
Woods, T. M., and Recanzone, G. H. (2004). Cross-modal interactions evidenced by the ventriloquism effect in humans and monkeys. In The Handbook of Multisensory Processes, G. A. Calvert, C. Spence and B. E. Stein, eds (Cambridge, MA, The MIT Press), pp. 35–48.
Wurtz, R. H., and Albano, J. E. (1980). Two visual systems: brain mechanisms for localization and discrimination are dissociated by tectal and cortical lesions. Annu. Rev. Neurosci. 3, 189–226.
Wurtz, R. H., and Goldberg, M. E. (1971). Superior colliculus cell responses related to eye movements in awake monkeys. Science 171, 82–84.
Zahar, Y., Reches, A., and Gutfreund, Y. (2009). Multisensory enhancement in the optic tectum of the barn owl: spike count and spike timing. J. Neurophysiol. 101, 2380–2394.
Zangaladze, A., Epstein, C. M., Grafton, S. T., and Sathian, K. (1999). Involvement of visual cortex in tactile discrimination of orientation. Nature 401, 587–590.
Zwiers, M. P., Van Opstal, A. J., and Paige, G. D. (2003). Plasticity in human sound localization induced by compressed spatial vision. Nat. Neurosci. 6, 175–181.
Keywords: multisensory, integration, plasticity, sensory processing disorder, superior colliculus

Citation: Stein BE, Perrault TJ Jr., Stanford TR and Rowland BA (2009). Postnatal experiences influence how the brain integrates information from different senses. Front. Integr. Neurosci. 3:21. doi: 10.3389/neuro.07.021.2009

Received: 09 April 2009; Paper pending published: 25 May 2009; Accepted: 11 August 2009; Published online: 30 September 2009.
Edited by: John J. Foxe, Nathan S. Kline Institute for Psychiatric Research, USA; City College of the City University of New York, USA

Reviewed by: Andrew Bremner, University of London, UK; Adrian Rodriguez-Contreras, City College of New York, USA
Copyright: © 2009 Stein, Perrault Jr., Stanford and Rowland. This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.
*Correspondence: Barry E. Stein, Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, NC 27157, USA. e-mail: bestein@wfubmc.edu