- Department of Psychology, Toronto Metropolitan University, Toronto, ON, Canada
Sensory loss induces adaptive neural changes in the remaining non-deprived senses, known as cross-modal plasticity. Recent proposals suggest that cross-modal plasticity is a top-down, dynamic phenomenon that can occur throughout the lifespan and is initiated whether sensory loss is total (as in blindness or deafness) or more subtle (mild-to-moderate partial sensory loss). However, it is unclear whether adaptive plasticity differs between total and partial loss, because there is less research on the latter condition. Here we reviewed neuroimaging research on cross-modal plasticity in adult humans with mild-to-moderate hearing loss and compared it to three claims derived from deafness research. First, cross-modal plasticity is thought to involve both intra-modal strengthening of remaining senses and cross-modal recruitment of deprived cortical areas, but we were unable to identify strong evidence of cross-modal recruitment in adults with partial hearing loss. Second, cross-modal plasticity is believed to arise through top-down connections and to implicate cognitive function, a claim that our findings supported. Third, cross-modal plasticity is believed to enhance perception in the non-deprived senses. No study in our review supported this claim, but it is possible that cross-modal plasticity in partial hearing loss results in stronger modulation of auditory function by intact senses. In addition, many study outcomes in humans with partial loss were inconsistent. Overall, it may be premature to conclude that cross-modal plasticity operates the same way for partial and total forms of sensory loss in humans, and we provide several recommendations for testing these claims in future research.
1 Introduction
Cross-modal plasticity describes when sensory loss, such as blindness or deafness, induces adaptive neuroplastic changes in sensory areas of the cerebral cortex (Bavelier and Neville, 2002). Two forms of cross-modal plasticity are intra-modal compensation and cross-modal recruitment (Ewall et al., 2021). Intra-modal compensation is the strengthening of neural processing for the spared senses, for instance, when auditory cortex increases sensitivity to sound stimulation in blindness (Sadato et al., 1996). Cross-modal recruitment is when activity of the spared senses engages brain regions of the deprived senses. For example, visual stimulation activates auditory cortex more strongly in deaf than in hearing individuals (Finney et al., 2001), and visual cortex is more strongly activated by sound in blindness (Kujala et al., 1997). Experimental studies demonstrate that cross-modal plasticity causally explains enhanced or “supra-normal” perceptual abilities that develop after sensory loss (Lomber et al., 2010; Meredith et al., 2011), and it likely accounts for enhanced auditory abilities in blindness (Gougoux et al., 2004; Vetter et al., 2020; Voss, 2016). Overall, cross-modal plasticity research shows that the brain adapts to sensory loss by strengthening its existing sensory functions and using available neural resources left by the deprived sense.
Early research on cross-modal recruitment in congenital blindness or deafness led to claims that deprived sensory regions can undergo extensive structural and functional reorganization (Kral, 2007; Makin and Krakauer, 2023). In light of newer evidence, Kral and Sharma (2023) recently proposed that cross-modal plasticity is a more flexible and dynamic phenomenon that does not require dramatic functional rewiring of sensory cortex as was previously believed. Instead, cross-modal recruitment operates by up- and down-regulating preexisting neural connections (Makin and Krakauer, 2023), especially in non-primary regions of sensory cortex that receive modulatory input from other senses. For instance, sensory neurons that are deprived of stimulation can upscale their synaptic inputs via homeostatic plasticity so that they maintain a desired input–output level (Turrigiano, 2008), and this synaptic scaling may include inputs from non-deprived senses. Sensory regions are thought to maintain their functional roles because there is no extensive reorganization, but they can switch their drive from a deprived sense to a remaining sense through this upregulation. Further, the inputs from non-deprived senses to sensory-deprived regions originate from higher-level association areas implicated in multimodal object perception and cognition (Ghazanfar and Schroeder, 2006), implying that cross-modal plasticity is a top-down process that draws on cognitive resources (Kral and Sharma, 2023).
Kral and Sharma’s (2023) proposal also explains how cross-modal plasticity can arise after partial sensory loss, a sensory status between the absence or near-absence of a sense (what is meant herein by deafness and blindness) and typical sensory function. The development of cross-modal plasticity through existing neural connections means that sensory cortex can adjust to continuous changes in sensory experience, even with mild sensory losses in adulthood. This claim was mainly supported by selected studies on visual cross-modal plasticity after mild-to-moderate hearing loss in middle-aged and older adult humans (Campbell and Sharma, 2014; Glick and Sharma, 2020) as well as noise-exposed animals (Schormans et al., 2017). Treatment of hearing loss with hearing aids also appears to reverse cross-modal plasticity (Glick and Sharma, 2020), affirming that cross-modal plasticity is dynamic and bidirectional. Despite their theoretical importance, studies on visual cross-modal plasticity in human adults with partial hearing loss have not been reviewed in full and weighed against claims made by cross-modal frameworks, including Kral and Sharma (2023). This is noteworthy because mild-to-moderate hearing loss is highly common, especially in aging, and is considerably more prevalent than deafness in humans (Haile et al., 2021). Yet, how sensory cortex adapts to mild hearing loss is not well described. To fill this gap, we conducted a literature review of neuroimaging studies of visual processing in adults with partial hearing loss. Specifically, we aimed to evaluate three claims based on deafness research: (1) Is there clear evidence of cross-modal recruitment separate from intra-modal compensation in partial hearing loss? (2) Are there top-down or cognitive influences on cross-modal neural activity in partial hearing loss? (3) Is cross-modal plasticity in partial hearing loss linked to any measurable behavioral or perceptual benefits?
2 Results and discussion
We used search strings visu*, hearing loss, hearing impair*, cross-modal, and plasticity in Web of Science (N = 387), PubMed (N = 155), and PsycNET (N = 74), which yielded 616 total entries. We included studies that used visual-only stimulation or measured visual activity in cortical areas alongside a visual task. We excluded studies on multisensory integration [reviewed elsewhere by Brooks et al. (2018)], unless these studies reported neural responses to visual-only stimulation. Further, we excluded studies that did not have a neuroimaging component, because neural evidence was required to evaluate cross-modal plasticity claims [see Choi et al. (2024) for a review of visual behaviors in hearing loss]. Finally, we excluded studies on cochlear implant users because their hearing loss is commonly profound, and cross-modal effects in cochlear implant users with residual hearing function are not often distinguished from cases of total deafness. Duplicates and entries that did not match the above criteria were removed from search results, resulting in 15 studies for review shown in Table 1.
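As a concrete illustration of this screening workflow, the following Python sketch shows one way the deduplication and inclusion/exclusion steps could be implemented. The record fields, flags, and usage example are hypothetical; this is not the script used for this review.

```python
# Illustrative sketch (not the authors' actual screening code): merging database
# exports and removing duplicates before applying inclusion/exclusion criteria.
# All record fields and example entries are hypothetical.

def normalize(title: str) -> str:
    """Lowercase and strip punctuation so near-identical titles match."""
    return "".join(ch for ch in title.lower() if ch.isalnum() or ch.isspace()).strip()

def deduplicate(records):
    """Keep one record per DOI (or normalized title when the DOI is missing)."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or normalize(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

def screen(records):
    """Apply the review's exclusion rules, encoded here as boolean flags."""
    return [
        rec for rec in records
        if rec["has_neuroimaging"]            # neural evidence required
        and rec["visual_only_condition"]      # visual-only stimulation or task
        and not rec["cochlear_implant_only"]  # cochlear implant users excluded
    ]

# Hypothetical usage with exports from Web of Science, PubMed, and PsycNET:
web_of_science, pubmed, psycnet = [], [], []  # lists of dicts from each database
candidates = deduplicate(web_of_science + pubmed + psycnet)
included = screen(candidates)
print(f"{len(candidates)} unique records, {len(included)} included")
```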
2.1 General summary
Thirteen of the reviewed studies in Table 1 included middle-aged to older adults with moderate or mild-to-moderate hearing loss. Liang et al. (2020) recruited younger to middle-aged adults with moderate-to-severe hearing loss, and Paul et al. (2025) recruited younger and older adults with only high-frequency hearing loss. Eight studies examined neural activity during passive visual stimulation, and seven included active behavioral tasks. Eleven studies used electroencephalography (EEG) to measure visual evoked potentials or neural oscillations. Four studies measured hemodynamic responses and functional connectivity using functional magnetic resonance imaging (fMRI) or functional near-infrared spectroscopy (fNIRS).
The visual evoked potential (VEP) was the most common technique used to measure cross-modal plasticity in humans with partial hearing loss. The VEP complex consists of ordered components labeled P1, N1, P2, N2, etc., that represent the maxima and minima of phase-locked voltage fluctuations along the visual cortical processing pathway. VEPs are characterized by their amplitude and latency. When measured by scalp EEG, VEP components reflect a composite of multiple underlying cortical generators that sum at the surface sensors (Arroyo et al., 1997; Freunberger et al., 2007). Seven of eleven EEG studies reported correlations between hearing loss and VEPs. Several studies claim that visual cross-modal plasticity occurs when hearing loss coincides with larger VEP amplitudes and shorter VEP latencies, which purportedly indicate more sensitive or efficient visual processing. As shown in Table 2, this pattern of results was observed in 16 cases (i.e., individual analysis outcomes). However, study outcomes were highly inconsistent. The opposite effect, where smaller amplitudes or longer latencies occurred with increasing hearing loss, was reported in 8 instances, and 24 analyses across VEP components were not significant. Effects on the P2 component were the least consistent in direction and spatial location, while effects on the P1 and the amplitude of N1 were more consistent.
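For readers unfamiliar with how these component measures are obtained, the Python sketch below illustrates peak amplitude and latency extraction from an averaged VEP waveform. The simulated waveform, component search windows, and polarities are assumptions for illustration and do not reproduce any reviewed study's pipeline.

```python
# Minimal sketch of how P1/N1/P2 amplitude and latency are typically quantified
# from an averaged VEP waveform. Waveform shape and search windows are simulated
# and assumed for illustration only.
import numpy as np

fs = 1000                                  # sampling rate (Hz)
t = np.arange(-0.1, 0.5, 1 / fs)           # epoch from -100 to 500 ms

def bump(center, width, amp):
    """Gaussian-shaped deflection used to simulate one VEP component."""
    return amp * np.exp(-0.5 * ((t - center) / width) ** 2)

rng = np.random.default_rng(0)
vep = bump(0.10, 0.015, 4) + bump(0.17, 0.02, -6) + bump(0.24, 0.03, 5)
vep += rng.normal(0, 0.3, t.size)          # residual noise, in microvolts

# Assumed search windows (seconds) and polarities for each component.
windows = {"P1": (0.08, 0.13, +1), "N1": (0.13, 0.20, -1), "P2": (0.20, 0.30, +1)}

for name, (tmin, tmax, sign) in windows.items():
    mask = (t >= tmin) & (t <= tmax)
    idx = np.argmax(sign * vep[mask])      # most positive (or most negative) point
    latency_ms = t[mask][idx] * 1000
    amplitude_uv = vep[mask][idx]
    print(f"{name}: {amplitude_uv:+.1f} uV at {latency_ms:.0f} ms")
```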
Table 2. Statistically significant effects of hearing loss on visual cortical potentials in EEG studies.
Multiple factors potentially explain inconsistent correlations between VEP components and hearing loss. VEPs were evoked by several types of stimuli, including visual speech (Aguiar et al., 2025), motion (Aguiar et al., 2025; Campbell and Sharma, 2014, 2020; Glick and Sharma, 2020), written words (Paul et al., 2025), shapes (Loughrey et al., 2023; Zhu et al., 2024), reversing checkerboards (Seol et al., 2024), and photographs (Liang et al., 2020), which could change VEP morphology because these stimuli are processed along separate anatomical pathways (Harter et al., 1982). Some studies included an active task, and attention and memory demands are known to modulate VEP amplitudes and latencies (Adam and Collins, 1978; Eason, 1981). Participant-level factors, such as age, degree of hearing ability, and cognitive function (Glick and Sharma, 2020), also add variability to the data and could affect correlations between hearing sensitivity measures and VEP features. It is also possible that the association between hearing loss and neural activity indexed by VEPs is not sufficiently robust for evaluating claims about cross-modal plasticity. Beyond VEP analysis, both Shende et al. (2024) and Aguiar et al. (2025) analyzed time-frequency representations of EEG data and found increased theta or alpha oscillation synchronization occurring 100–200 ms after the onset of visual stimulation.
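To make the time-frequency approach concrete, the sketch below estimates post-stimulus theta and alpha power using complex Morlet wavelet convolution on a simulated trial. The signal, wavelet parameters, and baseline window are illustrative assumptions and not the published analysis pipelines of Shende et al. (2024) or Aguiar et al. (2025).

```python
# Sketch of a time-frequency analysis: complex Morlet wavelet convolution to
# estimate theta/alpha power after visual onset. All parameters are assumed
# for illustration.
import numpy as np

fs = 500
t = np.arange(-0.5, 1.0, 1 / fs)                     # one epoch, in seconds
rng = np.random.default_rng(1)

# Simulated single trial: noise plus a brief 10 Hz burst starting ~100 ms after onset.
signal = rng.normal(0, 1, t.size)
burst = (t > 0.1) & (t < 0.3)
signal[burst] += 2 * np.sin(2 * np.pi * 10 * t[burst])

def morlet_power(x, freq, fs, n_cycles=4):
    """Power over time at one frequency via complex Morlet wavelet convolution."""
    sigma = n_cycles / (2 * np.pi * freq)             # wavelet width in seconds
    wt = np.arange(-3 * sigma, 3 * sigma, 1 / fs)
    wavelet = np.exp(2j * np.pi * freq * wt) * np.exp(-wt**2 / (2 * sigma**2))
    wavelet /= np.sum(np.abs(wavelet))                # rough amplitude normalization
    analytic = np.convolve(x, wavelet, mode="same")   # complex-valued output
    return np.abs(analytic) ** 2

for band, freq in [("theta (6 Hz)", 6), ("alpha (10 Hz)", 10)]:
    power = morlet_power(signal, freq, fs)
    post = (t >= 0.1) & (t <= 0.2)                    # 100-200 ms after visual onset
    baseline = (t >= -0.3) & (t <= -0.1)              # pre-stimulus reference period
    change_db = 10 * np.log10(power[post].mean() / power[baseline].mean())
    print(f"{band}: {change_db:+.1f} dB relative to baseline")
```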
Neural activation and functional connectivity recorded with fMRI were less frequently used to measure cross-modal responses. Rosemann and Thiel (2018) found increased activation in frontal areas to visual speech in participants with hearing loss, and frontal activation in medial, middle, and inferior frontal gyrus was correlated with an increased visual bias in an audiovisual speech task. However, there was no evidence for visual cross-modal recruitment of auditory cortex. Two other fMRI studies did not find effects of hearing loss on neural activation to visual images or visual motion (Puschmann and Thiel, 2017; Rosemann and Thiel, 2020).
All but two studies reviewed in Table 1 used cross-sectional and correlational designs. The longitudinal study conducted by Glick and Sharma (2020) measured VEPs in older adults with age-related hearing loss before and after 6 months of hearing aid use. VEP P1, N1, and P2 latencies over right temporal cortex were shorter in the hearing loss group compared to a control group prior to provision of hearing aids, but latencies returned to levels comparable to those of typical-hearing adults following the intervention. This suggests that visual cross-modal plasticity can be reversed by hearing treatment. In a complementary study, Mai et al. (2024) used fNIRS to measure hemodynamic responses to visual checkerboard stimulation in people with hearing loss, before and after 4 weeks of speech-in-noise training. There were no effects on auditory cortical activation, but training decreased functional connectivity between auditory, frontal, parietal, and temporo-parietal areas, with this effect disappearing after 4 weeks. The decrease in frontotemporal connectivity suggested reduced visual cross-modal recruitment because of increased auditory experience.
2.2 Claim 1: cross-modal recruitment in partial hearing loss
We first examined evidence for visual cross-modal recruitment of auditory cortex in partial hearing loss. Research in animal models shows that visual recruitment occurs in higher-order, non-primary auditory cortex (Lomber et al., 2010; Meredith et al., 2011). Here, neurons are innervated by pre-existing “heteromodal” visual or somatosensory synaptic inputs that typically modulate auditory processing in an inhibitory fashion and affect the phase of ongoing neuronal oscillations (Kayser et al., 2008; Morrill and Hasenstaub, 2018). Kral and Sharma (2023) propose that deafness reduces input to higher-order cortical neurons, and homeostatic mechanisms cause heteromodal inputs to strengthen and transition from a modulatory to a driving role. This raises visual activity levels in higher-order auditory regions and results in cross-modal recruitment. Neural activity recorded from a rat model of noise-induced hearing loss also suggests that cross-modal recruitment can occur with partial hearing decline. Noise-exposed animals show an increase in the number of neurons responding to visual and audiovisual stimulation in higher-order auditory and multisensory cortex compared to typical hearing control rats (Schormans et al., 2017; Schormans and Allman, 2024). These data also show evidence for a transitional shift in the functional border between auditory and visual areas across sensory cortex, where auditory-responding regions recede and visual-responding regions expand. The expansion of visual-responding areas across sensory cortex with cross-modal recruitment might be expected to arise in human neuroimaging research as increased activity in auditory temporal regions anterior to visual cortex.
VEPs measured by scalp EEG cannot definitively resolve neural sources because volume conduction through skin and other tissues distorts the distribution of neuroelectric potentials. Underlying neural generators can only be approximated by using spatial filtering and modeling of tissue conductance. Of seven EEG studies that employed source analysis, two found that hearing loss was associated with increased right-sided or bilateral temporal cortex activation, consistent with evidence of cross-modal recruitment (Aguiar et al., 2025; Paul et al., 2025). Four additional studies using source analysis to resolve VEP generators found descriptive evidence for activation of temporal cortical areas (Campbell and Sharma, 2014, 2020; Glick and Sharma, 2020; Liang et al., 2020), but these studies did not statistically compare these activations across hearing loss and typical hearing groups. One additional study did not find statistically significant differences in auditory cortical source activation between hearing loss and typical hearing groups (Stropahl and Debener, 2017). Notably, no fMRI or fNIRS study in this review showed evidence for cross-modal recruitment. It is possible that visual cross-modal recruitment does not alter metabolic demands in auditory or multisensory regions, but this claim cannot be inferred from null results.
Taken together, only a few EEG studies showed statistical evidence for cross-modal recruitment in adult humans with partial hearing loss, while others only showed qualitative differences in activation. Note that source analysis in EEG studies does not rule out the possibility of “leakage” of intra-modal activity into the source activity. If the observed changes in VEP components with hearing loss result from cross-modal plasticity, then it is likely that intra-modal plasticity plays a substantial role. Resting-state fMRI analyses have shown increased resting-state functional connectivity in visual subnetworks in groups with hearing loss (Ponticorvo et al., 2022) and increased connectivity between visual and cognitive areas (Lian et al., 2025). These studies were not included in our main review because no visual stimulation was used, but they suggest that modulation of VEPs by hearing loss can arise through intra-modal compensation.
2.3 Claim 2: top-down influence on cross-modal plasticity
The heteromodal inputs into non-primary auditory cortex through which visual cross-modal recruitment is proposed to develop are top-down modulatory connections originating from high-level multimodal regions implicated in cognition (Ghazanfar and Schroeder, 2006). With sensory deprivation, these top-down heteromodal connections are proposed to upregulate through homeostatic compensatory plasticity and become the driving input, allowing these regions to perform their functional roles (Kral and Sharma, 2023). Cognitive neuromodulatory systems, such as the basal forebrain cholinergic system, are top-down and experience-dependent and have also been shown to gate cortical sensory plasticity after sensory loss (Ramanathan et al., 2009). In humans with deafness, visual cognitive tasks activate temporal cortex, providing evidence for top-down involvement in cross-modal plasticity (Cardin et al., 2018). In partial hearing loss, recruitment of frontal brain regions has been observed during auditory speech tasks and is hypothesized to reflect a compensatory increase in resources for cognitive functions such as effort (Pichora-Fuller et al., 2016) or working memory (Rönnberg et al., 2010). Cross-modal plasticity may also draw from compensatory cognitive resources.
Three studies in Table 1 showed evidence for frontal activation when visual stimulation was used. One used fMRI to show increased frontal activation when participants viewed visual sentences (Rosemann and Thiel, 2018), and two showed descriptively higher frontal activation sourced from VEPs evoked by a motion stimulus (Campbell and Sharma, 2020; Glick and Sharma, 2020). Another study did not find increased frontal activity in hearing loss when using a visual moving dot stimulus but did find higher resting-state connectivity between visual and frontal regions (Puschmann and Thiel, 2017). In 10 adults with partial hearing loss, Mai et al. (2024) used fNIRS to show that speech-in-noise training reduced frontotemporal connectivity during visual stimulation, suggesting that increased auditory activity levels attenuated cross-modal activity patterns. One final fMRI study that presented faces and images in a working memory task found no differences in frontal activation between hearing loss and control groups (Rosemann and Thiel, 2020). Therefore, with some exceptions, these studies agree with claims that cross-modal plasticity arises through top-down connections and implicates frontal cognitive brain networks.
Some studies in Table 1 were not conducted to test for cross-modal plasticity and were instead designed to identify evidence of cognitive decline. Hearing loss has been reliably shown to increase the risk of dementia and cognitive impairment (Deal et al., 2017; Lin et al., 2011, 2023), although no causal relationship has been established. One hypothesis suggests that hearing loss increases the demand on cognitive resources, which may include strategies to increase reliance on visual perception. However, long-term demands on cognitive resources may degrade cognitive flexibility and function, resulting in or exacerbating cognitive impairment and dementia (Slade et al., 2020). Some EEG studies in Table 1 used visual stimulation in attention and memory tasks to measure cognitive ability, because auditory stimulation would be affected by hearing loss. Zhu et al. (2024) found that worse visual attention performance correlated with smaller cortical N2 responses in older adults with partial hearing loss, consistent with cognitive decline. Loughrey et al. (2023) reported smaller amplitudes for the P3 and late positive potential in a group of adults with hearing loss, but no difference was found for performance on a working memory task. Loughrey et al. (2023) also found that the earlier cortical potentials P1 and N1 were higher in amplitude in the hearing loss group, which was interpreted as cross-modal plasticity. These data may suggest that cross-modal plasticity can occur alongside cognitive decline, but the absence of behavioral differences casts doubt on this interpretation.
Clarifying evidence was provided by Glick and Sharma (2020), who conducted a longitudinal study showing that several months of hearing aid use reversed EEG evidence of cross-modal plasticity. The authors also tracked how cognitive function changed before and after the intervention. At baseline, VEP evidence of cross-modal plasticity in participants with hearing loss was correlated with worse performance on multiple cognitive tasks, including processing speed and working memory. After 6 months of hearing aid use and the reversal of cross-modal plasticity, performance on the cognitive tasks improved. One interpretation is that cross-modal plasticity exhausted cognitive reserves and lowered cognitive performance, which could be alleviated with hearing restoration. However, it is also possible that cognitive effects and cross-modal VEP plasticity are independent and were indirectly related through their correlation with hearing loss.
Both cognitive decline and partial hearing loss are more common in older age, but little is known about age effects on cross-modal plasticity. Studies in Table 1 often include older adults but do not commonly attempt to model age-related effects separately from hearing loss effects. EEG studies show that VEP latencies increase with age, especially for the P1 response (Alain et al., 2022; Price et al., 2017; Shaw and Cant, 1980), but VEP latencies are expected to decrease with visual cross-modal plasticity due to increased visual efficiency (Campbell and Sharma, 2014). To address this, Aguiar et al. (2025) and Paul et al. (2025) examined effects of participant age using visual speech and visual text, respectively. Interestingly, both studies found statistically independent and inverse effects of age and hearing loss on the visual P2 response. Aguiar et al. (2025) found shorter latencies with hearing loss and longer latencies with age, whereas Paul et al. (2025) found the opposite: shorter latencies with age, and longer latencies with hearing loss. As mentioned, stimulus differences and task requirements could account for the inconsistent direction of these correlations, but these studies nonetheless show evidence for diverging effects of age and cross-modal plasticity on visual processing.
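One straightforward way to model age and hearing loss as separate predictors of a VEP measure is a multiple regression, as sketched below. The data are simulated with opposite-signed effects to mirror the independence of the two predictors reported above; the variable names, effect sizes, and units are hypothetical.

```python
# Sketch of modeling age and hearing loss as separate predictors of a VEP
# measure (here, P2 latency). Simulated ground truth: latency lengthens with
# age and shortens with hearing loss; all values are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n = 120
age = rng.uniform(20, 80, n)                      # years
pta = rng.uniform(0, 50, n)                       # pure-tone average, dB HL
p2_latency = 200 + 0.4 * age - 0.5 * pta + rng.normal(0, 8, n)   # ms

X = np.column_stack([np.ones(n), age, pta])       # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, p2_latency, rcond=None)
print(f"intercept = {beta[0]:.1f} ms")
print(f"age effect = {beta[1]:+.2f} ms/year")
print(f"hearing-loss effect = {beta[2]:+.2f} ms per dB HL")
```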
2.4 Claim 3: cross-modal plasticity and changes to visual behaviors
Cross-modal recruitment is believed to be responsible for enhanced perceptual abilities found in deaf or blind populations. Improvement in perceptual ability is thought to be constrained to functions that are shared between senses. This is because cross-modal recruitment develops through pre-existing neural connections, allowing sensory cortical areas to retain their functional role when switching their response preference from a deprived sense to a remaining sense, which overall supports “supra-modal” perception of objects or events (Kral and Sharma, 2023). Deprived regions do not take on new functions that are specific to remaining senses. In agreement, visual “unimodal” functions, such as color and contrast perception and foveal acuity, do not appear to improve with deafness [reviewed in Alencar et al. (2018)], as there is no auditory analogue. On the other hand, visual object motion detection and peripheral object tracking, which are both supported by auditory and visual systems, are enhanced in deaf humans (Codina et al., 2011; Shiell et al., 2014).
Causal evidence for a relationship between cross-modal plasticity and enhanced visual ability comes from the congenitally deaf cat. In the deaf cat auditory cortex, Lomber et al. (2010) found cross-modal recruitment in higher-order areas including the posterior auditory field (PAF) and dorsal zone (DZ). In hearing cats, these areas are, respectively, implicated in auditory localization and auditory motion detection. Congenitally deaf cats also show evidence for enhanced visual localization and visual motion detection. Lomber et al. (2010) used cooling loops to temporarily deactivate PAF and DZ. PAF deactivation abolished deaf cats’ visual localization advantage but not visual motion detection, and DZ deactivation abolished visual motion detection but not visual localization. This double dissociation provides strong causal evidence that cross-modal plasticity supports enhanced visual ability. Auditory cortical areas maintained their “supra-modal” role (object localization in PAF, and object motion in DZ) but utilized visual information delivered through heteromodal inputs to PAF and DZ to perform their function. A follow-up study in congenitally deaf cats confirmed this finding, showing that the auditory field of the anterior ectosylvian sulcus (FAES), a region normally involved in acoustic orienting in hearing cats, is visually reorganized in deaf cats. Temporary FAES deactivation using cooling loops caused visual orienting deficits in deaf cats, again showing a retention of function in cross-modally affected regions (Meredith et al., 2011).
However, no studies in our review showed enhanced visual function or a relationship between visual neural activity and behavioral performance on perceptual or cognitive tasks in partial adult-onset hearing loss (Table 1). In fact, Zhu et al. (2024) found worse visual attention in participants with poorer auditory function, which was interpreted as evidence of cognitive decline. While our review cannot corroborate the claim that cross-modal plasticity affects visual-only abilities in partial hearing loss, behavioral research shows evidence for stronger visual influences during audiovisual perception. Compared to people with typical hearing, those with hearing loss are more affected by visual distractors than auditory distractors (Puschmann et al., 2014), are more likely to experience illusory McGurk fusions of auditory and visual phonemes (Rosemann and Thiel, 2018; Stropahl and Debener, 2017), and show larger visual enhancements of neural speech tracking (Puschmann et al., 2019). In the brain, increased visual activation correlates with susceptibility to McGurk illusions (Rosemann and Thiel, 2018), and partial hearing loss is associated with increased connectivity between auditory and visual areas during audiovisual stimulation (Puschmann and Thiel, 2017).
Together, these studies suggest that cross-modal plasticity in partial hearing loss is more likely to result in altered visual modulation of auditory function, rather than altered visual function per se. At the neural level, visual heteromodal inputs to higher-order auditory cortical neurons may increase in strength but remain modulatory (or a mixture of modulatory and driving) in partial hearing loss because of residual auditory activity. When auditory deprivation rises to deafness, the absence of auditory input permits heteromodal inputs to fully drive auditory cortical neurons (Kral and Sharma, 2023). Visual modulation of auditory function may also occur in multisensory cortex in partial hearing loss, as shown in animal models (Schormans et al., 2017; Schormans and Allman, 2024).
It does not appear that cross-modal plasticity in partial hearing loss is associated with enhanced audiovisual abilities as is found in deafness. For example, some studies show reduced or no change in audiovisual perception in mild-to-moderate hearing loss compared to typical-hearing controls (Gieseler et al., 2018; Tye-Murray et al., 2007). In addition, research showing that McGurk illusions are experienced more frequently in partial hearing loss (Rosemann and Thiel, 2018; Stropahl and Debener, 2017) does not mean the visual influence is beneficial for accurate perception, as is sometimes inferred. Visual events are in fact more distracting (Puschmann et al., 2014). In rats, noise-induced hearing loss impaired perception of timing for audiovisual events (Schormans and Allman, 2019). Therefore, the putative increase in visual modulation of auditory perception with partial hearing loss is not beneficial per se. It is also possible that cross-modal plasticity benefits perceptual ability only if these abilities strengthen through experience. In other words, the visual benefits consequent to cross-modal plasticity in partial hearing loss are not spontaneous; exposure and training are needed to improve target functions. For example, adults with mild-to-moderate hearing loss who purposefully rely on visual information as a coping strategy may accrue a visual benefit over time through cross-modal plasticity, but those without long-term experience will not.
3 Conclusions and future directions
The objective of this review was to determine whether cross-modal plasticity in partial hearing loss is consistent with its manifestation described in deafness research. At a general level, we found that hearing loss was associated with altered visual neural activity as predicted by cross-modal plasticity, but the direction and specificity of effects were inconsistent. The first of three claims we surveyed was whether cross-modal recruitment was evident in partial hearing loss. We found that only a few human studies observed recruitment effects, and these findings were derived from EEG, a technique with limited spatial resolution. Second, the reviewed studies agreed with claims that cross-modal plasticity develops through top-down connections and implicates cognitive function. Third, there was no clear visual perceptual benefit linked to cross-modal plasticity in partial hearing loss as is found in deafness, although it is plausible that plasticity with partial loss is associated with stronger visual modulation of auditory function. No study that was reviewed established that cross-modal plasticity is categorically different between partial and total forms of sensory loss, but major claims such as recruitment and modified perceptual ability were not fully corroborated. In addition, study findings in partial hearing loss were unstable or were not often replicated. Overall, there is evidence that visual cortical activity is modulated by hearing sensitivity and experience in humans, but without additional research it may be premature to conclude that this reflects the cross-modal plasticity mechanisms known to occur in total sensory loss.
Future research can make several improvements to better test claims that the development of cross-modal plasticity relies on the same mechanisms in both total and partial sensory loss. First, research needs to establish reliable markers of cross-modal plasticity in human neuroimaging studies. For example, effects of hearing loss on VEPs were often identified but were not consistent across studies, even when the same stimulus was used. Multiple cross-modal plasticity studies in partial hearing loss and deafness use a stimulus in which an image of concentric circles in a contrast gradient is followed by another image in which the circles are radially modulated into a star (Aguiar et al., 2025; Campbell and Sharma, 2014, 2016, 2020; Doucet et al., 2005, 2006; Fullerton et al., 2023; Glick and Sharma, 2020). The alternation of images is designed to give rise to the impression of apparent motion (a circle morphing into a star), and motion perception has been shown to improve with cross-modal plasticity in deafness (Lomber et al., 2010). However, amplitude, latency, and scalp region effects across different data sets were inconsistent when this stimulus was used (Table 2). Cross-modal plasticity may also be stimulus-specific and sensitive to top-down demands, which creates challenges for developing a robust measure. A neural marker of cross-modal plasticity must have strong test–retest stability and also show the degree of cross-modal recruitment separately from intra-modal compensation. Finally, a neural marker would ideally correspond to a measurable change in behavior, so that it is possible to falsify the claim that cross-modal plasticity drives perceptual improvement in cases of partial hearing loss.
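As one example of what establishing such a marker could involve, the sketch below computes test–retest agreement for a hypothetical VEP-based marker across two sessions using simulated data. A full psychometric evaluation would additionally report intraclass correlations and measurement error, and the threshold mentioned in the comment is a common rule of thumb rather than an established criterion for cross-modal markers.

```python
# Minimal sketch of a test-retest check for a candidate neural marker
# (e.g., N1 amplitude from the circle-to-star stimulus). Data are simulated.
import numpy as np

rng = np.random.default_rng(3)
n_subjects = 40
true_marker = rng.normal(-5, 2, n_subjects)                # stable trait, in uV
session1 = true_marker + rng.normal(0, 1.0, n_subjects)    # session-specific noise
session2 = true_marker + rng.normal(0, 1.0, n_subjects)

r = np.corrcoef(session1, session2)[0, 1]
print(f"test-retest Pearson r = {r:.2f}")
# A marker used to track cross-modal plasticity (e.g., before vs. after hearing
# aid fitting) would ideally show r well above ~0.7 across sessions.
```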
A second recommendation is to increase use of longitudinal research designs. Only two studies used longitudinal or training designs in our review, and both suggested that altered visual cortical activity can diminish with hearing restoration or through auditory experience (Glick and Sharma, 2020; Mai et al., 2024). Little is known about how cross-modal plasticity evolves over the lifespan or with visual, not auditory, training. Addressing this gap would make it possible to reveal constraints on claims that cross-modal plasticity is flexible and dynamic. With respect to possible constraints, a third recommendation is to better understand how age and cognitive decline relate to cross-modal plasticity. Increased frontal recruitment accompanying visual cortical activity implies that it could draw from cognitive resources and contribute to cognitive decline. Do early stages of cognitive decline impair the development of cross-modal plasticity? Also, some of the reviewed studies suggest the development of cross-modal plasticity could be limited in older age. Understanding the limitations of cross-modal adaptation is essential for predicting potential perceptual benefits in groups with vulnerabilities.
A better understanding of cross-modal plasticity would prove useful in clinical settings. VEPs can potentially be used as an objective neural measure to track the efficacy of hearing aid use on sensory ability and cognitive function as a complement to behavioral testing (Glick and Sharma, 2020). Furthermore, training on lip reading is hypothesized to improve audiovisual speech perception, but the outcome of training for speech function in partial hearing loss is mixed (Bernstein et al., 2022; Schmitt et al., 2023). A better understanding of top-down cross-modal influences could help to design tasks or encourage coping strategies that maximize learning and uptake of visual information.
Author contributions
PA: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Validation, Writing – review & editing. JP: Formal Analysis, Investigation, Validation, Writing – review & editing. BP: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Supervision, Writing – original draft, Writing – review & editing.
Funding
The author(s) declare that financial support was received for the research and/or publication of this article. This research was supported by the Natural Sciences and Engineering Research Council of Canada, awarded to B.T.P. (RGPIN/03752–2021).
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The authors declare that no Gen AI was used in the creation of this manuscript.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Adam, N., and Collins, G. I. (1978). Late components of the visual evoked potential to search in short-term memory. Electroencephalogr. Clin. Neurophysiol. 44, 147–156. doi: 10.1016/0013-4694(78)90261-4
Aguiar, P. V., Williams, M. R., and Paul, B. T. (2025). Visual cortical responses in age-related hearing loss show evidence for compensatory neuroplasticity. [Preprint]
Alain, C., Chow, R., Lu, J., Rabi, R., Sharma, V. V., Shen, D., et al. (2022). Aging enhances neural activity in auditory, visual, and somatosensory cortices: the common cause revisited. J. Neurosci. 42, 264–275. doi: 10.1523/JNEUROSCI.0864-21.2021
Alencar, C. D. C., Butler, B. E., and Lomber, S. G. (2018). What and how the deaf brain sees. J. Cogn. Neurosci. 31, 1091–1109. doi: 10.1162/jocn_a_01425
Arroyo, S., Lesser, R. P., Poon, W. T., Robert, W., Webber, S., and Gordon, B. (1997). Neuronal generators of visual evoked potentials in humans: visual processing in the human cortex. Epilepsia 38, 600–610. doi: 10.1111/j.1528-1157.1997.tb01146.x
Bavelier, D., and Neville, H. J. (2002). Cross-modal plasticity: where and how? Nat. Rev. Neurosci. 3, 443–452. doi: 10.1038/nrn848
Bernstein, L. E., Auer, E. T., and Eberhardt, S. P. (2022). During lipreading training with sentence stimuli, feedback controls learning and generalization to audiovisual speech in noise. Am. J. Audiol. 31, 57–77. doi: 10.1044/2021_AJA-21-00034
Brooks, C. J., Chan, Y. M., Anderson, A. J., and McKendrick, A. M. (2018). Audiovisual temporal perception in aging: the role of multisensory integration and age-related sensory loss. Front. Hum. Neurosci. 12:192. doi: 10.3389/fnhum.2018.00192
Campbell, J., and Sharma, A. (2014). Cross-modal re-organization in adults with early stage hearing loss. PLoS One 9:e90594. doi: 10.1371/journal.pone.0090594
Campbell, J., and Sharma, A. (2016). Visual cross-modal re-organization in children with cochlear implants. PLoS One 11. doi: 10.1371/journal.pone.0147793
Campbell, J., and Sharma, A. (2020). Frontal cortical modulation of temporal visual cross-modal re-organization in adults with hearing loss. Brain Sci. 10:498. doi: 10.3390/brainsci10080498
Cardin, V., Rudner, M., De Oliveira, R. F., Andin, J., Su, M. T., Beese, L., et al. (2018). The organization of working memory networks is shaped by early sensory experience. Cereb. Cortex 28, 3540–3554. doi: 10.1093/cercor/bhx222
Choi, A., Kim, H., Jo, M., Kim, S., Joung, H., Choi, I., et al. (2024). The impact of visual information in speech perception for individuals with hearing loss: a mini review. Front. Psychol. 15:1399084. doi: 10.3389/fpsyg.2024.1399084
Codina, C., Pascalis, O., Mody, C., Toomey, P., Rose, J., Gummer, L., et al. (2011). Visual advantage in deaf adults linked to retinal changes. PLoS One 6:e20417. doi: 10.1371/journal.pone.0020417
Deal, J. A., Betz, J., Yaffe, K., Harris, T., Purchase-Helzner, E., Satterfield, S., et al. (2017). Hearing impairment and incident dementia and cognitive decline in older adults: the health ABC study. J. Gerontol. A Biol. Sci. Med. Sci. 72, 703–709. doi: 10.1093/gerona/glw069
Doucet, M. E., Bergeron, F., Lassonde, M., Ferron, P., and Lepore, F. (2006). Cross-modal reorganization and speech perception in cochlear implant users. Brain 129, 3376–3383. doi: 10.1093/brain/awl264
Doucet, M. E., Gosselin, F., Lassonde, M., Guillemot, J. P., and Lepore, F. (2005). Development of visual-evoked potentials to radially modulated concentric patterns. Neuroreport 16, 1753–1756. doi: 10.1097/01.wnr.0000185011.91197.58
Eason, R. G. (1981). Visual evoked potential correlates of early neural filtering during selective attention. Bull. Psychon. Soc. 18, 203–206. doi: 10.3758/BF03333604
Ewall, G., Parkins, S., Lin, A., Jaoui, Y., and Lee, H. K. (2021). Cortical and subcortical circuits for cross-modal plasticity induced by loss of vision. Front. Neural Circuits 15:665009. doi: 10.3389/fncir.2021.665009
Finney, E. M., Fine, I., and Dobkins, K. R. (2001). Visual stimuli activate auditory cortex in the deaf. Nat. Neurosci. 4, 1171–1173. doi: 10.1038/nn763
Freunberger, R., Klimesch, W., Doppelmayr, M., and Höller, Y. (2007). Visual P2 component is related to theta phase-locking. Neurosci. Lett. 426, 181–186. doi: 10.1016/j.neulet.2007.08.062
Fullerton, A. M., Vickers, D. A., Luke, R., Billing, A. N., Mcalpine, D., Hernandez-Perez, H., et al. (2023). Cross-modal functional connectivity supports speech understanding in cochlear implant users. Cereb. Cortex 33, 3350–3371. doi: 10.1093/cercor/bhac277
Ghazanfar, A. A., and Schroeder, C. E. (2006). Is neocortex essentially multisensory? Trends Cogn. Sci. 10, 278–285. doi: 10.1016/j.tics.2006.04.008
Gieseler, A., Tahden, M. A. S., Thiel, C. M., and Colonius, H. (2018). Does hearing aid use affect audiovisual integration in mild hearing impairment? Exp. Brain Res. 236, 1161–1179. doi: 10.1007/s00221-018-5206-6
Glick, H. A., and Sharma, A. (2020). Cortical neuroplasticity and cognitive function in early-stage, mild-moderate hearing loss: evidence of neurocognitive benefit from hearing aid use. Front. Neurosci. 14:93. doi: 10.3389/fnins.2020.00093
Gougoux, F., Lepore, F., Lassonde, M., Voss, P., Zatorre, R. J., and Belin, P. (2004). Pitch discrimination in the early blind. Nature 430:309. doi: 10.1038/430309a
Haile, L. M., Kamenov, K., Briant, P. S., Orji, A. U., Steinmetz, J. D., Abdoli, A., et al. (2021). Hearing loss prevalence and years lived with disability, 1990–2019: findings from the global burden of disease study 2019. Lancet 397, 996–1009. doi: 10.1016/S0140-6736(21)00516-X
Harter, M. R., Aine, C., and Schroeder, C. (1982). Hemispheric differences in the neural processing of stimulus location and type: effects of selective attention on visual evoked potentials. Neuropsychologia 20, 421–438. doi: 10.1016/0028-3932(82)90041-0
Kayser, C., Petkov, C. I., and Logothetis, N. K. (2008). Visual modulation of neurons in auditory cortex. Cereb. Cortex 18, 1560–1574. doi: 10.1093/cercor/bhm187
Kral, A. (2007). Unimodal and cross-modal plasticity in the “deaf” auditory cortex. Int. J. Audiol. 46, 479–493. doi: 10.1080/14992020701383027
Kral, A., and Sharma, A. (2023). Crossmodal plasticity in hearing loss. Trends Neurosci. 46, 377–393. doi: 10.1016/j.tins.2023.02.004
Kujala, T., Alho, K., Huotilainen, M., Ilmoniemi, R. J., Lehtokoski, A., Leinonen, A., et al. (1997). Electrophysiological evidence for cross-modal plasticity in humans with early- and late-onset blindness. Psychophysiology 34, 213–216. doi: 10.1111/j.1469-8986.1997.tb02134.x
Lian, W., Zhang, L., Wang, A., Huang, R., Zhang, H., Bao, X., et al. (2025). The default mode network and visual network functional connectivity changes in noise-induced hearing loss patients: a resting-state fMRI study. Brain Behav. 15:e70465. doi: 10.1002/brb3.70465
Liang, M., Liu, J., Cai, Y., Zhao, F., Chen, S., Chen, L., et al. (2020). Event-related potential evidence of enhanced visual processing in auditory-associated cortex in adults with hearing loss. Audiol. Neurotol. 25, 237–248. doi: 10.1159/000505608
Lin, F. R., Metter, E. J., O’Brien, R. J., Resnick, S. M., Zonderman, A. B., and Ferrucci, L. (2011). Hearing loss and incident dementia. Arch. Neurol. 68, 214–220. doi: 10.1001/archneurol.2010.362
Lin, F. R., Pike, J. R., Albert, M. S., Arnold, M., Burgard, S., Chisolm, T., et al. (2023). Hearing intervention versus health education control to reduce cognitive decline in older adults with hearing loss in the USA (achieve): a multicentre, randomised controlled trial. Lancet 402, 786–797. doi: 10.1016/S0140-6736(23)01406-X
Lomber, S. G., Meredith, M. A., and Kral, A. (2010). Cross-modal plasticity in specific auditory cortices underlies visual compensations in the deaf. Nat. Neurosci. 13, 1421–1427. doi: 10.1038/nn.2653
Loughrey, D. G., Jordan, C., Ibanez, A., Parra, M. A., Lawlor, B. A., and Reilly, R. B. (2023). Age-related hearing loss associated with differences in the neural correlates of feature binding in visual working memory. Neurobiol. Aging 132, 233–245. doi: 10.1016/j.neurobiolaging.2023.09.016
Mai, G., Jiang, Z., Wang, X., Tachtsidis, I., and Howell, P. (2024). Neuroplasticity of speech-in-noise processing in older adults assessed by functional near-infrared spectroscopy (fNIRS). Brain Topogr. 37, 1139–1157. doi: 10.1007/s10548-024-01070-2
Makin, T. R., and Krakauer, J. W. (2023). Against cortical reorganisation. eLife 12:e84716. doi: 10.7554/eLife.84716
Meredith, M. A., Kryklywy, J., McMillan, A. J., Malhotra, S., Lum-Tai, R., and Lomber, S. G. (2011). Crossmodal reorganization in the early deaf switches sensory, but not behavioral roles of auditory cortex. Proc. Natl. Acad. Sci. USA 108, 8856–8861. doi: 10.1073/pnas.1018519108
Morrill, R. J., and Hasenstaub, A. R. (2018). Visual information present in infragranular layers of mouse auditory cortex. J. Neurosci. 38, 2854–2862. doi: 10.1523/JNEUROSCI.3102-17.2018
Paul, B. T., Srikanthanathan, A., Daien, M., and Dimitrijevic, A. (2025). Association between high-frequency hearing sensitivity and visual cross-modal plasticity in typical-hearing adults. bioRxiv, 1–21. doi: 10.1101/2025.04.28.648945
Pichora-Fuller, M. K., Kramer, S. E., Eckert, M. A., Edwards, B., Hornsby, B. W. Y., Humes, L. E., et al. (2016). Hearing impairment and cognitive energy: the framework for understanding effortful listening (FUEL). Ear Hear. 37, 5S–27S. doi: 10.1097/aud.0000000000000312
Ponticorvo, S., Manara, R., Cassandro, E., Canna, A., Scarpa, A., Troisi, D., et al. (2022). Cross-modal connectivity effects in age-related hearing loss. Neurobiol. Aging 111, 1–13. doi: 10.1016/j.neurobiolaging.2021.09.024
Price, D., Tyler, L. K., Neto Henriques, R., Campbell, K. L., Williams, N., Treder, M. S., et al. (2017). Age-related delay in visual and auditory evoked responses is mediated by white-and grey-matter differences. Nat. Commun. 8:15671. doi: 10.1038/ncomms15671
Puschmann, S., Daeglau, M., Stropahl, M., Mirkovic, B., Rosemann, S., Thiel, C. M., et al. (2019). Hearing-impaired listeners show increased audiovisual benefit when listening to speech in noise. NeuroImage 196, 261–268. doi: 10.1016/j.neuroimage.2019.04.017
Puschmann, S., Sandmann, P., Bendixen, A., and Thiel, C. M. (2014). Age-related hearing loss increases cross-modal distractibility. Hear. Res. 316, 28–36. doi: 10.1016/j.heares.2014.07.005
Puschmann, S., and Thiel, C. M. (2017). Changed crossmodal functional connectivity in older adults with hearing loss. Cortex 86, 109–122. doi: 10.1016/j.cortex.2016.10.014
Ramanathan, D., Tuszynski, M. H., and Conner, J. M. (2009). The basal forebrain cholinergic system is required specifically for behaviorally mediated cortical map plasticity. J. Neurosci. 29, 5992–6000. doi: 10.1523/JNEUROSCI.0230-09.2009
Rönnberg, J., Rudner, M., Lunner, T., and Zekveld, A. (2010). When cognition kicks in: working memory and speech understanding in noise. Noise Health 12, 263–269. doi: 10.4103/1463-1741.70505
Rosemann, S., and Thiel, C. M. (2018). Audio-visual speech processing in age-related hearing loss: stronger integration and increased frontal lobe recruitment. NeuroImage 175, 425–437. doi: 10.1016/j.neuroimage.2018.04.023
Rosemann, S., and Thiel, C. M. (2020). Neural signatures of working memory in age-related hearing loss. Neuroscience 429, 134–142. doi: 10.1016/j.neuroscience.2019.12.046
Sadato, N., Pascual-Leone, A., Grafman, J., Ibañez, V., Deiber, M. P., Dold, G., et al. (1996). Activation of the primary visual cortex by braille reading in blind subjects. Nature 380, 526–528. doi: 10.1038/380526a0
Schmitt, R., Meyer, M., and Giroud, N. (2023). Improvements in naturalistic speech-in-noise comprehension in middle-aged and older adults after 3 weeks of computer-based speechreading training. NPJ Sci. Learn. 8:32. doi: 10.1038/s41539-023-00179-6
Schormans, A. L., and Allman, B. L. (2019). Compensatory plasticity in the lateral extrastriate visual cortex preserves audiovisual temporal processing following adult-onset hearing loss. Neural Plast. 2019, 1–20. doi: 10.1155/2019/7946987
Schormans, A. L., and Allman, B. L. (2024). Layer-specific enhancement of visual-evoked activity in the audiovisual cortex following a mild degree of hearing loss in adult rats. Hear. Res. 450:109071. doi: 10.1016/j.heares.2024.109071
Schormans, A. L., Typlt, M., and Allman, B. L. (2017). Crossmodal plasticity in auditory, visual and multisensory cortical areas following noise-induced hearing loss in adulthood. Hear. Res. 343, 92–107. doi: 10.1016/j.heares.2016.06.017
Seol, H. Y., Kang, S., Kim, S., Kim, J., Kim, E., Hong, S. H., et al. (2024). P1 and N1 characteristics in individuals with normal hearing and hearing loss, and Cochlear implant users: a pilot study. J. Clin. Med. 13:4941. doi: 10.3390/jcm13164941
Shaw, N. A., and Cant, B. R. (1980). Age-dependent changes in the latency of the pattern visual evoked potential. Electroencephalogr. Clin. Neurophysiol. 48, 237–241. doi: 10.1016/0013-4694(80)90310-7
Shende, S. A., Jones, S. E., and Mudar, R. A. (2024). Alpha and theta oscillations on a visual strategic processing task in age-related hearing loss. Front. Neurosci. 18:1382613. doi: 10.3389/fnins.2024.1382613
Shiell, M. M., Champoux, F., and Zatorre, R. J. (2014). Enhancement of visual motion detection thresholds in early deaf people. PLoS One 9:e90498. doi: 10.1371/journal.pone.0090498
Slade, K., Plack, C. J., and Nuttall, H. E. (2020). The effects of age-related hearing loss on the brain and cognitive function. Trends Neurosci. 43, 810–821. doi: 10.1016/j.tins.2020.07.005
Stropahl, M., and Debener, S. (2017). Auditory cross-modal reorganization in cochlear implant users indicates audio-visual integration. Neuroimage Clin. 16, 514–523. doi: 10.1016/j.nicl.2017.09.001
Turrigiano, G. G. (2008). The self-tuning neuron: synaptic scaling of excitatory synapses. Cell 135, 422–435. doi: 10.1016/j.cell.2008.10.008
Tye-Murray, N., Sommers, M. S., and Spehar, B. (2007). Audiovisual integration and lipreading abilities of older adults with normal and impaired hearing. Ear Hear. 28, 656–668. doi: 10.1097/AUD.0b013e31812f7185
Vetter, P., Bola, Ł., Reich, L., Bennett, M., Muckli, L., and Amedi, A. (2020). Decoding natural sounds in early “visual” cortex of congenitally blind individuals. Curr. Biol. 30, 3039–3044.e2. doi: 10.1016/j.cub.2020.05.071
Voss, P. (2016). Auditory spatial perception without vision. Front. Psychol. 7:1960. doi: 10.3389/fpsyg.2016.01960
Keywords: cross-modal plasticity, hearing loss, neuroimaging, visual perception, EEG, fMRI, fNIRS
Citation: Aguiar PV, Preman J and Paul BT (2025) Cross-modal neuroplasticity in partial hearing loss: a mini-review. Front. Neurosci. 19:1627888. doi: 10.3389/fnins.2025.1627888
Edited by:
Fatima T. Husain, University of Illinois at Urbana-Champaign, United States
Reviewed by:
Alessio Fracasso, University of Padua, Italy
Ricky Kaplan Neeman, Tel Aviv University, Israel
Copyright © 2025 Aguiar, Preman and Paul. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Brandon T. Paul, btpaul@torontomu.ca