AUTHOR=Chen Yi-Chuan, Ku Ang-Ke, Huang Pi-Chun
TITLE=Examining auditory modulations on detecting and pooling visual global motion
JOURNAL=Frontiers in Psychology
VOLUME=16
YEAR=2025
URL=https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2025.1522618
DOI=10.3389/fpsyg.2025.1522618
ISSN=1664-1078
ABSTRACT=Introduction: Multisensory signals often interact to reduce perceptual uncertainty in the environment. However, the effects and mechanisms underlying audiovisual interactions in motion perception remain unclear. In this study, we adopted the method of constant stimuli and the equivalent noise paradigm to investigate whether and how auditory motion influences the perception of visual global motion. Methods: The visual stimuli consisted of dots moving either up-left or up-right, with motion directions sampled from a normal distribution at five levels of standard deviation. The auditory stimuli were white noise moving either laterally (leftward or rightward; Experiment 1) or diagonally (up-left or up-right; Experiment 2), forming a coarse congruent or incongruent directional relationship with the visual motion trajectories. Stationary and no-sound conditions were also included. The auditory signals were task-irrelevant and presented in spatial proximity to, but not fully overlapping with, the visual stimuli. Participants had to discriminate the direction of the visual global motion. Results and discussion: After accounting for or eliminating the bias induced by auditory motion at the decisional level, the thresholds of visual motion perception were found to be similar across the four auditory conditions. Further analysis using the equivalent noise model confirmed that auditory motion did not influence the detection or pooling of visual motion signals.
Hence, we did not find evidence to support the notion that auditory motion modulates the sensory or perceptual processing of visual global motion, delineating a boundary condition for such crossmodal interactions.
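The equivalent noise paradigm mentioned in the abstract rests on a standard two-parameter model: the observer's discrimination threshold is limited by internal (additive) noise at low stimulus variability and by the number of local motion samples pooled at high variability. The sketch below illustrates that standard formulation; the function name and parameter values are illustrative assumptions, not values from the paper.

```python
import math

def en_threshold(sigma_ext, sigma_int, n_samples):
    """Equivalent noise model: predicted discrimination threshold
    given external noise sigma_ext (SD of the stimulus direction
    distribution), internal noise sigma_int, and the effective
    number of pooled local samples n_samples.
    threshold = sqrt((sigma_int^2 + sigma_ext^2) / n_samples)
    """
    return math.sqrt((sigma_int ** 2 + sigma_ext ** 2) / n_samples)

# Low external noise: threshold ~ sigma_int / sqrt(n) (detection-limited).
# High external noise: threshold rises with sigma_ext (pooling-limited).
for s_ext in (0.0, 1.0, 4.0, 16.0):
    print(f"sigma_ext={s_ext:5.1f} -> threshold={en_threshold(s_ext, 2.0, 4):.2f}")
```

Fitting sigma_int and n_samples separately for each auditory condition is what allows the paradigm to attribute any crossmodal effect either to local detection (a change in internal noise) or to global pooling (a change in the number of samples integrated); here, neither parameter differed across conditions.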