%0 Journal Article
%A Higgen, Focko L.
%A Ruppel, Philipp
%A Görner, Michael
%A Kerzel, Matthias
%A Hendrich, Norman
%A Feldheim, Jan
%A Wermter, Stefan
%A Zhang, Jianwei
%A Gerloff, Christian
%T Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study
%! Crossmodal integration in humans and robots
%J Frontiers in Robotics and AI
%V 7
%D 2020
%8 2020-December-23
%9 Original Research
%G English
%K Aging, Braille, multisensory, neural networks, ANN
%R 10.3389/frobt.2020.540565
%U https://www.frontiersin.org/articles/10.3389/frobt.2020.540565
%@ 2296-9144
%X The quality of crossmodal perception hinges on two factors: the accuracy of independent unimodal perception and the ability to integrate information from different sensory systems. In humans, the ability for cognitively demanding crossmodal perception diminishes from young to old age. Here, we propose a new approach to investigate to which degree these factors contribute to crossmodal processing and its age-related decline by replicating a medical study on visuo-tactile crossmodal pattern discrimination, utilizing state-of-the-art tactile sensing technology and artificial neural networks (ANN). We implemented two ANN models to specifically focus on the relevance of early integration of sensory information along the crossmodal processing stream, a mechanism proposed for efficient processing in the human brain. Applying an adaptive staircase procedure, we approached comparable unimodal classification performance for both modalities in the human participants as well as the ANN. This allowed us to compare crossmodal performance between and within the systems, independent of the underlying unimodal processes. Our data show that the unimodal classification accuracies of the tactile sensing technology are comparable to those of humans. For crossmodal discrimination in the ANN, integrating high-level unimodal features at earlier stages of the crossmodal processing stream yields higher accuracies than the late integration of independent unimodal classifications. In comparison to humans, the ANN shows higher accuracies than older participants in both the unimodal and the crossmodal condition, but lower accuracies than younger participants in the crossmodal task. Taken together, we show that state-of-the-art tactile sensing technology can perform a complex tactile recognition task at levels comparable to humans. For crossmodal processing, human-inspired early sensory integration seems to improve the performance of artificial neural networks. Still, younger participants seem to employ more efficient crossmodal integration mechanisms than those modeled in the proposed ANN. Our work demonstrates how collaborative research in neuroscience and embodied artificial neurocognitive modeling can help derive models that inform the design of future neurocomputational architectures.