AUTHOR=Jágrová, Klára; Hedderich, Michael; Mosbach, Marius; Avgustinova, Tania; Klakow, Dietrich
TITLE=On the Correlation of Context-Aware Language Models With the Intelligibility of Polish Target Words to Czech Readers
JOURNAL=Frontiers in Psychology
VOLUME=12
YEAR=2021
URL=https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2021.662277
DOI=10.3389/fpsyg.2021.662277
ISSN=1664-1078
ABSTRACT=This contribution seeks to provide a rational probabilistic explanation for the intelligibility of words in a genetically related language that is unknown to the reader – a phenomenon referred to as intercomprehension. In this research domain, linguistic distance, among other factors, has been shown to correlate well with the mutual intelligibility of individual words. However, the role of context in the intelligibility of target words in sentences has been addressed in only a few studies. To fill this gap, we analyze data from web-based experiments in which Czech respondents were asked to translate highly predictable target words in the final position of Polish sentences. We compare correlations of target-word intelligibility with data from 3-gram language models to correlations with data obtained from context-aware language models. More specifically, we evaluate two context-aware language model architectures: LSTMs, which can, in principle, take arbitrarily long-distance dependencies into account, and Transformer-based language models, which can access the whole input sequence at once. We investigate how their use of context affects surprisal and its correlation with intelligibility.
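The abstract's central quantity is the surprisal of a target word given its sentence context, here compared across 3-gram and context-aware models. As a minimal sketch (not the authors' code), the snippet below computes trigram surprisal, -log2 P(target | two preceding words), from a toy corpus with add-one smoothing; the corpus and function names are illustrative assumptions only:

```python
# Hypothetical illustration of trigram surprisal, one of the predictors
# the study correlates with target-word intelligibility. The toy corpus
# and add-one smoothing are assumptions for the sketch, not the paper's setup.
import math
from collections import Counter

corpus = "the cat sat on the mat the dog sat on the rug".split()

# Count trigrams and their bigram histories.
trigrams = Counter(zip(corpus, corpus[1:], corpus[2:]))
bigrams = Counter(zip(corpus, corpus[1:]))
vocab = set(corpus)

def surprisal(w1, w2, target):
    """-log2 P(target | w1 w2) with add-one (Laplace) smoothing."""
    p = (trigrams[(w1, w2, target)] + 1) / (bigrams[(w1, w2)] + len(vocab))
    return -math.log2(p)

# A frequently observed continuation carries lower surprisal than an unseen one,
# mirroring the intuition that predictable sentence-final targets should be
# easier for readers to decode.
print(surprisal("sat", "on", "the"))   # seen twice after "sat on"
print(surprisal("sat", "on", "cat"))   # never seen after "sat on"
```

A context-aware model (LSTM or Transformer) would replace the trigram table with a learned conditional distribution over the full preceding sentence, but the surprisal definition itself is unchanged.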