AUTHOR=Noh Kangsan, Oh Eunjeong, Song Sanghoun
TITLE=Testing language models’ syntactic sensitivity to grammatical constraints: a case study of wanna contraction
JOURNAL=Frontiers in Communication
VOLUME=9
YEAR=2024
URL=https://www.frontiersin.org/journals/communication/articles/10.3389/fcomm.2024.1442093
DOI=10.3389/fcomm.2024.1442093
ISSN=2297-900X
ABSTRACT=Wanna contraction refers to the reduction of want to to wanna. Interestingly, native English speakers contract want to in object extraction questions but not in subject extraction questions. The present study investigated whether language models such as Bidirectional Encoder Representations from Transformers (BERT) adhere to this grammatical subtlety. Wanna contraction involves two factors: subject–object asymmetry and contraction. Disentangling these two ensures that when language models accurately identify illicit instances of wanna contraction, the detection stems from their understanding of the contraction rather than from interference by subject–object asymmetry. To this end, we conducted three independent experiments. We tested whether language models detect illicit cases of contraction by holding contraction constant (Experiment 1) and by holding question type constant (Experiment 2). We predicted that higher surprisal values would be assigned to ungrammatical instances. The overall results of the two experiments were in line with this prediction (87.5% and 75%, respectively). In addition, the by-word surprisal analysis indicates that the models assign higher surprisal values to subject extraction questions in illicit wanna instances (Experiment 3). Thus, the models’ processing of wanna contraction turns out to be close to that of native English speakers, suggesting their potential as a research tool in linguistic experiments.
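As a rough illustration of the surprisal-based evaluation the abstract describes, the sketch below estimates per-token surprisal with a masked language model via the HuggingFace transformers library. It is a minimal sketch, not the authors' pipeline: the checkpoint (bert-base-uncased), the masked-token surprisal procedure, and the example sentence pair are assumptions for illustration, not the paper's experimental items or exact method.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL_NAME = "bert-base-uncased"  # assumption: any BERT-family checkpoint could be substituted
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)
model.eval()

def token_surprisals(sentence):
    """Mask each token in turn and return (token, surprisal in bits) pairs."""
    input_ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    results = []
    for i in range(1, input_ids.size(0) - 1):          # skip [CLS] and [SEP]
        masked = input_ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        log_probs = torch.log_softmax(logits, dim=-1)
        bits = -(log_probs[input_ids[i]] / torch.log(torch.tensor(2.0))).item()
        results.append((tokenizer.convert_ids_to_tokens(input_ids[i].item()), bits))
    return results

# Licit (object extraction) vs. illicit (subject extraction) contraction;
# these sentences are illustrative only.
for sentence in ["Who do you wanna invite?", "Who do you wanna win the race?"]:
    print(sentence)
    for token, bits in token_surprisals(sentence):
        print(f"  {token:>10s}  {bits:6.2f}")
```

Under the prediction stated in the abstract, higher surprisal on the contracted region of the subject extraction question than on the object extraction question would indicate sensitivity to the constraint on wanna contraction.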