
On the Acquisition of Shared Grammatical Representations in Bilingual Language Models

March 5, 2025
Authors: Catherine Arnett, Tyler A. Chang, James A. Michaelov, Benjamin K. Bergen
cs.AI

Abstract

While crosslingual transfer is crucial to contemporary language models' multilingual capabilities, how it occurs is not well understood. In this paper, we ask what happens to a monolingual language model when it begins to be trained on a second language. Specifically, we train small bilingual models for which we control the amount of data for each language and the order of language exposure. To find evidence of shared multilingual representations, we turn to structural priming, a method used to study grammatical representations in humans. We first replicate previous crosslingual structural priming results and find that, after controlling for training data quantity and language exposure, there are asymmetrical effects across language pairs and directions. We argue that this asymmetry may shape hypotheses about human structural priming effects. We also find that structural priming effects are less robust for less similar language pairs, highlighting potential limitations of crosslingual transfer learning and shared representations for typologically diverse languages.
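
For context on the structural priming paradigm the abstract refers to, the sketch below shows one common way such effects are measured in language models: compare the log-probability a model assigns to a target sentence after a structurally congruent prime versus an incongruent one. This is an illustrative sketch only, not the authors' experimental setup; the model name ("gpt2" as a stand-in for the paper's small bilingual models) and the example dative sentences are assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Stand-in model; the paper trains its own small bilingual models.
MODEL_NAME = "gpt2"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()


def target_logprob(prime: str, target: str) -> float:
    """Sum of token log-probabilities of `target` conditioned on `prime`."""
    prime_ids = tokenizer(prime, return_tensors="pt").input_ids
    target_ids = tokenizer(" " + target, return_tensors="pt").input_ids
    input_ids = torch.cat([prime_ids, target_ids], dim=1)

    with torch.no_grad():
        logits = model(input_ids).logits  # [1, seq_len, vocab]
    log_probs = torch.log_softmax(logits, dim=-1)

    total = 0.0
    n_prime = prime_ids.shape[1]
    for i in range(target_ids.shape[1]):
        pos = n_prime + i                     # position of this target token
        token_id = input_ids[0, pos]
        total += log_probs[0, pos - 1, token_id].item()  # predicted from prior context
    return total


# Hypothetical prime/target items for the dative alternation.
po_prime = "The chef gave a book to the waiter."    # prepositional-object (PO) prime
do_prime = "The chef gave the waiter a book."       # double-object (DO) prime
po_target = "The girl sent a letter to her friend."  # PO target

# A positive difference indicates a priming effect: the PO target is more
# probable after a structurally congruent (PO) prime than after a DO prime.
priming_effect = target_logprob(po_prime, po_target) - target_logprob(do_prime, po_target)
print(f"Structural priming effect (log-prob difference): {priming_effect:.4f}")
```

In the crosslingual version of this design, the prime and target sentences would be in different languages, so a reliable effect suggests the model shares grammatical representations across those languages.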
