๐ค AI Summary
This work addresses the limited cross-lingual and morphological rule generalization of Transformers in inflectional morphology tasks, investigating their implicit modeling of phonological and subphonemic featuresโsuch as final devoicing and vowel harmony. We propose a language-agnostic phonological probing framework, applying layer-wise linear probes and controlled ablation experiments across seven morphologically rich languages using the UniMorph dataset. Our analysis systematically disentangles the representational roles of phoneme embeddings versus encoder layers in capturing local phonological phenomena versus long-range phonological dependencies. We find, for the first time, that local phonological features (e.g., Turkish final devoicing) are predominantly encoded in the phoneme embedding layer, whereas long-range constraints (e.g., vowel harmony) are principally captured by deeper encoder layers. This division of labor provides both a novel training paradigm and theoretical foundation for developing more phonologically aware and generalizable morphological models.
๐ Abstract
Transformers have achieved state-of-the-art performance in morphological inflection tasks, yet their ability to generalize across languages and morphological rules remains limited. One possible explanation for this behavior can be the degree to which these models are able to capture implicit phenomena at the phonological and subphonemic levels. We introduce a language-agnostic probing method to investigate phonological feature encoding in transformers trained directly on phonemes, and perform it across seven morphologically diverse languages. We show that phonological features which are local, such as final-obstruent devoicing in Turkish, are captured well in phoneme embeddings, whereas long-distance dependencies like vowel harmony are better represented in the transformer's encoder. Finally, we discuss how these findings inform empirical strategies for training morphological models, particularly regarding the role of subphonemic feature acquisition.