Probing Subphonemes in Morphology Models

📅 2025-05-16
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the limited cross-lingual and morphological-rule generalization of Transformers on inflectional morphology tasks by investigating how they implicitly model phonological and subphonemic features, such as final devoicing and vowel harmony. The authors propose a language-agnostic phonological probing framework, applying layer-wise linear probes and controlled ablation experiments across seven morphologically rich languages using the UniMorph dataset. The analysis disentangles the representational roles of phoneme embeddings versus encoder layers in capturing local phonological phenomena and long-range phonological dependencies. They find that local phonological features (e.g., Turkish final devoicing) are predominantly encoded in the phoneme embedding layer, whereas long-range constraints (e.g., vowel harmony) are principally captured by deeper encoder layers. This division of labor informs training strategies and provides theoretical grounding for developing more phonologically aware and generalizable morphological models.
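
As a rough sketch of what such a layer-wise linear probing setup could look like (not the authors' released code), the snippet below fits a logistic-regression probe on frozen per-phoneme representations from each layer, with layer 0 standing in for the phoneme embedding layer, and compares held-out accuracy. The representations and feature labels are random stand-ins.

```python
# Minimal sketch of a layer-wise linear probe over a phoneme-level transformer.
# The representations below are random stand-ins; in practice they would be the
# frozen activations of the embedding layer (layer 0) and each encoder layer.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_tokens, d_model, n_layers = 2000, 128, 6

# reps[l][i]: representation of phoneme token i at layer l.
reps = [rng.normal(size=(n_tokens, d_model)) for _ in range(n_layers + 1)]
# Binary phonological feature per token (e.g. [+/- voice]); random here.
labels = rng.integers(0, 2, size=n_tokens)

for layer, X in enumerate(reps):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, labels, test_size=0.2, random_state=0
    )
    probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(f"layer {layer}: held-out probe accuracy = {probe.score(X_te, y_te):.3f}")
```

Comparing probe accuracy across layers is what allows a feature to be localized to the embedding layer versus deeper encoder layers.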

📝 Abstract
Transformers have achieved state-of-the-art performance in morphological inflection tasks, yet their ability to generalize across languages and morphological rules remains limited. One possible explanation for this behavior can be the degree to which these models are able to capture implicit phenomena at the phonological and subphonemic levels. We introduce a language-agnostic probing method to investigate phonological feature encoding in transformers trained directly on phonemes, and perform it across seven morphologically diverse languages. We show that phonological features which are local, such as final-obstruent devoicing in Turkish, are captured well in phoneme embeddings, whereas long-distance dependencies like vowel harmony are better represented in the transformer's encoder. Finally, we discuss how these findings inform empirical strategies for training morphological models, particularly regarding the role of subphonemic feature acquisition.
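
For concreteness, here is a toy illustration (not from the paper) of how a local versus a long-distance phonological label might be computed for a phoneme string: a word-final voicing check versus a vowel-backness harmony check, using a hand-written Turkish-like segment inventory.

```python
# Toy construction of one local and one long-distance phonological label
# (hand-written, Turkish-like segment inventory; purely illustrative).
FRONT_VOWELS = set("ieöü")
BACK_VOWELS = set("aıou")
VOICED_OBSTRUENTS = set("bcdgvz")
VOICELESS_OBSTRUENTS = set("çfkpsşt")

def final_obstruent_voiceless(phonemes):
    """Local feature: is the word-final obstruent voiceless (cf. final devoicing)?"""
    last = phonemes[-1]
    if last in VOICELESS_OBSTRUENTS:
        return True
    if last in VOICED_OBSTRUENTS:
        return False
    return None  # final segment is not an obstruent

def backness_harmony(phonemes):
    """Long-distance feature: do all vowels in the word agree in backness?"""
    vowels = [p for p in phonemes if p in FRONT_VOWELS | BACK_VOWELS]
    return all(v in FRONT_VOWELS for v in vowels) or all(v in BACK_VOWELS for v in vowels)

print(final_obstruent_voiceless("kitap"))  # True: final /p/ is voiceless
print(backness_harmony("evler"))           # True: all vowels are front
print(backness_harmony("kitaplar"))        # False: mixes front /i/ and back /a/
```

A probe would then predict such labels from representations taken either at the phoneme embedding layer or from encoder states, which is how the local versus long-range contrast is measured.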
Problem

Research questions and friction points this paper is trying to address.

Investigating phonological feature encoding in transformers for morphology tasks
Assessing model performance on local versus long-distance phonological phenomena
Exploring subphonemic feature acquisition in morphological model training strategies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Language-agnostic probing method for phonemes
Analyzes local and long-distance phonological features
Informs training strategies for morphological models
🔎 Similar Papers
No similar papers found.