🤖 AI Summary
This study investigates the statistical conditions in linear input that support the acquisition of hierarchical linguistic structure, focusing on the role of function words. Through systematic corpus analysis of 186 languages, the authors examine three key distributional properties of function words: high frequency, reliable association with syntactic structure, and alignment with phrase boundaries. Combining neural network modeling, counterfactual language experiments, ablation studies, and probing analyses, the work demonstrates that these three properties jointly facilitate hierarchical structure learning, with frequency and structural association contributing most strongly. The research further reveals that distinct learning pathways can yield comparable performance while relying on markedly different internal mechanisms, underscoring the critical role of function words in cross-linguistic structural induction.
📝 Abstract
What statistical conditions support learning hierarchical structure from linear input? In this paper, we address this question by focusing on the statistical distribution of function words. Function words have long been argued to play a crucial role in language acquisition due to their distinctive distributional properties: high frequency, reliable association with syntactic structure, and alignment with phrase boundaries. We first use cross-linguistic corpus analysis to establish that all three properties are present across the 186 languages studied. Next, we use a combination of counterfactual language modeling and ablation experiments to show that language variants preserving all three properties are more easily acquired by neural learners, with frequency and structural association contributing more strongly than boundary alignment. Follow-up probing and ablation analyses further reveal that different learning conditions lead to systematically different reliance on function words, indicating that similar performance can arise from distinct internal mechanisms.